Merge lp:~mike-amy/sahana-eden/climate into lp:sahana-eden

Proposed by Mike Amy
Status: Superseded
Proposed branch: lp:~mike-amy/sahana-eden/climate
Merge into: lp:sahana-eden
Diff against target: 7765 lines (+3996/-2732)
55 files modified
controllers/climate.py (+159/-96)
deployment-templates/models/000_config.py (+1/-1)
models/03_gis.py (+2/-0)
models/climate.py (+5/-0)
modules/ClimateDataPortal/MapPlugin.py (+430/-0)
modules/ClimateDataPortal/__init__.py (+201/-0)
modules/ClimateDataPortal/import_NetCDF_readings.py (+131/-0)
modules/ClimateDataPortal/import_stations.py (+53/-0)
modules/ClimateDataPortal/import_tabbed_readings.py (+154/-0)
modules/s3/s3gis.py (+784/-1031)
modules/test_utils/AddedRole.py (+25/-0)
modules/test_utils/Change.py (+25/-0)
modules/test_utils/ExpectSessionWarning.py (+14/-0)
modules/test_utils/ExpectedException.py (+13/-0)
modules/test_utils/InsertedRecord.py (+19/-0)
modules/test_utils/Web2pyNosePlugin.py (+106/-0)
modules/test_utils/__init__.py (+11/-1)
modules/test_utils/assert_equal.py (+60/-0)
modules/test_utils/clear_table.py (+4/-0)
modules/test_utils/find_JSON_format_data_structure.py (+54/-0)
modules/test_utils/run.py (+76/-239)
private/prepopulate/default/tasks.cfg (+0/-18)
static/scripts/S3/s3.gis.climate.js (+352/-134)
tests/climate/__init__.py (+101/-0)
tests/nose.py (+2/-2)
tests/unit_tests/gis/basic_map.html (+0/-91)
tests/unit_tests/gis/bing.html (+0/-62)
tests/unit_tests/gis/feature_queries.html (+0/-52)
tests/unit_tests/gis/google.html (+0/-67)
tests/unit_tests/gis/map_with_layers.html (+0/-150)
tests/unit_tests/gis/s3gis.py (+0/-417)
tests/unit_tests/gis/testgis.cmd (+0/-8)
tests/unit_tests/gis/true_code_paths.html (+0/-302)
tests/unit_tests/gis/yahoo.html (+0/-61)
tests/unit_tests/modules/s3/s3gis/BingLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/CommonScripts.py (+47/-0)
tests/unit_tests/modules/s3/s3gis/FeatureLayer.py (+25/-0)
tests/unit_tests/modules/s3/s3gis/FeatureQueries.py (+44/-0)
tests/unit_tests/modules/s3/s3gis/GPXLayer.py (+29/-0)
tests/unit_tests/modules/s3/s3gis/GeoJSONLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/GeoRSSLayer.py (+74/-0)
tests/unit_tests/modules/s3/s3gis/GoogleLayer.py (+84/-0)
tests/unit_tests/modules/s3/s3gis/KMLLayer.py (+54/-0)
tests/unit_tests/modules/s3/s3gis/LayerFailures.py (+117/-0)
tests/unit_tests/modules/s3/s3gis/OpenStreetMap.py (+31/-0)
tests/unit_tests/modules/s3/s3gis/TMSLayer.py (+30/-0)
tests/unit_tests/modules/s3/s3gis/TrueCodePaths.py (+307/-0)
tests/unit_tests/modules/s3/s3gis/UserInterface.py (+7/-0)
tests/unit_tests/modules/s3/s3gis/WFSLayer.py (+31/-0)
tests/unit_tests/modules/s3/s3gis/WMSLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/YahooLayer.py (+48/-0)
tests/unit_tests/modules/s3/s3gis/__init__.py (+68/-0)
tests/unit_tests/modules/s3/s3rest.py (+6/-0)
tests/unit_tests/modules/test_utils/find_JSON_format_data_structure.py (+52/-0)
views/climate/chart_popup.html (+76/-0)
To merge this branch: bzr merge lp:~mike-amy/sahana-eden/climate
Reviewer: Fran Boon — Status: Needs Fixing
Review via email: mp+74043@code.launchpad.net

This proposal has been superseded by a proposal from 2011-09-05.

Description of the change

Added NetCDF importer for climate data, improved performance of the overlay layer generation.
Fixed tests, updated to use "current" global variable.

Revision history for this message
Fran Boon (flavour) wrote :

views/climate/chart_popup.html
Please {{include jquery.html}} instead of hardcoding the jquery version.
You are already a point version out & we'll shortly go up to the newly released 1.6.3...

This should be reverted:
=== removed file 'private/prepopulate/default/tasks.cfg'

Can remove the r=request from:
+# self.url = "%s/%s" % (URL(r=request, c="default", f="download"),

I also see a very large number of single quotes in the Python code & some double quotes in the javascript... I'm happy to clean up the odd one that slips through, but this number seems excessive for me.

review: Needs Fixing
lp:~mike-amy/sahana-eden/climate updated
2308. By Mike Amy

fixed merge problems, cheers

2309. By Mike Amy

merged from trunk

2310. By Mike Amy

reverted references to self.debug when session isn't available and improved cache folder detection for KMLLayers

2311. By Mike Amy

merged from trunk

2312. By Mike Amy

added module config for climate to 000_config.py

2313. By Mike Amy

added parameter to the netcdf importer to pick which parameter is to be imported from the .nc file

2314. By Mike Amy

Quick import option

2315. By Mike Amy

Report status when loading image

2316. By Mike Amy

minor formatting

2317. By Mike Amy

month filter widget

2318. By Mike Amy

pre-demo commit so I can revert

2319. By Mike Amy

check in a fix before merging from trunk

2320. By Mike Amy

merged from trunk, trying to get postgres (or anything) working.

2321. By Mike Amy

merged from trunk

2322. By Mike Amy

Fix for the complaints about the missing relation 'scheduler_task'

2323. By Mike Amy

Map working again after moving the climate models out of their module.

2324. By Mike Amy

can add parameter tables whilst server is running

2325. By Mike Amy

Updated importing stations script.

2326. By Mike Amy

importing observed readings script updated

2327. By Mike Amy

Updated NetCDF (Gridded data) importer script

2328. By Mike Amy

added python script for easier running of scripts with web2py

2329. By Mike Amy

run.py uses script paths relative from itself

2330. By Mike Amy

Tabbed data importer script updated for dynamic database tables

2331. By Mike Amy

Tabbed data importer script updated for dynamic database tables

2332. By Mike Amy

Made station range parameters more explicit in tabbed data import script

2333. By Mike Amy

Typo fix

2334. By Mike Amy

map overlay updated for dynamic tables

2335. By Mike Amy

Split out climate data result caching code

2336. By Mike Amy

aggregation/statistic name configuration moved under control of the map plugin

2337. By Mike Amy

aggregation/statistic name configuration moved under control of the map plugin

2338. By Mike Amy

Chart code updated to accept but ignore month filter and aggregation name

2339. By Mike Amy

Chart is shown on loading climate data portal page

2340. By Mike Amy

Chart is shown on loading climate data portal page

2341. By Mike Amy

Chart is shown on loading climate data portal page

2342. By Mike Amy

updated commands for removal of type field from the sample tables

2343. By Mike Amy

reverted the feeding of web2py's import_from_csv code, as it doesn't make things any faster, which is pretty lame.

2344. By Mike Amy

Use raw SQL inserts to speed up the import of the NetCDF data.

2345. By Mike Amy

Don't need to delete web2py's table definitions, let it do so itself.

2346. By Mike Amy

Added multi-column uniqueness constraint on the sample table definitions table

2347. By Mike Amy

DSL module, UI key scale changes.

2348. By Mike Amy

Popups show place info and observation data

2349. By Mike Amy

Note about styling the popups (they'll probably need it).

2350. By Mike Amy

Packed UI together and added comparison interface. Layer tree collapses to leave room.

2351. By Mike Amy

Freeform and comparison query UI

2352. By Mike Amy

Changed year and month to combo box.

2353. By Mike Amy

Fixed a bug in the stringification of ToDate/FromDate and fixed the UI so that the dates get sent properly again

2354. By Mike Amy

Fixed the caching which wasn't using the str(expression), fixed some stringification problems. Expressions are added to the data for verification.

2355. By Mike Amy

Comparisons and free-form queries work. Values come back with meaningful units.

2356. By Mike Amy

freeform query updates show the position of syntax errors

2357. By Mike Amy

DSL binary operators work with scalars as well as R data.frames

2358. By Mike Amy

DSL binary operators work with scalars as well as R data.frames

2359. By Mike Amy

numbers can be expressed as displacements, e.g. delta mm of rainfall

2360. By Mike Amy

Numbers don't have to be positive, unless they are marked as such.

2361. By Mike Amy

DSL Error reporting

2362. By Mike Amy

Charts now work from queries. Had to disable year-matching in queries to get sensible years.

2363. By Mike Amy

Fixed a weirdness in postgres where modulus is negative. Month filter now works for the map.
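The "weirdness" here is that Postgres' `%`, like C's, takes the sign of the dividend, so a negative month delta modulo 12 comes out negative. A minimal sketch of the usual normalisation (the helper name is illustrative, not from this branch); `math.fmod` mimics the SQL behaviour:

```python
import math

def month_index(months_delta):
    # math.fmod keeps the dividend's sign, like Postgres' % operator:
    # fmod(-2, 12) == -2.0, not 10.0
    c_style = math.fmod(months_delta, 12)
    # adding 12 and reducing again forces the result into 0..11
    return int((c_style + 12) % 12)
```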

2364. By Mike Amy

Chart image size control

2365. By Mike Amy

Changed URL for place data to cope with missing trailing slash e.g. <host>/eden/climate

2366. By Mike Amy

Moved R process creation inside MapPlugin.__init__

2367. By Mike Amy

Firefox fixes

2368. By Mike Amy

rpy2 missing reference fix

2369. By Mike Amy

fixed missing import, show exact colour scale, charts understand monthly aggregation, charts display legends

2370. By Mike Amy

basic data purchase management screen

2371. By Mike Amy

Chart and data purchase popup URLs are explicitly passed.

2372. By Mike Amy

More discernible colour range, fixed incorrect colour problem

2373. By Mike Amy

More discernible colour range

2374. By Mike Amy

Graph date fixes, projected data importer, key scale changes.

2375. By Mike Amy

merged Michael's changes to add purchasing system, moved colour scale out of panel, above map

2376. By Mike Amy

Moved the freeform query to an editable map legend. Renamed FromDate, ToDate as From/To for easier reading.

2377. By Mike Amy

Added filter box widget, added error highlighting and error tooltips, reformatted climate controller file for clarity. Fixed a bug in the parsing of units (trailing space upsets pattern matcher).

2378. By Mike Amy

fixed a bug regarding incorrect end of year if month unspecified, which caused values for 1960-1960 to come out wrong

2379. By Mike Amy

Graphs: improved axis labels, show ° Celsius, legends are laid out better. Fixed a month filtering problem to do with incorrect month offset and postgres modulo weirdness with negative numbers.

2380. By Mike Amy

Legend wraps lines and lays itself out well.

2381. By Mike Amy

Added detailed linear regression (best-fit) line information to the legend.

2382. By Mike Amy

Moved legend below the graph which seems more common a placement. Moving to the right side seems more common, but is much more difficult.

2383. By Mike Amy

Better error reporting, accepts ints before AST nodes in expressions, better explanation of affine number error, map can be full window, query, filter and full window mode can be controlled by request.vars

2384. By Mike Amy

Added script for server-side rendering and screenshots via window.print(). The full window map removes unnecessary widgets and generally makes itself more printable.

2385. By Mike Amy

Printable Map PNG image downloads successfully

2386. By Mike Amy

Added observation station marker layer

2387. By Mike Amy

Added more space on to the graph legends, added a note about the screenshot script, started on KML shape file handling

2388. By Mike Amy

Added naive region detection for places. Detection is slow but subsequent filtering is fast

2389. By Mike Amy

Nepal districts KML load (hardcoded), moved place methods into new place class

2390. By Mike Amy

Removed unused place-space-matching algorithm code (asynchronous testing solves the UI lockup).

2391. By Mike Amy

within() filter function accepts multiple region names in an OR-like relationship

2392. By Mike Amy

Implemented a point simplification algorithm for Vector layers which makes them render much faster by removing points that make less than a pixel difference. Implemented an algorithm to speed up the point-in-region detection by 2.5 orders of magnitude. Vector layers run much smoother.
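The simplification described here amounts to dropping vertices that would move the drawn outline by less than a pixel. A rough sketch of the idea (not the branch's actual code, which lives in the JavaScript layer; written in Python for brevity):

```python
def simplify(points, min_px=1.0):
    """Drop consecutive vertices closer than min_px screen units apart.

    The first and last points are always kept, so line endpoints and
    ring closures survive; detail the eye can't see is discarded.
    """
    if len(points) <= 2:
        return list(points)
    kept = [points[0]]
    min_sq = min_px * min_px
    for x, y in points[1:-1]:
        lx, ly = kept[-1]
        # squared distance avoids a sqrt per vertex
        if (x - lx) ** 2 + (y - ly) ** 2 >= min_sq:
            kept.append((x, y))
    kept.append(points[-1])
    return kept
```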

2393. By Mike Amy

Changes to the places file JSON format (previously 1.6MB/250KB compressed) to allow zlib compression down to 145KB/6KB compressed.

2394. By Mike Amy

ignore places that have been filtered out by being outside of Nepal

2395. By Mike Amy

NetCDF imports generate a CSV file suitable for postgres' COPY command which is faster than INSERTS. Expression grid sizes are returned for simple expressions. The Map overlay uses these grid sizes to draw the correct size squares. Colours have been changed as per DHM request.
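The speed-up comes from Postgres' COPY, which bulk-loads a CSV stream far faster than per-row INSERTs. A hedged sketch of the pattern (table and column names are illustrative, not the branch's):

```python
import csv
import io

def rows_to_copy_csv(rows):
    """Serialise (place_id, time_period, value) tuples as CSV text.

    The result can be fed to Postgres with e.g.
        COPY climate_sample (place_id, time_period, value)
        FROM STDIN WITH (FORMAT csv)
    (via psycopg2's cursor.copy_expert), avoiding one INSERT per row.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()
```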

2396. By Mike Amy

R quietly imports libraries we need (trying to avoid hangs due to GIL mishandling in R's print back via python)

2397. By Mike Amy

escape non-alphanumeric characters in climate data table names for the parser regular expression

2398. By Mike Amy

Locating places in spaces now depends upon places and spaces being loaded, rather than arbitrary timing.

2399. By Mike Amy

Added PreviousDecember handling, fixed error handling, removed some dead code in the controller. Added error handling for when linear regressions are not possible. Added grid size arg to add_table command.

2400. By Mike Amy

Added tooltips, fixed a bug that caused no data when no months were specified (came from PreviousDecember feature), fixed a bug that was stopping the webpage loading the initial filter and query, if given.

2401. By Mike Amy

Fixed off-by-one error on the colour map length which was causing 0 values to not be shown
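The off-by-one in question is the classic one in value-to-colour binning: `int(norm * n)` returns `n` for the maximum value, one past the end of the colour map, and an unclamped map can squeeze the 0 bucket out at the other edge. An illustrative sketch of the clamped mapping (not the branch's code):

```python
def colour_index(value, vmin, vmax, n_colours):
    """Map value in [vmin, vmax] onto a colour index in 0..n_colours-1."""
    if vmax == vmin:
        return 0  # degenerate range: everything falls in one bucket
    norm = (value - vmin) / float(vmax - vmin)
    # int(1.0 * n_colours) == n_colours, one past the last colour,
    # so clamp the top edge back into range
    return min(int(norm * n_colours), n_colours - 1)
```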

2402. By Mike Amy

Fixed a bug that was stopping the reading of the initial expression, fixed a bug that allowed spaces in the filter box to be interpreted as a filter expression.

2403. By Mike Amy

Fixed the markers to show station data and the display of values when the range of values is zero

2404. By Mike Amy

Better error explanations, in particular connection errors.

2405. By Mike Amy

added CSV download controller, changed printable map image to use os.system, cache the images and removed the .png file extension addition (cache already does it), started on real time database integration. index.html has less in it.

2406. By Mike Amy

Changed daily data importer. Hide daily data sets except for download.

2407. By Mike Amy

Implemented CSV download of purchased data

2408. By Mike Amy

Trailing space on downloaded CSV file

2409. By Mike Amy

changed value precision to 6 d.p.

2410. By Mike Amy

Shape files are not visible by default

2411. By Mike Amy

Fixed a bug which caused an error in the SQL when no months are selected for annual aggregation. Display a message confirming a successful response when no data is returned.

2412. By Mike Amy

Downloaded purchased data does not show days in monthly data. Removed the obsolete buy data button and template.

2413. By Mike Amy

Fixed a floating point rounding error in the point-in-region detection code (taken from OpenLayers). The shape file layer only becomes invisible once loaded.

2414. By Mike Amy

Added region labels

2415. By Mike Amy

Show labels for regions.

2416. By Mike Amy

Hover delays

2417. By Mike Amy

Fixed broken place deselection

2418. By Mike Amy

Added Nepal Gov logo for putting all over the place

2419. By Mike Amy

Nepal Gov logo shows on the map overlay and on the charts as a watermark

2420. By Mike Amy

changed the chart popup window input to a text area to accommodate more text

2421. By Mike Amy

All places are listed in the chart legend, in order.

2422. By Mike Amy

Added user manual

2423. By Mike Amy

merged from trunk

2424. By Mike Amy

Chart labels are editable

2425. By Mike Amy

Years for which no data exist are greyed out but still selectable in the year comboboxes

2426. By Mike Amy

Years for which no data exist are greyed out but still selectable in the year comboboxes

2427. By Mike Amy

Removed side menu and created a climate menu in the top menu bar

2428. By Mike Amy

worked around a problem in openlayers where dragging the select feature control interferes with focussing on custom controls

2429. By Mike Amy

Fixed a performance problem where zooming in and out slowed everything down. Point samples are now shown as circles

2430. By Mike Amy

workaround for some kind of (new) f.p. precision loss in the striping algorithm.

2431. By Mike Amy

Added a quick region filter combo box.

2432. By Mike Amy

Added ability to specify zoom & coords via request parameters

2433. By Mike Amy

Typo in grid sizing code

2434. By Mike Amy

removed unnecessary call to map.updateSize()

2435. By Mike Amy

Purchase field renamings

2436. By Mike Amy

climate/purchase shows the username, changed purpose field to 'Receipt number / Student ID / other notes'

2437. By Mike Amy

Made prices editable

2438. By Mike Amy

Added download as CSV feature for map overlay data

2439. By Mike Amy

Added user manual

2440. By Mike Amy

Added model descriptions pdf

2441. By Mike Amy

Improvements to chart popup window.

2442. By Mike Amy

Previous commit was botched by bzr (committed multiple files when only one was selected):

Previous commit messages, per file:

climate.js
Fixed colour key range. Fixed error reporting.
Can download csv data. Ratios come out as percentages.

import_NetCDF_readings.py
Fixed floating point rounding errors affecting place matching.

MapPlugin, Cache.py:
Made caching filepaths application dependent.
Grid size detection and error reporting.
csv time series data function.

2443. By Mike Amy

mm -> precipitation mm

2444. By Mike Amy

allow units mm and precipitation mm

2445. By Mike Amy

NetCDF importer changes

2446. By Mike Amy

ignore values for places not on the map (was affecting the limits)

2447. By Mike Amy

Fixed bug that stopped the printable image showing

2448. By Mike Amy

detection of xvfb

2449. By Mike Amy

detection of xvfb

2450. By Mike Amy

proper xvfb command formatting

2451. By Mike Amy

mid-merge of changes from competition branch

2452. By Mike Amy

mid-merge of changes from competition branch

2453. By Mike Amy

mid-merge of changes from competition branch

2454. By Mike Amy

committing to be able to revert

Unmerged revisions

2454. By Mike Amy

committing to be able to revert

2453. By Mike Amy

mid-merge of changes from competition branch

2452. By Mike Amy

mid-merge of changes from competition branch

2451. By Mike Amy

mid-merge of changes from competition branch

2450. By Mike Amy

proper xvfb command formatting

2449. By Mike Amy

detection of xvfb

2448. By Mike Amy

detection of xvfb

2447. By Mike Amy

Fixed bug that stopped the printable image showing

2446. By Mike Amy

ignore values for places not on the map (was affecting the limits)

2445. By Mike Amy

NetCDF importer changes

Preview Diff

=== modified file 'controllers/climate.py'
--- controllers/climate.py 2011-08-06 18:24:53 +0000
+++ controllers/climate.py 2011-09-05 13:55:13 +0000
@@ -2,92 +2,14 @@
 
 module = "climate"
 
-class ClimateDataMapPlugin(object):
-    def __init__(self,
-        data_type_option_names,
-        parameter_names,
-        projected_option_type_names,
-        year_min,
-        year_max
-    ):
-        self.data_type_option_names = data_type_option_names
-        self.parameter_names = parameter_names
-        self.projected_option_type_names = projected_option_type_names
-        self.year_min = year_min
-        self.year_max = year_max
-
-    def extend_gis_map(self, add_javascript, add_configuration):
-        add_javascript("scripts/S3/s3.gis.climate.js")
-        add_configuration(
-            SCRIPT(
-                "\n".join((
-                    "registerPlugin(",
-                    "    new ClimateDataMapPlugin("+
-                    json.dumps(
-                        dict(
-                            self.__dict__,
-                            data_type_label = str(T("Data Type")),
-                            projected_option_type_label = str(T("Projection Type"))
-                        ),
-                        indent = 4
-                    )+
-                    ")",
-                    ")",
-                ))
-            )
-        )
-
-    def add_html(self, html):
-        statistics_widget = FIELDSET(
-            LEGEND("Statistics"),
-            UL(
-                _style="list-style:none",
-                *(
-                    LI(
-                        INPUT(
-                            _type="radio",
-                            _name="statistics",
-                            _id="id_%s" % statistic,
-                        ),
-                        LABEL(
-                            statistic,
-                            _for="id_%s" % statistic,
-                        )
-                    )
-                    for statistic in ["Mean", "Max", "Min"]
-                )
-            )
-        )
-
-        html.append(
-            DIV(
-                FORM(
-                    _id="controller",
-                    *(
-                        SCRIPT(
-                            _type="text/javascript",
-                            *["""
-                            """]
-                        ),
-                        climate_data_type_widget,
-                        parameters_widget,
-                        statistics_widget,
-                        period_widget
-                    )
-                )
-            )
-        )
-
-    def get_image_overlay(self, ):
-        from gluon.contenttype import contenttype
-        response.headers["Content-Type"] = contenttype(".png")
-        # @ToDo: Should be a file in static
-        return response.stream(open("/Users/mike/Desktop/red_wave.png"))
-
-climate_data_map_plugin = ClimateDataMapPlugin(
-    data_type_option_names = ["Observed", "Gridded", "Projected"],
-    parameter_names = ["Rainfall", "Temperature", "Wind", "Humidity", "Sunshine"],
-    projected_option_type_names = ["RC Model", "GC Model", "Scenario"],
+ClimateDataPortal = local_import("ClimateDataPortal")
+
+sample_type_names = ClimateDataPortal.sample_codes.keys()
+variable_names = ClimateDataPortal.tables.keys()
+
+map_plugin = ClimateDataPortal.MapPlugin(
+    data_type_option_names = sample_type_names,
+    parameter_names = variable_names,
     year_max = datetime.date.today().year,
     year_min = 1960,
 )
@@ -120,16 +42,16 @@
         print_tool = {"url": print_service}
     else:
         print_tool = {}
 
     map = gis.show_map(
+        lat = 28.5,
+        lon = 84.1,
+        zoom = 7,
         toolbar = False,
-        catalogue_toolbar=catalogue_toolbar, # T/F, top tabs toolbar
+#       catalogue_toolbar=catalogue_toolbar, # T/F, top tabs toolbar
         wms_browser = wms_browser, # dict
-        catalogue_layers=catalogue_layers, # T/F
-        mouse_position = deployment_settings.get_gis_mouse_position(),
-        print_tool = print_tool,
         plugins = [
-            climate_data_map_plugin
+            map_plugin
         ]
     )
 
@@ -138,7 +60,148 @@
         module_name=module_name,
         map=map
     )
 
-def climate_image_overlay():
-    return climate_data_map_plugin.get_image_overlay()
-
+month_names = dict(
+    January=1,
+    February=2,
+    March=3,
+    April=4,
+    May=5,
+    June=6,
+    July=7,
+    August=8,
+    September=9,
+    October=10,
+    November=11,
+    December=12
+)
+
+for name, number in month_names.items():
+    month_names[name[:3]] = number
+for name, number in month_names.items():
+    month_names[name.upper()] = number
+for name, number in month_names.items():
+    month_names[name.lower()] = number
+
+def convert_date(default_month):
+    def converter(year_month):
+        components = year_month.split("-")
+        year = int(components[0])
+        assert 1960 <= year, "year must be >= 1960"
+
+        try:
+            month_value = components[1]
+        except IndexError:
+            month = default_month
+        else:
+            try:
+                month = int(month_value)
+            except TypeError:
+                month = month_names[month_value]
+
+        assert 1 <= month <= 12, "month must be in range 1:12"
+        return datetime.date(year, month, 1)
+    return converter
+
+def one_of(options):
+    def validator(choice):
+        assert choice in options, "should be one of %s, not '%s'" % (
+            options,
+            choice
+        )
+        return choice
+    return validator
+
+def climate_overlay_data():
+    kwargs = dict(request.vars)
+    kwargs["parameter"] = kwargs["parameter"].replace("+", " ")
+
+    arguments = {}
+    errors = []
+    for kwarg_name, converter in dict(
+        data_type = one_of(sample_type_names),
+        statistic = one_of(("Maximum", "Minimum", "Average")),
+        parameter = one_of(variable_names),
+        from_date = convert_date(default_month = 1),
+        to_date = convert_date(default_month = 12),
+    ).iteritems():
+        try:
+            value = kwargs.pop(kwarg_name)
+        except KeyError:
+            errors.append("%s missing" % kwarg_name)
+        else:
+            try:
+                arguments[kwarg_name] = converter(value)
+            except TypeError:
+                errors.append("%s is wrong type" % kwarg_name)
+            except AssertionError, assertion_error:
+                errors.append("%s: %s" % (kwarg_name, assertion_error))
+    if kwargs:
+        errors.append("Unexpected arguments: %s" % kwargs.keys())
+
+    if errors:
+        raise HTTP(500, "<br />".join(errors))
+    else:
+        import gluon.contenttype
+        data_path = map_plugin.get_overlay_data(
+            env = Storage(globals()),
+            **arguments
+        )
+        return response.stream(
+            open(data_path,"rb"),
+            chunk_size=4096
+        )
+
+def list_of(converter):
+    def convert_list(choices):
+        return map(converter, choices)
+    return convert_list
+
+def climate_chart():
+    kwargs = dict(request.vars)
+    import simplejson as JSON
+    specs = JSON.loads(kwargs.pop("spec"))
+
+    checked_specs = []
+    for spec in specs:
+        arguments = {}
+        errors = []
+        for name, converter in dict(
+            data_type = one_of(sample_type_names),
+            parameter = one_of(variable_names),
+            from_date = convert_date(default_month = 1),
+            to_date = convert_date(default_month = 12),
+            place_ids = list_of(int)
+        ).iteritems():
+            try:
+                value = spec.pop(name)
+            except KeyError:
+                errors.append("%s missing" % name)
+            else:
+                try:
+                    arguments[name] = converter(value)
+                except TypeError:
+                    errors.append("%s is wrong type" % name)
+                except AssertionError, assertion_error:
+                    errors.append("%s: %s" % (name, assertion_error))
+        if spec:
+            errors.append("Unexpected arguments: %s" % spec.keys())
+        checked_specs.append(arguments)
+
+    if errors:
+        raise HTTP(500, "<br />".join(errors))
+    else:
+        import gluon.contenttype
+        response.headers["Content-Type"] = gluon.contenttype.contenttype(".png")
+        data_image_file_path = map_plugin.render_plots(
+            env = Storage(globals()),
+            specs = checked_specs
+        )
+        return response.stream(
+            open(data_image_file_path,"rb"),
+            chunk_size=4096
+        )
+
+def chart_popup():
+    return {}
+
 
=== modified file 'deployment-templates/models/000_config.py'
--- deployment-templates/models/000_config.py 2011-08-25 09:17:02 +0000
+++ deployment-templates/models/000_config.py 2011-09-05 13:55:13 +0000
@@ -249,7 +249,7 @@
     strict_hierarchy = False,
     # Should all specific locations (e.g. addresses, waypoints) be required to
     # link to where they are in the location hierarchy?
-    location_parent_required = False,
+    location_parent_required = False
 )
 # Set this if there will be multiple areas in which work is being done,
 # and a menu to select among them is wanted. With this on, any map
 
=== modified file 'models/03_gis.py'
--- models/03_gis.py 2011-08-26 08:12:56 +0000
+++ models/03_gis.py 2011-09-05 13:55:13 +0000
@@ -1369,6 +1369,8 @@
 # =============================================================================
 def gis_map_tables():
     """ Load the GIS Map Tables when needed """
+    if "gis_layer_bing" in db.tables:
+        return
 
     # -------------------------------------------------------------------------
     # GPS Waypoints
 
=== added file 'models/climate.py'
--- models/climate.py 1970-01-01 00:00:00 +0000
+++ models/climate.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,5 @@
+# -*- coding: utf-8 -*-
+
+module = "climate"
+if deployment_settings.has_module(module):
+    local_import("ClimateDataPortal").define_models(env = Storage(globals()))
=== added directory 'modules/ClimateDataPortal'
=== added file 'modules/ClimateDataPortal/MapPlugin.py'
--- modules/ClimateDataPortal/MapPlugin.py 1970-01-01 00:00:00 +0000
+++ modules/ClimateDataPortal/MapPlugin.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,430 @@
+
+# notes:
+
+# dependencies:
+# R
+
+# create folder for cache:
+# mkdir -p /tmp/climate_data_portal/images/recent/
+# mkdir -p /tmp/climate_data_portal/images/older/
+
+MAX_CACHE_FOLDER_SIZE = 2**24 # 16 MiB
+
+class TwoStageCache(object):
+    def __init__(self, folder, max_size):
+        self.folder = folder
+        self.max_size
+
+    def purge(self):
+        pass
+
+    def retrieve(self, file_name, generate_if_not_found):
+        pass
+
+import os, errno
+
+def mkdir_p(path):
+    try:
+        os.makedirs(path)
+    except OSError as exc: # Python >2.5
+        if exc.errno == errno.EEXIST:
+            pass
+        else: raise
+
+def define(env, place, tables, date_to_month_number, sample_codes, exports):
+    # This starts an R interpreter.
+    # As we are sharing it (restarting it every time is inefficient),
+    # we have to be somewhat careful to make sure objects are garbage collected
+    # better to just not stick anything in R's globals
+    try:
+        import rpy2.robjects as robjects
+    except ImportError:
+        import logging
+        logging.getLogger().error(
+"""R is required by the climate data portal to generate charts
+
+To install R: refer to:
+http://cran.r-project.org/doc/manuals/R-admin.html
+
+
+rpy2 is required to interact with python.
+
+To install rpy2, refer to:
+http://rpy.sourceforge.net/rpy2/doc-dev/html/overview.html
+""")
+        raise
+
+    R = robjects.r
+
+    from rpy2.robjects.packages import importr
+
+    base = importr("base")
+
+    from math import fsum
+    def average(values):
+        "Safe float average"
+        l = len(values)
+        if l is 0:
+            return None
+        else:
+            return fsum(values)/l
+
+    class Maximum(object):
+        def __init__(self, column, add_query_term):
+            self.value_max = value_max = column.max()
+            add_query_term(value_max)
+
+        def __call__(self, row):
+            return row._extra[self.value_max]
+
+    class Minimum(object):
+        def __init__(self, column, add_query_term):
+            self.value_min = value_min = column.min()
+            add_query_term(value_min)
+
+        def __call__(self, row):
+            return row._extra[self.value_min]
+
+    class Average(object):
+        def __init__(self, column, add_query_term):
+            self.value_sum = value_sum = column.sum()
+            self.value_count = value_count = column.count()
+            add_query_term((
+                value_sum,
+                value_count
+            ))
+
+        def __call__(self, row):
+            return row._extra[self.value_sum] / row._extra[self.value_count]
+
+    aggregators = {
+        "Maximum": Maximum,
+        "Minimum": Minimum,
+        "Average": Average
+    }
+
+    def get_cached_or_generated_file(cache_file_name, generate):
+        from os.path import join, exists
+        from os import stat, makedirs
+        # this needs to become a setting
+        climate_data_image_cache_path = join(
+            "/tmp","climate_data_portal","images"
+        )
+        recent_cache = join(climate_data_image_cache_path, "recent")
+        mkdir_p(recent_cache)
+        older_cache = join(climate_data_image_cache_path, "older")
+        mkdir_p(older_cache)
+        recent_cache_path = join(recent_cache, cache_file_name)
+        if not exists(recent_cache_path):
+            older_cache_path = join(older_cache, cache_file_name)
+            if exists(older_cache_path):
+                # move the older cache to the recent folder
+                rename(older_cache_path, recent_cache_path)
+            else:
+                generate(recent_cache_path)
+            file_path = recent_cache_path
+
+            # update the folder size file (race condition?)
+            folder_size_file_path = join(climate_data_image_cache_path, "size")
+            folder_size_file = open(folder_size_file_path, "w+")
+            folder_size_file_contents = folder_size_file.read()
+            try:
+                folder_size = int(folder_size_file_contents)
+            except ValueError:
+                folder_size = 0
+            folder_size_file.seek(0)
+            folder_size_file.truncate()
+            folder_size += stat(file_path).st_size
+            if folder_size > MAX_CACHE_FOLDER_SIZE:
+                rmdir(older_cache)
+
+            folder_size_file.write(str(folder_size))
+            folder_size_file.close()
+        else:
+            # use the existing cached image
+            file_path = recent_cache_path
+        return file_path
+
+    class MapPlugin(object):
+        def __init__(
+            self,
+            data_type_option_names,
+            parameter_names,
+            year_min,
+            year_max
+        ):
+            self.data_type_option_names = data_type_option_names
+            self.parameter_names = parameter_names
+            self.year_min = year_min
+            self.year_max = year_max
+
+        def extend_gis_map(self, add_javascript, add_configuration):
+            add_javascript("scripts/S3/s3.gis.climate.js")
+            SCRIPT = env.SCRIPT
+            T = env.T
+            import json
+
+            add_configuration(
+                SCRIPT(
+                    "\n".join((
+                        "registerPlugin(",
+                        "    new ClimateDataMapPlugin("+
+                        json.dumps(
+                            dict(
+                                data_type_option_names = self.data_type_option_names,
+                                parameter_names = self.parameter_names,
+                                year_min = self.year_min,
+                                year_max = self.year_max,
+                                overlay_data_URL = "/%s/climate/climate_overlay_data" % (
+                                    env.request.application
+                                ),
+                                chart_URL = "/%s/climate/climate_chart" % (
+                                    env.request.application
+                                ),
+                                data_type_label = str(T("Data Type")),
+                                projected_option_type_label = str(
+                                    T("Projection Type")
+                                )
+                            ),
+                            indent = 4
+                        )+
+                        ")",
+                        ")",
+                    ))
+                )
+            )
+
+
+        def get_overlay_data(
+            self,
+            env,
+            data_type,
+            parameter,
+            from_date,
+            to_date,
+            statistic
+        ):
+            from_month = date_to_month_number(from_date)
+            to_month = date_to_month_number(to_date)
+            def generate_map_overlay_data(file_path):
+                # generate the new file in the recent folder
+
+                db = env.db
+                sample_table_name, sample_table = tables[parameter]
+                place = db.place
+                #sample_table = db[sample_table_name]
+
+                query = [
+                    place.id,
+                    place.longitude,
+                    place.latitude,
+                ]
+                aggregator = aggregators[statistic](
+                    sample_table.value,
+                    query.append
+                )
+
+                sample_rows = db(
+                    (sample_table.time_period >= from_month) &
+                    (sample_table.time_period <= to_month) &
+                    (sample_table.sample_type == sample_codes[data_type]) &
+                    (place.id == sample_table.place_id)
+                ).select(
+                    *query,
+                    groupby=sample_table.place_id
235 )
236
237 # map positions to data
238 # find max and min value
239 positions = {}
240 aggregated_values = []
241 for row in sample_rows:
242 place = row.place
243 aggregated_value = aggregator(row)
244 aggregated_values.append(aggregated_value)
245 positions[place.id] = (
246 place.latitude,
247 place.longitude,
248 aggregated_value
249 )
250 max_aggregated_value = max(aggregated_values)
251 min_aggregated_value = min(aggregated_values)
252 aggregated_range = max_aggregated_value - min_aggregated_value
253
254 data_lines = []
255 write = data_lines.append
256 from colorsys import hsv_to_rgb
257 for id, (lat, lon, aggregated_value) in positions.iteritems():
258 north = lat + 0.05
259 south = lat - 0.05
260 east = lon + 0.05
261 west = lon - 0.05
262 # only hue changes
263 # hue range is from 2/3 (blue, low) to 0 (red, high)
264 normalised_value = 1.0-((aggregated_value - min_aggregated_value) / aggregated_range)
265 r,g,b = hsv_to_rgb(normalised_value *(2.0/3.0), 1.0, 1.0)
266 hex_colour = "%02x%02x%02x" % (r*255, g*255, b*255)
267 write(
268 "Vector("
269 "Polygon(["
270 "LinearRing(["
271 "Point(%(north)f,%(west)f),"
272 "Point(%(north)f,%(east)f),"
273 "Point(%(south)f,%(east)f),"
274 "Point(%(south)f,%(west)f)"
275 "])"
276 "]),"
277 "{"
278 "value:%(aggregated_value)f,"
279 "id:%(id)i"
280 "},"
281 "{"
282 "fillColor:'#%(hex_colour)s'"
283 "}"
284 ")," % locals()
285 )
286 overlay_data_file = open(file_path, "w")
287 write = overlay_data_file.write
288 write("{")
289 if max_aggregated_value < 10:
290 float_format = "%0.2f"
291 elif max_aggregated_value < 100:
292 float_format = "%0.1f"
293 elif max_aggregated_value < 10000:
294 float_format = "%0.0f"
295 else:
296 float_format = "%0.2e"
297 write("max:%s," % float_format % max_aggregated_value)
298 write("min:%s," % float_format % min_aggregated_value)
299 write("features:[")
300 write("".join(data_lines))
301 overlay_data_file.seek(-1, 1) # overwrite the trailing ","
302 write("]}")
303 overlay_data_file.close()
304
305 return get_cached_or_generated_file(
306 "_".join((
307 statistic,
308 data_type,
309 parameter,
310 str(from_month),
311 str(to_month),
312 ".js"
313 )),
314 generate_map_overlay_data
315 )
316
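The overlay loop above maps each aggregated value onto a hue from blue (2/3, low) down to red (0, high) at full saturation and brightness. A standalone sketch of that colour ramp (the helper name is illustrative):

```python
from colorsys import hsv_to_rgb

def value_to_hex_colour(value, minimum, maximum):
    # normalise into [0, 1], inverted so that high values get low hues
    normalised = 1.0 - ((value - minimum) / (maximum - minimum))
    r, g, b = hsv_to_rgb(normalised * (2.0 / 3.0), 1.0, 1.0)
    return "%02x%02x%02x" % (int(r * 255), int(g * 255), int(b * 255))

assert value_to_hex_colour(0.0, 0.0, 10.0) == "0000ff"   # low -> blue
assert value_to_hex_colour(10.0, 0.0, 10.0) == "ff0000"  # high -> red
```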
317 def render_plots(
318 self,
319 env,
320 specs
321 ):
322 def generate_chart(file_path):
323 def render_plot(
324 data_type,
325 parameter,
326 from_date,
327 to_date,
328 place_ids
329 ):
330 from_month = date_to_month_number(from_date)
331 to_month = date_to_month_number(to_date)
332
333 db = env.db
334 sample_table_name, sample_table = tables[parameter]
335 place = db.place
336 #sample_table = db[sample_table_name]
337 sample_rows = db(
338 (sample_table.time_period >= from_month) &
339 (sample_table.time_period <= to_month) &
340 (sample_table.sample_type == sample_codes[data_type]) &
341 (sample_table.place_id.belongs(place_ids))
342 ).select(
343 sample_table.value,
344 sample_table.time_period,
345 )
346
347 # coalesce values by time_period:
348 aggregated_values = {}
349 for sample_row in sample_rows:
350 time_period = sample_row.time_period
351 value = sample_row.value
352 try:
353 aggregated_values[time_period]
354 except KeyError:
355 aggregated_values[time_period] = value
356 else:
357 aggregated_values[time_period] += value
358
359 values = []
360 time_periods = aggregated_values.keys()
361 time_periods.sort()
362 for time_period in time_periods:
363 values.append(aggregated_values[time_period])
364 return from_date, to_date, data_type, parameter, values
365
366 time_serieses = []
367 c = R("c")
368 for spec in specs:
369 from_date, to_date, data_type, parameter, values = render_plot(**spec)
370 time_serieses.append(
371 R("ts")(
372 robjects.FloatVector(values),
373 start = c(from_date.year, from_date.month),
374 end = c(to_date.year, to_date.month),
375 frequency = 12
376 )
377 )
378
379 R("png(filename = '%s', width=640, height=480)" % file_path)
380 plot_chart = R(
381 "function (xlab, ylab, n, ...) {"
382 "ts.plot(...,"
383 "gpars=list(xlab=xlab, ylab=ylab, col=c(1:n))"
384 ")"
385 "}"
386 )
387
388 plot_chart(
389 "Date",
390 "Combined %s %s" % (data_type, parameter),
391 len(time_serieses),
392 *time_serieses
393 )
394 R("dev.off()")
395
396 import md5
397 import gluon.contrib.simplejson as JSON
398
399 import datetime
400 def serialiseDate(obj):
401 if isinstance(
402 obj,
403 (
404 datetime.date,
405 datetime.datetime,
406 datetime.time
407 )
408 ):
409 return obj.isoformat()[:19].replace("T"," ")
410 raise TypeError("%r is not JSON serializable" % (obj,))
411
412 return get_cached_or_generated_file(
413 "_".join((
414 md5.md5(
415 JSON.dumps(
416 specs,
417 sort_keys=True,
418 default=serialiseDate
419 )
420 ).hexdigest(),
421 ".png"
422 )),
423 generate_chart
424 )
425
426 exports.update(
427 MapPlugin = MapPlugin
428 )
429
430 del globals()["define"]
0431
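The overlay file header above scales the numeric format to the magnitude of the largest value. A standalone sketch of that selection, with the branches made mutually exclusive (the function name is illustrative):

```python
def choose_float_format(max_value):
    # two decimals for small values, coarser formats as magnitude grows,
    # scientific notation beyond 10000
    if max_value < 10:
        return "%0.2f"
    elif max_value < 100:
        return "%0.1f"
    elif max_value < 10000:
        return "%0.0f"
    return "%0.2e"

assert choose_float_format(9.5) == "%0.2f"
assert choose_float_format(99.0) == "%0.1f"
assert choose_float_format(1234.0) == "%0.0f"
assert choose_float_format(50000.0) == "%0.2e"
```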
=== added file 'modules/ClimateDataPortal/__init__.py'
--- modules/ClimateDataPortal/__init__.py 1970-01-01 00:00:00 +0000
+++ modules/ClimateDataPortal/__init__.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,201 @@
1
2"""
3 Climate Data Module
4
5 @author: Mike Amy
6"""
7
8# datasets are stored in actual tables
9# - e.g. rainfall_mm
10
11# data collection points in dataset
12# values at a point within a time range
13
14# e.g. observed temperature in Kathmandu between Feb 2006 - April 2007
15
16
17sample_types = dict(
18 O = "Observed",
19 G = "Gridded",
20
21 r = "Projected (RC)",
22 g = "Projected (GC)",
23 s = "Scenario",
24)
25
26sample_codes = {}
27
28import re
29for code, name in sample_types.iteritems():
30 globals()[re.sub(r"\W", "", name)] = code
31 sample_codes[name] = code
32
33
34# Until I figure out how to sanely import things from web2py,
35# apply a prophylactic import method...
36def define_models(env):
37 """
38 Define Climate Data models.
39 """
40 db = env.db
41 Field = env.Field
42
43 def create_index(table_name, field_name):
44 db.executesql(
45 """
46 CREATE INDEX IF NOT EXISTS
47 "index_%(table_name)s__%(field_name)s"
48 ON "%(table_name)s" ("%(field_name)s");
49 """ % locals()
50 )
51
52 place = db.define_table(
53 "place",
54 Field(
55 "longitude",
56 "double",
57 notnull=True,
58 required=True,
59 ),
60 Field(
61 "latitude",
62 "double",
63 notnull=True,
64 required=True,
65 )
66 )
67
68 # not all places are stations with elevations
69 # as in the case of "gridded" data
70 # a station can only be in one place
71 observation_station = db.define_table(
72 "observation_station",
73 Field(
74 "id",
75 "id", # must be a place,
76 notnull=True,
77 required=True,
78 ),
79 Field(
80 "name",
81 "string",
82 notnull=True,
83 unique=False,
84 required=True,
85 ),
86 Field(
87 "elevation_metres",
88 "integer"
89 )
90 )
91
92 def sample_table(name, value_type):
93 table = db.define_table(
94 name,
95 Field(
96 "sample_type",
97 "string",
98 length = 1,
99 notnull=True,
100 # necessary as web2py requires a default value even for
101 # not null fields
102 default="-1",
103 required=True
104 ),
105 Field(
106 "time_period",
107 "integer",
108 notnull=True,
109 default=-1000,
110 required=True
111 ),
112 Field(
113 # this should become a GIS field
114 "place_id",
115 place,
116 notnull=True,
117 required=True
118 ),
119 Field(
120 "value",
121 value_type,
122 notnull = True,
123 required=True,
124 ),
125 )
126
127 create_index(name, "id")
128 create_index(name, "sample_type")
129 create_index(name, "time_period")
130 create_index(name, "place_id")
131
132 return table
133
134 rainfall_mm = sample_table("climate_rainfall_mm", "double")
135 min_temperature_celsius = sample_table("climate_min_temperature_celsius", "double")
136 max_temperature_celsius = sample_table("climate_max_temperature_celsius", "double")
137
138 tables = {
139 "Rainfall mm": ("climate_rainfall_mm", rainfall_mm),
140 "Max Temperature C": ("climate_max_temperature_celsius", max_temperature_celsius),
141 "Min Temperature C": ("climate_min_temperature_celsius", min_temperature_celsius),
142 }
143
144 def year_month_to_month_number(year, month):
145 """Time periods are integers representing months in years,
146 from 1960 onwards.
147
148 e.g. 0 = Jan 1960, 1 = Feb 1960, 12 = Jan 1961
149
150 This function converts a year and month to a month number.
151 """
152 return ((year-1960) * 12) + (month-1)
153
154 def date_to_month_number(date):
155 """This function converts a date to a month number.
156
157 See also year_month_to_month_number(year, month)
158 """
159 return year_month_to_month_number(date.year, date.month)
160
161# def month_number_to_date(month_number):
162# ret
163
164 from .MapPlugin import define
165 define(
166 env,
167 place,
168 tables,
169 date_to_month_number,
170 sample_codes,
171 globals()
172 )
173
174 # exports:
175 globals().update(
176 sample_types = sample_types,
177
178 place = place,
179 observation_station = observation_station,
180
181 tables = tables,
182
183 rainfall_mm = rainfall_mm,
184 max_temperature_celsius = max_temperature_celsius,
185 min_temperature_celsius = min_temperature_celsius,
186
187 date_to_month_number = date_to_month_number,
188 year_month_to_month_number = year_month_to_month_number,
189 )
190
191 def redefine_models(env):
192 # avoid risking insidious aliasing bugs
193 # by not defining things more than once
194 env.db.update(
195 climate_rainfall_mm = rainfall_mm,
196 climate_max_temperature_celsius = max_temperature_celsius,
197 climate_min_temperature_celsius = min_temperature_celsius,
198 place = place,
199 observation_station = observation_station,
200 )
201 globals()["define_models"] = redefine_models
0202
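The `time_period` encoding above maps each (year, month) onto a single month index with January 1960 as zero. A standalone copy, with the inverse round-trip (the inverse helper is illustrative, sketching the commented-out `month_number_to_date`):

```python
def year_month_to_month_number(year, month):
    # 0 = Jan 1960, 1 = Feb 1960, 12 = Jan 1961
    return ((year - 1960) * 12) + (month - 1)

def month_number_to_year_month(month_number):
    # inverse mapping, for illustration only
    years, month_index = divmod(month_number, 12)
    return 1960 + years, month_index + 1

assert year_month_to_month_number(1960, 1) == 0
assert year_month_to_month_number(1960, 2) == 1
assert year_month_to_month_number(1961, 1) == 12
assert month_number_to_year_month(12) == (1961, 1)
```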
=== added file 'modules/ClimateDataPortal/import_NetCDF_readings.py'
--- modules/ClimateDataPortal/import_NetCDF_readings.py 1970-01-01 00:00:00 +0000
+++ modules/ClimateDataPortal/import_NetCDF_readings.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,131 @@
1
2ClimateDataPortal = local_import("ClimateDataPortal")
3
4
5 def get_or_create(dictionary, key, creator):
6 try:
7 value = dictionary[key]
8 except KeyError:
9 value = dictionary[key] = creator()
10 return value
11
12def get_or_create_record(table, query):
13 query_terms = []
14 for key, value in query.iteritems():
15 query_terms.append(getattr(table, key) == value)
16 reduced_query = reduce(
17 (lambda left, right: left & right),
18 query_terms
19 )
20 records = db(reduced_query).select()
21 count = len(records)
22 assert count <= 1, "Multiple records for %s" % query
23 if count == 0:
24 record = table.insert(**query)
25 db.commit()
26 else:
27 record = records.first()
28 return record.id
29
30 def nearly(expected_float, actual_float): # relative tolerance; abs() keeps this correct for negative values
31 return abs(actual_float - expected_float) <= abs(expected_float) * 0.001
32
33def add_reading_if_none(
34 database_table,
35 sample_type,
36 time_period,
37 place_id,
38 value
39):
40 records = db(
41 (database_table.sample_type == sample_type) &
42 (database_table.time_period == time_period) &
43 (database_table.place_id == place_id)
44 ).select(database_table.value, database_table.id)
45 count = len(records)
46 assert count <= 1
47 if count == 0:
48 database_table.insert(
49 sample_type = sample_type,
50 time_period = time_period,
51 place_id = place_id,
52 value = value
53 )
54 else:
55 existing = records.first()
56 assert nearly(existing.value, value), (existing.value, value, place_id)
57
58
59
60import datetime
61
62def import_climate_readings(
63 netcdf_file,
64 database_table,
65 add_reading,
66 start_time = datetime.date(1971,1,1),
67 is_undefined = lambda x: -99.900003 < x < -99.9
68):
69 """
70 Assumptions:
71 * there are no places
72 * the data is in order of places
73 """
74 variables = netcdf_file.variables
75
76 # create grid of places
77 place_ids = {}
78
79 def to_list(variable):
80 result = []
81 for i in range(len(variable)):
82 result.append(variable[i])
83 return result
84
85 def iter_pairs(a_list): # equivalent to the built-in enumerate()
86 for index in range(len(a_list)):
87 yield index, a_list[index]
88
89 times = to_list(variables["time"])
90 lat = to_list(variables["lat"])
91 lon = to_list(variables["lon"])
92 for latitude in lat:
93 for longitude in lon:
94 record = get_or_create_record(
95 ClimateDataPortal.place,
96 dict(
97 longitude = longitude,
98 latitude = latitude
99 )
100 )
101 place_ids[(latitude, longitude)] = record
102 #print longitude, latitude, record
103
104 tt = variables["tt"]
105 print "up to:", len(times)
106 for time_index, time in iter_pairs(times):
107 print time_index
108 time_period = start_time+datetime.timedelta(hours=time)
109 for latitude_index, latitude in iter_pairs(lat):
110 for longitude_index, longitude in iter_pairs(lon):
111 value = tt[time_index][latitude_index][longitude_index]
112 if not is_undefined(value):
113 add_reading(
114 database_table = database_table,
115 sample_type = ClimateDataPortal.Gridded,
116 time_period = ClimateDataPortal.date_to_month_number(time_period),
117 place_id = place_ids[(latitude, longitude)],
118 value = value
119 )
120 db.commit()
121
122import sys
123
124from Scientific.IO import NetCDF
125
126file_name = sys.argv[1]
127import_climate_readings(
128 NetCDF.NetCDFFile(file_name),
129 ClimateDataPortal.min_temperature_celsius,
130 add_reading_if_none
131)
0132
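The import above walks the NetCDF grid: every (lat, lon) cell gets a place id, then each time step's 2-D slice is scanned and undefined cells (the file's fill value, around -99.9) are skipped. A standalone sketch with plain nested lists standing in for the NetCDF variables (all names and data are illustrative):

```python
def is_undefined(x):
    # the fill value -99.9 is not exactly representable as a float,
    # hence the narrow interval test
    return -99.900003 < x < -99.9

lats, lons = [26.0, 27.0], [85.0, 86.0]
place_ids = {}
for lat in lats:
    for lon in lons:
        place_ids[(lat, lon)] = len(place_ids) + 1

tt = [[[15.2, -99.9000015], [-99.9000015, 17.8]]]  # one time step
readings = []
for time_index, time_slice in enumerate(tt):
    for i, lat in enumerate(lats):
        for j, lon in enumerate(lons):
            value = time_slice[i][j]
            if not is_undefined(value):
                readings.append((place_ids[(lat, lon)], value))

assert readings == [(1, 15.2), (4, 17.8)]
```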
=== added file 'modules/ClimateDataPortal/import_stations.py'
--- modules/ClimateDataPortal/import_stations.py 1970-01-01 00:00:00 +0000
+++ modules/ClimateDataPortal/import_stations.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,53 @@
1
2ClimateDataPortal = local_import("ClimateDataPortal")
3
4from decimal import Decimal
5
6def import_stations(file_name):
7 """
8 Expects a file containing lines of the form e.g.:
9226 JALESORE 1122 172 26.65 85.78
10275 PHIDIM (PANCHTH 1419 1205 27.15 87.75
11unused Station name <-id <-elev <-lat <-lon
120123456789012345678901234567890123456789012345678901234567890123456789
130 1 2 3 4 5 6
14 """
15 place = ClimateDataPortal.place
16 observation_station = ClimateDataPortal.observation_station
17 observation_station.truncate()
18 place.truncate()
19 db.commit()
20
21 for line in open(file_name, "r").readlines():
22 try:
23 place_id_text = line[27:33]
24 except IndexError:
25 continue
26 else:
27 try:
28 place_id = int(place_id_text)
29 except ValueError:
30 continue
31 else:
32 station_name = line[8:25].strip() # don't restrict if they add more
33 elevation_metres = int(line[37:43])
34
35 latitude = Decimal(line[47:53])
36 longitude = Decimal(line[57:63])
37
38 assert place.insert(
39 id = place_id,
40 longitude = longitude,
41 latitude = latitude
42 ) == place_id
43
44 station_id = observation_station.insert(
45 id = place_id,
46 name = station_name,
47 elevation_metres = elevation_metres
48 )
49 print place_id, station_name, latitude, longitude, elevation_metres
50 db.commit()
51
52import sys
53import_stations(sys.argv[1])
054
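The station file is parsed by fixed column slices, as the docstring's column ruler shows. A standalone sketch of the slicing against a line built to the same layout (the padding widths are assumptions reconstructed from the ruler, with longitude ending at column 63):

```python
from decimal import Decimal

line = (
    "226".ljust(8)          # cols 0-7: leading (unused) number
    + "JALESORE".ljust(19)  # cols 8-26: station name, sliced as [8:25]
    + "1122".rjust(6)       # cols 27-32: place/station id, [27:33]
    + "172".rjust(10)       # ends at col 42: elevation, [37:43]
    + "26.65".rjust(10)     # ends at col 52: latitude, [47:53]
    + "85.78".rjust(10)     # ends at col 62: longitude, [57:63]
)
place_id = int(line[27:33])
name = line[8:25].strip()
elevation = int(line[37:43])
lat = Decimal(line[47:53])
lon = Decimal(line[57:63])

assert (place_id, name, elevation) == (1122, "JALESORE", 172)
assert (lat, lon) == (Decimal("26.65"), Decimal("85.78"))
```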
=== added file 'modules/ClimateDataPortal/import_tabbed_readings.py'
--- modules/ClimateDataPortal/import_tabbed_readings.py 1970-01-01 00:00:00 +0000
+++ modules/ClimateDataPortal/import_tabbed_readings.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,154 @@
1
2ClimateDataPortal = local_import("ClimateDataPortal")
3
4from decimal import Decimal
5
6
7 def get_or_create(dictionary, key, creator):
8 try:
9 value = dictionary[key]
10 except KeyError:
11 value = dictionary[key] = creator()
12 return value
13
14import os
15
16class Readings(object):
17 def __init__(
18 self,
19 database_table,
20 null_value,
21 maximum = None,
22 minimum = None
23 ):
24 self.database_table = database_table
25 db(database_table.sample_type == ClimateDataPortal.Observed).delete()
26 self.null_value = null_value
27 self.maximum = maximum
28 self.minimum = minimum
29
30 self.aggregated_values = {}
31
32 def add_reading(self, time_period, reading, out_of_range):
33 if reading != self.null_value:
34 if (
35 (self.minimum is not None and reading < self.minimum) or
36 (self.maximum is not None and reading > self.maximum)
37 ):
38 out_of_range(reading)
39 else:
40 readings = get_or_create(
41 self.aggregated_values,
42 time_period,
43 list
44 )
45 readings.append(reading)
46
47 def done(self, place_id):
48 for month_number, values in self.aggregated_values.iteritems():
49 self.database_table.insert(
50 sample_type = ClimateDataPortal.Observed,
51 time_period = month_number,
52 place_id = place_id,
53 value = sum(values) / len(values)
54 )
55
56import datetime
57
58
59def import_tabbed_readings(
60 folder_name,
61 variables = [],
62 place_ids = None
63):
64 """
65 Expects a folder containing files with name rtXXXX.txt
66
67 each file contains lines of the form e.g.:
681978\t1\t1\t0\t-99.9\t-99.9
69
70representing year, month, day, rainfall(mm), minimum and maximum temperature
71 """
72 observation_station = ClimateDataPortal.observation_station
73
74 null_value = Decimal("-99.9") # -99.9 appears to be the null marker in these files
75
76 for row in db(observation_station).select(observation_station.id):
77 place_id = row.id
78 if place_ids is not None:
79 # avoid certain place ids (to allow importing particular places)
80 start_place, end_place = map(int, place_ids.split(":"))
81 assert start_place <= end_place
82 if place_id < start_place or place_id > end_place:
83 continue
84 print place_id
85
86 data_file_path = os.path.join(folder_name, "rt%04i.txt" % place_id)
87 if not os.path.exists(data_file_path):
88 print "%s not found" % data_file_path
89 else:
90 try:
91 for line in open(data_file_path, "r").readlines():
92 if line:
93 data = line.split()
94 if data:
95 try:
96 year = int(data[0])
97 month = int(data[1])
98 day = int(data[2])
99
100 time_period = ClimateDataPortal.year_month_to_month_number(year, month)
101
102 for variable, reading_data in zip(
103 variables,
104 data[3:6]
105 ):
106 def out_of_range(reading):
107 print "%s/%s/%s: %s out of range" % (
108 day, month, year, reading
109 )
110 reading = Decimal(reading_data)
111 variable.add_reading(
112 time_period,
113 reading,
114 out_of_range = out_of_range
115 )
116
117 except Exception, exception:
118 print exception
119 for variable in variables:
120 variable.done(place_id)
121 except:
122 print line
123 raise
124
125 db.commit()
126 else:
127 print "No stations!"
128
129import sys
130
131null_value = Decimal("-99.9")
132import_tabbed_readings(
133 folder_name = sys.argv[1],
134 variables = [
135 Readings(
136 ClimateDataPortal.rainfall_mm,
137 null_value = null_value,
138 minimum = 0,
139 ),
140 Readings(
141 database_table = ClimateDataPortal.min_temperature_celsius,
142 null_value = null_value,
143 minimum = -120,
144 maximum = 55
145 ),
146 Readings(
147 database_table = ClimateDataPortal.max_temperature_celsius,
148 null_value = null_value,
149 minimum = -120,
150 maximum = 55
151 ),
152 ],
153 place_ids = sys.argv[2] if len(sys.argv) > 2 else None # "start:end" range
154)
0155
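`Readings.done()` above stores, for each month, the mean of all daily readings collected into `aggregated_values`. A standalone sketch of that accumulate-then-average step (names and values are illustrative):

```python
aggregated_values = {}

def add_reading(time_period, reading):
    # collect all daily readings under their month number
    aggregated_values.setdefault(time_period, []).append(reading)

for day_value in (10.0, 20.0, 30.0):
    add_reading(216, day_value)  # month number 216 = January 1978
add_reading(217, 5.0)

monthly_means = dict(
    (month, sum(values) / len(values))
    for month, values in aggregated_values.items()
)
assert monthly_means == {216: 20.0, 217: 5.0}
```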
=== modified file 'modules/s3/s3gis.py'
--- modules/s3/s3gis.py 2011-08-30 06:22:20 +0000
+++ modules/s3/s3gis.py 2011-09-05 13:55:13 +0000
@@ -74,16 +74,11 @@
     Provide an easy, safe, systematic way of handling Debug output
     (print to stdout doesn't work with WSGI deployments)
     """
-    try:
-        output = "S3 Debug: %s" % str(message)
-        if value:
-            output += ": %s" % str(value)
-    except:
-        output = "S3 Debug: %s" % unicode(message)
-        if value:
-            output += ": %s" % unicode(value)
-
-    print >> sys.stderr, output
+    # should be using python's built-in logging module
+    output = u"S3 Debug: %s" % unicode(message)
+    if value:
+        output += u": %s" % unicode(value)
+    sys.stderr.write(output)
 
 SHAPELY = False
 try:
@@ -240,7 +235,6 @@
240 """235 """
241236
242 def __init__(self):237 def __init__(self):
243
244 self.deployment_settings = current.deployment_settings238 self.deployment_settings = current.deployment_settings
245 self.public_url = current.deployment_settings.get_base_public_url()239 self.public_url = current.deployment_settings.get_base_public_url()
246 if not current.db is not None:240 if not current.db is not None:
@@ -308,6 +302,18 @@
         else:
             return wkt
 
+    def debug(self, message, value=None):
+        # should be using python's built-in logging module
+        session = current.session
+        if session.s3.debug:
+            raise Exception(message)
+        else:
+            output = u"S3 Debug: %s" % unicode(message)
+            if value:
+                output += u": %s" % unicode(value)
+            sys.stderr.write(output)
+            session.error = current.T(message)
+
     # -------------------------------------------------------------------------
     def download_kml(self, record_id, filename):
         """
@@ -329,11 +335,11 @@
         db = current.db
 
         layer = KMLLayer(self)
+
         query = (layer.table.id == record_id)
         record = db(query).select(limitby=(0, 1)).first()
         url = record.url
 
-        layer.add_record(record)
         cachepath = layer.cachepath
         filepath = os.path.join(cachepath, filename)
 
@@ -430,7 +436,7 @@
             myfile = zipfile.ZipFile(fp)
             try:
                 file = myfile.read("doc.kml")
-            except:
+            except: # Naked except!!
                 file = myfile.read(myfile.infolist()[0].filename)
             myfile.close()
 
@@ -534,7 +540,7 @@
             try:
                 lon = features[0].lon
                 simple = True
-            except:
+            except AttributeError:
                 simple = False
 
             for feature in features:
@@ -547,7 +553,7 @@
                     # A Join
                     lon = feature.gis_location.lon
                     lat = feature.gis_location.lat
-                except:
+                except AttributeError:
                     # Skip any rows without the necessary lat/lon fields
                     continue
 
@@ -988,7 +994,8 @@
             _marker = db.gis_marker
             _projection = db.gis_projection
             have_tables = _config and _projection
-        except:
+        except Exception, exception:
+            self.debug(exception)
             have_tables = False
 
         row = None
@@ -1005,16 +1012,13 @@
         if not row:
             if auth.is_logged_in():
                 # Read personalised config, if available.
-                try:
-                    query = (db.pr_person.uuid == auth.user.person_uuid) & \
-                            (_config.pe_id == db.pr_person.pe_id) & \
-                            (_marker.id == _config.marker_id) & \
-                            (_projection.id == _config.projection_id)
-                    row = db(query).select(limitby=(0, 1)).first()
-                    if row:
-                        config_id = row["gis_config"].id
-                except:
-                    pass
+                query = (db.pr_person.uuid == auth.user.person_uuid) & \
+                        (_config.pe_id == db.pr_person.pe_id) & \
+                        (_marker.id == _config.marker_id) & \
+                        (_projection.id == _config.projection_id)
+                row = db(query).select(limitby=(0, 1)).first()
+                if row:
+                    config_id = row["gis_config"].id
             if not row:
                 # No personal config or not logged in. Use site default.
                 config_id = 1
@@ -1151,7 +1155,7 @@
         if level:
             try:
                 return location_hierarchy[level]
-            except:
+            except KeyError:
                 return level
         else:
             return location_hierarchy
@@ -1200,7 +1204,8 @@
         if level:
             try:
                 return all_levels[level]
-            except:
+            except Exception, exception:
+
                 return level
         else:
             return all_levels
@@ -1475,7 +1480,7 @@
                 represent = db(table.id == value).select(table.name,
                                                          cache=cache,
                                                          limitby=(0, 1)).first().name
-            except:
+            except: # @ToDo: provide specific exception
                 # Keep the default from earlier
                 pass
 
@@ -1515,24 +1520,21 @@
                 lat_max = location.lat_max
 
             else:
-                s3_debug("Location searched within isn't a Polygon!")
-                session.error = T("Location searched within isn't a Polygon!")
+                self.debug("Location searched within isn't a Polygon!")
                 return None
-        except:
+        except: # @ToDo: need specific exception
             wkt = location
             if (wkt.startswith("POLYGON") or wkt.startswith("MULTIPOLYGON")):
                 # ok
                 lon_min = None
             else:
-                s3_debug("This isn't a Polygon!")
-                session.error = T("This isn't a Polygon!")
+                self.debug("This isn't a Polygon!")
                 return None
 
         try:
             polygon = wkt_loads(wkt)
-        except:
-            s3_debug("Invalid Polygon!")
-            session.error = T("Invalid Polygon!")
+        except: # @ToDo: need specific exception
+            self.debug("Invalid Polygon!")
             return None
 
         table = db[tablename]
@@ -1540,8 +1542,7 @@
 
         if "location_id" not in table.fields():
             # @ToDo: Add any special cases to be able to find the linked location
-            s3_debug("This table doesn't have a location_id!")
-            session.error = T("This table doesn't have a location_id!")
+            self.debug("This table doesn't have a location_id!")
             return None
 
         query = (table.location_id == locations.id)
@@ -1573,7 +1574,10 @@
                     # Save Record
                     output.records.append(row)
                 except shapely.geos.ReadingError:
-                    s3_debug("Error reading wkt of location with id", row.id)
+                    self.debug(
+                        "Error reading wkt of location with id",
+                        value=row.id
+                    )
             else:
                 # 1st check for Features included within the bbox (faster)
                 def in_bbox(row):
@@ -1599,7 +1603,10 @@
                         # Save Record
                         output.records.append(row)
                     except shapely.geos.ReadingError:
-                        s3_debug("Error reading wkt of location with id", row.id)
+                        self.debug(
+                            "Error reading wkt of location with id",
+                            value = row.id,
+                        )
 
         return output
 
@@ -2075,38 +2082,41 @@
             current_row += 1
             try:
                 name0 = row.pop("ADM0_NAME")
-            except:
+            except KeyError:
                 name0 = ""
             try:
                 name1 = row.pop("ADM1_NAME")
-            except:
+            except KeyError:
                 name1 = ""
             try:
                 name2 = row.pop("ADM2_NAME")
-            except:
+            except KeyError:
                 name2 = ""
             try:
                 name3 = row.pop("ADM3_NAME")
-            except:
+            except KeyError:
                 name3 = ""
             try:
                 name4 = row.pop("ADM4_NAME")
-            except:
+            except KeyError:
                 name4 = ""
             try:
                 name5 = row.pop("ADM5_NAME")
-            except:
+            except KeyError:
                 name5 = ""
 
             if not name5 and not name4 and not name3 and \
                not name2 and not name1:
                 # We need a name! (L0's are already in DB)
-                s3_debug("No name provided", current_row)
+                self.debug(
+                    "No name provided",
+                    current_row,
+                )
                 continue
 
             try:
                 wkt = row.pop("WKT")
-            except:
+            except KeyError:
                 wkt = None
             try:
                 lat = row.pop("LAT")
@@ -2115,21 +2125,17 @@
                 lat = None
                 lon = None
 
-            if domain:
-                try:
-                    uuid = "%s/%s" % (domain,
-                                      row.pop("UUID"))
-                except:
-                    uuid = ""
-            else:
-                try:
-                    uuid = row.pop("UUID")
-                except:
-                    uuid = ""
+            try:
+                uuid = row.pop("UUID")
+            except KeyError:
+                uuid = ""
+            else:
+                if domain:
+                    uuid = "%s/%s" % (domain, uuid)
 
             try:
                 code = row.pop("CODE")
-            except:
+            except KeyError:
                 code = ""
 
             population = ""
2175 # Calculate Centroid & Bounds2181 # Calculate Centroid & Bounds
2176 if wkt:2182 if wkt:
2177 try:2183 try:
2178 # Valid WKT2184 # Valid WKT
2179 shape = wkt_loads(wkt)2185 shape = wkt_loads(wkt)
2180 centroid_point = shape.centroid2186 centroid_point = shape.centroid
2181 lon = centroid_point.x2187 lon = centroid_point.x
@@ -2189,8 +2195,8 @@
2189 feature_type = 1 # Point2195 feature_type = 1 # Point
2190 else:2196 else:
2191 feature_type = 3 # Polygon2197 feature_type = 3 # Polygon
2192 except:2198 except: # @ToDo: provide specific exception
2193 s3_debug("Invalid WKT", name)2199 self.debug("Invalid WKT", name)
2194 continue2200 continue
2195 else:2201 else:
2196 lon_min = lon_max = lon2202 lon_min = lon_max = lon
@@ -2234,8 +2240,8 @@
2234 else:2240 else:
2235 path += "%s/" % _parent.id2241 path += "%s/" % _parent.id
2236 else:2242 else:
2237 s3_debug("Location", name)2243 self.debug("Location", name)
2238 s3_debug("Parent cannot be found", parent)2244 self.debug("Parent cannot be found", parent)
2239 parent = ""2245 parent = ""
22402246
2241 # Check for duplicates2247 # Check for duplicates
@@ -2245,8 +2251,8 @@
2245 duplicate = db(query).select(table.id, limitby=(0, 1)).first()2251 duplicate = db(query).select(table.id, limitby=(0, 1)).first()
22462252
2247 if duplicate:2253 if duplicate:
2248 s3_debug("Location", name)2254 self.debug("Location", name)
2249 s3_debug("Duplicate - updating...")2255 self.debug("Duplicate - updating...")
2250 path += str(duplicate.id)2256 path += str(duplicate.id)
2251 # Update with any new information2257 # Update with any new information
2252 query = (table.id == duplicate.id)2258 query = (table.id == duplicate.id)
@@ -2329,7 +2335,7 @@
2329 else:2335 else:
2330 cached = False2336 cached = False
2331 if not os.access(cachepath, os.W_OK):2337 if not os.access(cachepath, os.W_OK):
2332 s3_debug("Folder not writable", cachepath)2338 self.debug("Folder not writable", cachepath)
2333 return2339 return
23342340
2335 if not cached:2341 if not cached:
@@ -2338,11 +2344,11 @@
2338 f = fetch(url)2344 f = fetch(url)
2339 except (urllib2.URLError,):2345 except (urllib2.URLError,):
2340 e = sys.exc_info()[1]2346 e = sys.exc_info()[1]
2341 s3_debug("URL Error", e)2347 self.debug("URL Error", e)
2342 return2348 return
2343 except (urllib2.HTTPError,):2349 except (urllib2.HTTPError,):
2344 e = sys.exc_info()[1]2350 e = sys.exc_info()[1]
2345 s3_debug("HTTP Error", e)2351 self.debug("HTTP Error", e)
2346 return2352 return
23472353
2348 # Unzip File2354 # Unzip File
@@ -2355,8 +2361,8 @@
2355 # For now, 2.5 users need to download/unzip manually to cache folder2361 # For now, 2.5 users need to download/unzip manually to cache folder
2356 myfile.extract(filename, cachepath)2362 myfile.extract(filename, cachepath)
2357 myfile.close()2363 myfile.close()
2358 except:2364 except IOError:
2359 s3_debug("Zipfile contents don't seem correct!")2365 self.debug("Zipfile contents don't seem correct!")
2360 myfile.close()2366 myfile.close()
2361 return2367 return
23622368
@@ -2472,7 +2478,7 @@
2472 # Should be just a single parent2478 # Should be just a single parent
2473 break2479 break
2474 except shapely.geos.ReadingError:2480 except shapely.geos.ReadingError:
2475 s3_debug("Error reading wkt of location with id", row.id)2481 self.debug("Error reading wkt of location with id", row.id)
24762482
2477 # Add entry to database2483 # Add entry to database
2478 table.insert(uuid=uuid,2484 table.insert(uuid=uuid,
@@ -2492,7 +2498,7 @@
2492 else:2498 else:
2493 continue2499 continue
24942500
2495 s3_debug("All done!")2501 self.debug("All done!")
2496 return2502 return
24972503
2498 # -------------------------------------------------------------------------2504 # -------------------------------------------------------------------------
@@ -2735,7 +2741,7 @@
27352741
2736 db = current.db2742 db = current.db
2737 in_bbox = self.query_features_by_bbox(*shape.bounds)2743 in_bbox = self.query_features_by_bbox(*shape.bounds)
2738 has_wkt = (db.gis_location.wkt != None) & (db.gis_location.wkt != '')2744 has_wkt = (db.gis_location.wkt != None) & (db.gis_location.wkt != "")
27392745
2740 for loc in db(in_bbox & has_wkt).select():2746 for loc in db(in_bbox & has_wkt).select():
2741 try:2747 try:
@@ -2743,7 +2749,7 @@
2743 if location_shape.intersects(shape):2749 if location_shape.intersects(shape):
2744 yield loc2750 yield loc
2745 except shapely.geos.ReadingError:2751 except shapely.geos.ReadingError:
2746 s3_debug("Error reading wkt of location with id", loc.id)2752 self.debug("Error reading wkt of location with id", loc.id)
27472753
2748 # -------------------------------------------------------------------------2754 # -------------------------------------------------------------------------
2749 def _get_features_by_latlon(self, lat, lon):2755 def _get_features_by_latlon(self, lat, lon):
@@ -2799,7 +2805,7 @@
2799 try :2805 try :
2800 shape = wkt_loads(location.wkt)2806 shape = wkt_loads(location.wkt)
2801 except:2807 except:
2802 s3_debug("Error reading WKT", location.wkt)2808 self.debug("Error reading WKT", location.wkt)
2803 continue2809 continue
2804 bounds = shape.bounds2810 bounds = shape.bounds
2805 table[location.id] = dict(2811 table[location.id] = dict(
@@ -2950,7 +2956,12 @@
2950 map_width = width2956 map_width = width
2951 else:2957 else:
2952 map_width = config.map_width2958 map_width = config.map_width
2953 if bbox and (-90 < bbox["max_lat"] < 90) and (-90 < bbox["min_lat"] < 90) and (-180 < bbox["max_lon"] < 180) and (-180 < bbox["min_lon"] < 180):2959 if (bbox
2960 and (-90 < bbox["max_lat"] < 90)
2961 and (-90 < bbox["min_lat"] < 90)
2962 and (-180 < bbox["max_lon"] < 180)
2963 and (-180 < bbox["min_lon"] < 180)
2964 ):
2954 # We have sane Bounds provided, so we should use them2965 # We have sane Bounds provided, so we should use them
2955 pass2966 pass
2956 else:2967 else:
@@ -2974,15 +2985,21 @@
2974 projection = config.epsg2985 projection = config.epsg
29752986
29762987
2977 if projection != 900913 and projection != 4326:2988 if projection not in (900913, 4326):
2978 # Test for Valid Projection file in Proj4JS library2989 # Test for Valid Projection file in Proj4JS library
2979 projpath = os.path.join(request.folder, "static", "scripts", "gis", "proj4js", "lib", "defs", "EPSG%s.js" % projection)2990 projpath = os.path.join(
2991 request.folder, "static", "scripts", "gis", "proj4js", \
2992 "lib", "defs", "EPSG%s.js" % projection
2993 )
2980 try:2994 try:
2981 f = open(projpath, "r")2995 f = open(projpath, "r")
2982 f.close()2996 f.close()
2983 except:2997 except:
2984 session.error = "%s /static/scripts/gis/proj4js/lib/defs" % T("Projection not supported - please add definition to")2998 session.error = "'%s' %s /static/scripts/gis/proj4js/lib/defs" % (
2985 redirect(URL(c="gis", f="projection"))2999 projection,
3000 T("Projection not supported - please add definition to")
3001 )
3002 redirect(URL(r=request, c="gis", f="projection"))
29863003
2987 units = config.units3004 units = config.units
2988 maxResolution = config.maxResolution3005 maxResolution = config.maxResolution
@@ -3055,11 +3072,12 @@
3055 #########3072 #########
3056 # Scripts3073 # Scripts
3057 #########3074 #########
3075
3058 def add_javascript(script):3076 def add_javascript(script):
3059 if type(script) == SCRIPT:3077 if type(script) == SCRIPT:
3060 html.append(script)3078 html.append(script)
3061 elif script.startswith("http"):3079 elif script.startswith("http"):
3062 html.append(3080 html.append(
3063 SCRIPT(_type="text/javascript",3081 SCRIPT(_type="text/javascript",
3064 _src=script))3082 _src=script))
3065 else:3083 else:
@@ -3069,7 +3087,7 @@
30693087
3070 debug = session.s3.debug3088 debug = session.s3.debug
3071 if debug:3089 if debug:
3072 if projection != 900913 and projection != 4326:3090 if projection not in (900913, 4326):
3073 add_javascript("scripts/gis/proj4js/lib/proj4js-combined.js")3091 add_javascript("scripts/gis/proj4js/lib/proj4js-combined.js")
3074 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)3092 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)
30753093
@@ -3082,7 +3100,7 @@
3082 add_javascript("scripts/gis/usng2.js")3100 add_javascript("scripts/gis/usng2.js")
3083 add_javascript("scripts/gis/MP.js")3101 add_javascript("scripts/gis/MP.js")
3084 else:3102 else:
3085 if projection != 900913 and projection != 4326:3103 if projection not in (900913, 4326):
3086 add_javascript("scripts/gis/proj4js/lib/proj4js-compressed.js")3104 add_javascript("scripts/gis/proj4js/lib/proj4js-compressed.js")
3087 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)3105 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)
3088 add_javascript("scripts/gis/OpenLayers.js")3106 add_javascript("scripts/gis/OpenLayers.js")
@@ -3184,20 +3202,20 @@
3184 # If we do come back to it, then it should be moved to static3202 # If we do come back to it, then it should be moved to static
3185 if print_tool:3203 if print_tool:
3186 url = print_tool["url"]3204 url = print_tool["url"]
3187 url+'' # check url can be concatenated with strings3205 url+"" # check url can be concatenated with strings
3188 if "title" in print_tool:3206 if "title" in print_tool:
3189 mapTitle = str(print_tool["mapTitle"])3207 mapTitle = unicode(print_tool["mapTitle"])
3190 else:3208 else:
3191 mapTitle = str(T("Map from Sahana Eden"))3209 mapTitle = unicode(T("Map from Sahana Eden"))
3192 if "subtitle" in print_tool:3210 if "subtitle" in print_tool:
3193 subTitle = str(print_tool["subTitle"])3211 subTitle = unicode(print_tool["subTitle"])
3194 else:3212 else:
3195 subTitle = str(T("Printed from Sahana Eden"))3213 subTitle = unicode(T("Printed from Sahana Eden"))
3196 if session.auth:3214 if session.auth:
3197 creator = session.auth.user.email3215 creator = unicode(session.auth.user.email)
3198 else:3216 else:
3199 creator = ""3217 creator = ""
3200 print_tool1 = "".join(("""3218 print_tool1 = u"".join(("""
3201 if (typeof(printCapabilities) != 'undefined') {3219 if (typeof(printCapabilities) != 'undefined') {
3202 // info.json from script headers OK3220 // info.json from script headers OK
3203 printProvider = new GeoExt.data.PrintProvider({3221 printProvider = new GeoExt.data.PrintProvider({
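Reviewer note on the str-to-unicode changes above: web2py's `T()` returns lazy translation objects whose rendered text may contain non-ASCII characters, and under Python 2 `str()` implicitly encodes to ASCII, raising `UnicodeEncodeError` for such translations. A sketch of the failure mode, written with an explicit `encode("ascii")` since the implicit coercion is Python-2-only (the sample label is invented):

```python
# A translated label such as T() might return for a French locale
label = u"Imprim\u00e9 depuis Sahana Eden"  # "Imprimé depuis Sahana Eden"

# Python 2's str(label) would effectively do label.encode("ascii"),
# which fails on the accented character; unicode(label) does not.
try:
    label.encode("ascii")
    ascii_safe = True
except UnicodeEncodeError:
    ascii_safe = False

print(ascii_safe)
```

This is why the hunk also switches the joined template to `u"".join((...))`: joining unicode fragments with a byte string would trigger the same implicit ASCII coercion.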
@@ -3221,7 +3239,7 @@
         // printProvider: printProvider
     //});
     // A layer to display the print page extent
-    //var pageLayer = new OpenLayers.Layer.Vector('""", str(T("Print Extent")), """');
+    //var pageLayer = new OpenLayers.Layer.Vector('""", unicode(T("Print Extent")), """');
     //pageLayer.addFeatures(printPage.feature);
     //pageLayer.setVisibility(false);
     //map.addLayer(pageLayer);
@@ -3237,7 +3255,7 @@
     //});
     // The form with fields controlling the print output
     S3.gis.printFormPanel = new Ext.form.FormPanel({
-        title: '""", str(T("Print Map")), """',
+        title: '""", unicode(T("Print Map")), """',
         rootVisible: false,
         split: true,
         autoScroll: true,
@@ -3250,7 +3268,7 @@
         defaults: {anchor: '100%%'},
         listeners: {
             'expand': function() {
-                //if (null == mapPanel.map.getLayersByName('""", str(T("Print Extent")), """')[0]) {
+                //if (null == mapPanel.map.getLayersByName('""", unicode(T("Print Extent")), """')[0]) {
                 //    mapPanel.map.addLayer(pageLayer);
                 //}
                 if (null == mapPanel.plugins[0]) {
@@ -3278,7 +3296,7 @@
             xtype: 'textarea',
             name: 'comment',
             value: '',
-            fieldLabel: '""", str(T("Comment")), """',
+            fieldLabel: '""", unicode(T("Comment")), """',
             plugins: new GeoExt.plugins.PrintPageField({
                 printPage: printPage
             })
@@ -3286,7 +3304,7 @@
             xtype: 'combo',
             store: printProvider.layouts,
             displayField: 'name',
-            fieldLabel: '""", str(T("Layout")), """',
+            fieldLabel: '""", T("Layout").decode("utf-8"), """',
             typeAhead: true,
             mode: 'local',
             triggerAction: 'all',
@@ -3297,7 +3315,7 @@
             xtype: 'combo',
             store: printProvider.dpis,
             displayField: 'name',
-            fieldLabel: '""", str(T("Resolution")), """',
+            fieldLabel: '""", unicode(T("Resolution")), """',
             tpl: '<tpl for="."><div class="x-combo-list-item">{name} dpi</div></tpl>',
             typeAhead: true,
             mode: 'local',
@@ -3314,7 +3332,7 @@
         //    xtype: 'combo',
         //    store: printProvider.scales,
         //    displayField: 'name',
-        //    fieldLabel: '""", str(T("Scale")), """',
+        //    fieldLabel: '""", unicode(T("Scale")), """',
         //    typeAhead: true,
         //    mode: 'local',
         //    triggerAction: 'all',
@@ -3324,13 +3342,13 @@
         //}, {
         //    xtype: 'textfield',
         //    name: 'rotation',
-        //    fieldLabel: '""", str(T("Rotation")), """',
+        //    fieldLabel: '""", unicode(T("Rotation")), """',
         //    plugins: new GeoExt.plugins.PrintPageField({
         //        printPage: printPage
         //    })
         }],
         buttons: [{
-            text: '""", str(T("Create PDF")), """',
+            text: '""", unicode(T("Create PDF")), """',
             handler: function() {
                 // the PrintExtent plugin is the mapPanel's 1st plugin
                 //mapPanel.plugins[0].print();
@@ -3348,7 +3366,7 @@
     } else {
         // Display error diagnostic
         S3.gis.printFormPanel = new Ext.Panel ({
-            title: '""", str(T("Print Map")), """',
+            title: '""", unicode(T("Print Map")), """',
             rootVisible: false,
             split: true,
             autoScroll: true,
@@ -3359,7 +3377,7 @@
             bodyStyle: 'padding:5px',
             labelAlign: 'top',
             defaults: {anchor: '100%'},
-            html: '""", str(T("Printing disabled since server not accessible")), """: <BR />""", url, """'
+            html: '""", unicode(T("Printing disabled since server not accessible")), """: <BR />""", unicode(url), """'
         });
     }
 """))
@@ -3445,40 +3463,40 @@
         name_safe = re.sub("'", "", layer.name)
         if layer.url2:
             url2 = """,
-    url2: '%s'""" % layer.url2
+    "url2": "%s\"""" % layer.url2
         else:
             url2 = ""
         if layer.url3:
             url3 = """,
-    url3: '%s'""" % layer.url3
+    "url3": "%s\"""" % layer.url3
         else:
             url3 = ""
         if layer.base:
             base = ""
         else:
             base = """,
-    isBaseLayer: false"""
+    "isBaseLayer": false"""
         if layer.visible:
             visibility = ""
         else:
             visibility = """,
-    visibility: false"""
+    "visibility": false"""
         if layer.attribution:
             attribution = """,
-    attribution: '%s'""" % layer.attribution
+    "attribution": %s""" % repr(layer.attribution)
         else:
             attribution = ""
         if layer.zoom_levels is not None and layer.zoom_levels != 19:
             zoomLevels = """,
-    zoomLevels: %i""" % layer.zoom_levels
+    "zoomLevels": %i""" % layer.zoom_levels
         else:
             zoomLevels = ""
 
         # Generate JS snippet to pass to static
         layers_osm += """
 S3.gis.layers_osm[%i] = {
-    name: '%s',
-    url1: '%s'%s%s%s%s%s%s
+    "name": "%s",
+    "url1": "%s"%s%s%s%s%s%s
 }
 """ % (counter,
        name_safe,
@@ -3490,6 +3508,7 @@
        attribution,
        zoomLevels)
 
+
     # ---------------------------------------------------------------------
     # XYZ
     # @ToDo: Migrate to Class/Static
@@ -3681,7 +3700,7 @@
 
             if "active" in layer and not layer["active"]:
                 visibility = """,
-    visibility: false"""
+    "visibility": false"""
             else:
                 visibility = ""
 
@@ -3712,32 +3731,33 @@
                 marker_url = ""
             if marker_url:
                 markerLayer = """,
-    marker_url: '%s',
-    marker_height: %i,
-    marker_width: %i""" % (marker_url, marker_height, marker_width)
+    "marker_url": "%s",
+    "marker_height": %i,
+    "marker_width": %i""" % (marker_url, marker_height, marker_width)
 
             if "opacity" in layer and layer["opacity"] != 1:
                 opacity = """,
-    opacity: %.1f""" % layer["opacity"]
+    "opacity": %.1f""" % layer["opacity"]
             else:
                 opacity = ""
             if "cluster_distance" in layer and layer["cluster_distance"] != self.cluster_distance:
                 cluster_distance = """,
-    cluster_distance: %i""" % layer["cluster_distance"]
+    "cluster_distance": %i""" % layer["cluster_distance"]
             else:
                 cluster_distance = ""
            if "cluster_threshold" in layer and layer["cluster_threshold"] != self.cluster_threshold:
                 cluster_threshold = """,
-    cluster_threshold: %i""" % layer["cluster_threshold"]
+    "cluster_threshold": %i""" % layer["cluster_threshold"]
             else:
                 cluster_threshold = ""
 
             # Generate JS snippet to pass to static
             layers_feature_queries += """
 S3.gis.layers_feature_queries[%i] = {
-    name: '%s',
-    url: '%s'%s%s%s%s%s
-}""" % (counter,
+    "name": "%s",
+    "url": "%s"%s%s%s%s%s
+}
+""" % (counter,
        name,
        url,
        visibility,
@@ -3745,36 +3765,44 @@
        opacity,
        cluster_distance,
        cluster_threshold)
 
         # ---------------------------------------------------------------------
         # Add Layers from the Catalogue
         # ---------------------------------------------------------------------
         layers_config = ""
         if catalogue_layers:
             for LayerType in [
                 #OSMLayer,
-                Bing,
-                Google,
-                Yahoo,
+                BingLayer,
+                GoogleLayer,
+                YahooLayer,
                 TMSLayer,
                 WMSLayer,
                 FeatureLayer,
                 GeoJSONLayer,
                 GeoRSSLayer,
                 GPXLayer,
                 KMLLayer,
                 WFSLayer
             ]:
-                # Instantiate the Class
-                layer = LayerType(self)
-                layer_type_js = layer.as_javascript()
-                if layer_type_js:
-                    # Add to the output JS
-                    layers_config = "".join((layers_config,
-                                             layer_type_js))
-                    if layer.scripts:
-                        for script in layer.scripts:
-                            add_javascript(script)
+                try:
+                    # Instantiate the Class
+                    layer = LayerType(self)
+                    layer_type_js = layer.as_javascript()
+                    if layer_type_js:
+                        # Add to the output JS
+                        layers_config = "".join((layers_config,
+                                                 layer_type_js))
+                        if layer.scripts:
+                            for script in layer.scripts:
+                                add_javascript(script)
+                except Exception, exception:
+                    if debug:
+                        raise
+                    else:
+                        session.warning.append(
+                            LayerType.__name__ + " not shown due to error"
+                        )
 
@@ -3848,6 +3876,7 @@
             "S3.gis.marker_default_width = %i;\n" % marker_default.width,
             osm_auth,
             layers_osm,
+            layers_feature_queries,
             _features,
             layers_config,
             # i18n Labels
@@ -3880,6 +3909,7 @@
             ))))
 
         # Static Script
+
         if debug:
             add_javascript("scripts/S3/s3.gis.js")
             add_javascript("scripts/S3/s3.gis.layers.js")
@@ -3889,7 +3919,6 @@
 
         # Dynamic Script (stuff which should, as far as possible, be moved to static)
         html.append(SCRIPT(layers_js + \
-                           #layers_xyz + \
                            print_tool1))
 
         # Set up map plugins
@@ -3904,8 +3933,7 @@
 
         return html
 
-
-# =============================================================================
+# -----------------------------------------------------------------------------
 class Marker(object):
     """ Represents a Map Marker """
     def __init__(self, gis, id=None):
@@ -3938,6 +3966,13 @@
         #self.url = URL(c="static", f="img",
         #               args=["markers", marker.image])
 
+    def add_attributes_to_output(self, output):
+        output.update(
+            marker_image = self.image,
+            marker_height = self.height,
+            marker_width = self.width,
+        )
+
 # -----------------------------------------------------------------------------
 class Projection(object):
     """ Represents a Map Projection """
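Reviewer note on the new `Marker.add_attributes_to_output` above: it lets a layer merge the marker's display attributes into the dict it is building for JSON output, rather than having every caller reach into the Marker's fields. A simplified standalone sketch (the constructor here takes plain values instead of a `gis` handle and record id, and the sample marker data is invented):

```python
class Marker(object):
    def __init__(self, image, height, width):
        self.image = image
        self.height = height
        self.width = width

    def add_attributes_to_output(self, output):
        # Merge this marker's display attributes into a layer's output dict
        output.update(
            marker_image=self.image,
            marker_height=self.height,
            marker_width=self.width,
        )

output = {"name": "Hospitals"}
Marker("marker_red.png", 34, 20).add_attributes_to_output(output)
print(sorted(output))
```

Keeping the key names in one place means a rename only touches the Marker class, not every layer that serialises markers.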
@@ -3963,41 +3998,34 @@
             self.epsg = projection.epsg
 
 # -----------------------------------------------------------------------------
+
+def config_dict(mandatory, defaulted):
+    d = dict(mandatory)
+    for key, (value, defaults) in defaulted.iteritems():
+        if value not in defaults:
+            d[key] = value
+    return d
+
+
+# the layer code only needs to do:
+# any database lookups to get extra data
+# security checks.
+
+# then it generates appropriate JSON strings.
+
 class Layer(object):
     """
-        Base Class for Layers
-        Not meant to be instantiated direct
+        Abstract Base Class for Layers
     """
-    def __init__(self, gis, record=None):
+    def __init__(self, gis):
+        self.gis = gis
         db = current.db
-
-        self.gis = gis
-        # This usually arrives later
-        self.record = record
-        # Ensure all attributes available (even if Null)
-        self._refresh()
-        self.scripts = []
-
-    def add_record(self, record):
-        """
-            Update the record & refresh the attributes
-        """
-        if record:
-            self.record = record
-            self._refresh()
-        else:
-            return
-
-    def as_dict(self):
-        """
-            Output the Layer as a Python dictionary
-            - this is used to build a JSON of the overall dict of layers
-        """
-        record = self.record
-        if record:
-            return record
-        else:
-            return
+        try:
+            self.table = db[self.table_name]
+        except:
+            current.manager.load(self.table_name)
+            self.table = db[tablename]
+
 
     def as_json(self):
         """
@@ -4009,663 +4037,480 @@
4009 else:4037 else:
4010 return4038 return
40114039
4012 def as_javascript(self):4040
4013 """4041
4014 Output the Layer as Javascript4042# -----------------------------------------------------------------------------
4015 - suitable for inclusion in the HTML page4043class SingleRecordLayer(Layer):
4016 """4044 """
4017 gis = self.gis4045 Abstract Base Class for Layers with just a single record
4018 auth = gis.auth4046 """
4019 db = current.db4047
4020 table = self.table4048 def __init__(self, gis):
40214049 super(SingleRecordLayer, self).__init__(gis)
4022 layer_type_list = []4050 table = self.table
4023 # Read the enabled Layers4051 records = current.db(table.id > 0).select()
4024 records = db(table.enabled == True).select()4052 assert len(records) <= 1, (
4025 for record in records:4053 "There should only ever be 0 or 1 %s" % self.__class__.__name__
4026 # Check user is allowed to access the layer4054 )
4027 role_required = record.role_required4055 self.record = None
4028 if (not role_required) or auth.s3_has_role(role_required):4056 record = records.first()
4029 # Pass the record to the Class4057 if record is not None:
4030 self.add_record(record)4058 if record.enabled:
4031 # Read the output dict for this layer4059 role_required = record.role_required
4032 layer_dict = self.as_dict()4060 if not role_required or self.gis.auth.s3_has_role(role_required):
4033 if layer_dict:4061 self.record = record
4034 # Add this layer to the list of layers for this layer type4062 # Refresh the attributes of the Layer
4035 layer_type_list.append(layer_dict)4063 if "apikey" in table:
40364064 if record:
4037 if layer_type_list:4065 self.apikey = record.apikey
4038 # Output the Layer Type as JSON4066 else:
4039 layer_type_json = json.dumps(layer_type_list,4067 self.apikey = None
4040 sort_keys=True,4068 self.gis = gis
4041 indent=4)4069 self.scripts = []
4042 layer_type_js = "".join(("%s = " % self.js_array,4070
4043 layer_type_json,4071 def as_javascript(self):
4044 "\n"))4072 """
4045 return layer_type_js4073 Output the Layer as Javascript
40464074 - suitable for inclusion in the HTML page
4047 def _name_safe(self, name):4075 """
4048 """4076 if self.record:
4049 Make the name safe for use in JSON4077 if "apikey" in self.table and not self.apikey:
4050 i.e. any Unicode character allowed except for " & \4078 raise Exception("Cannot display a %s if we have no valid API Key" % self.__class__.__name__)
4051 """4079 json = self.as_json()
4052 return re.sub('[\\"]', "", name)4080 if json:
40534081 return "%s = %s\n" % (
4054 def _refresh(self):4082 self.js_array,
4055 " Refresh the attributes of the Layer "4083 json
4056 table = self.table4084 )
4057 if "marker_id" in table:4085 else:
4058 self.set_marker()4086 return None
4059 if "projection_id" in table:4087 else:
4060 self.set_projection()4088 return None
40614089
4062 def set_marker(self):4090# -----------------------------------------------------------------------------
4063 " Set the Marker for the Layer "4091class BingLayer(SingleRecordLayer):
4064 gis = self.gis4092 """ Bing Layer from Catalogue """
4065 record = self.record4093 table_name = "gis_layer_bing"
4066 if record:4094 js_array = "S3.gis.Bing"
4067 marker = Marker(gis, record.marker_id)4095
4068 self.marker = marker
4069 else:
4070 self.marker = None
4071
4072 def set_projection(self):
4073 " Set the Projection for the Layer "
4074 gis = self.gis
4075 record = self.record
4076 if record:
4077 projection = Projection(gis, record.projection_id)
4078 self.projection = projection
4079 else:
4080 self.projection = None
4081
4082# -----------------------------------------------------------------------------
4083class OneLayer(Layer):
4084 """
4085 Base Class for Layers with just a single record
4086 Not meant to be instantiated direct
4087 """
4088
4089 def __init__(self, gis, record=None):
4090 db = current.db
4091 tablename = ""
4092 try:
4093 table = db[tablename]
4094 except:
4095 current.manager.load(tablename)
4096 table = db[tablename]
4097 if not record:
4098 # There is only ever 1 layer
4099 record = db(table.id > 0).select().first()
4100
4101 self.gis = gis
4102 self.table = table
4103 self.js_array = "S3.gis.OneLayer"
4104 self.record = record
4105 self._refresh()
4106 self.scripts = []
4107
4108 def as_javascript(self):
4109 """
4110 Output the Layer as Javascript
4111 - suitable for inclusion in the HTML page
4112 """
4113 auth = self.gis.auth
4114 record = self.record
4115 # Check Layer exists in the DB
4116 if not record:
4117 return None
4118 # Check Layer is enabled
4119 if not record.enabled:
4120 return None
4121 # Check user is allowed to access the Layer
4122 role_required = record.role_required
4123 if role_required and not auth.s3_has_role(role_required):
4124 return None
4125 # Read the output JSON for this layer
4126 layer_type_json = self.as_json()
4127 layer_type_js = "".join(("%s = " % self.js_array,
4128 layer_type_json,
4129 "\n"))
4130 return layer_type_js
4131
4132 def _set_api_key(self):
4133 " Set the API Key for the Layer "
4134 record = self.record
4135 if record:
4136 self.apikey = record.apikey
4137 else:
4138 self.apikey = None
4139
4140 def _refresh(self):
4141 " Refresh the attributes of the Layer "
4142 table = self.table
4143 if "apikey" in table:
4144 self._set_api_key()
4145
4146# -----------------------------------------------------------------------------
4147class Bing(OneLayer):
4148 """ Bing Layers from Catalogue """
4149 def __init__(self, gis, record=None):
4150 db = current.db
4151 tablename = "gis_layer_bing"
4152 try:
4153 table = db[tablename]
4154 except:
4155 current.manager.load(tablename)
4156 table = db[tablename]
4157 if not record:
4158 # There is only ever 1 layer
4159 record = db(table.id > 0).select().first()
4160
4161 self.gis = gis
4162 self.table = table
4163 self.js_array = "S3.gis.Bing"
4164 self.record = record
4165 self._refresh()
4166 self.scripts = []
4167
-    def as_dict(self):
-        gis = self.gis
-        record = self.record
-        apikey = self.apikey
-
-        if not apikey:
-            # Cannot display Bing layers if we have no valid API Key
-            return None
-
-        config = gis.get_config()
-        if Projection(gis, id=config.projection_id).epsg != 900913:
-            # Cannot display Bing layers unless we're using the
-            # Spherical Mercator Projection
-            return None
-
-        # Mandatory attributes
-        output = {
-            "ApiKey": self.apikey
-            }
-
-        # Attributes which are defaulted client-side if not set
-        if record.aerial_enabled:
-            output["Aerial"] = record.aerial or "Bing Satellite"
-        if record.road_enabled:
-            output["Road"] = record.road or "Bing Roads"
-        if record.hybrid_enabled:
-            output["Hybrid"] = record.hybrid or "Bing Hybrid"
-
-        return output
+    def as_dict(self):
+        gis = self.gis
+        record = self.record
+        if record is not None:
+            config = self.gis.get_config()
+            if Projection(gis, id=config.projection_id).epsg != 900913:
+                raise Exception("Cannot display Bing layers unless we're using the Spherical Mercator Projection")
+            else:
+                # Mandatory attributes
+                output = {
+                    "ApiKey": self.apikey
+                }
+
+                # Attributes which are defaulted client-side if not set
+                if record.aerial_enabled:
+                    output["Aerial"] = record.aerial or "Bing Satellite"
+                if record.road_enabled:
+                    output["Road"] = record.road or "Bing Roads"
+                if record.hybrid_enabled:
+                    output["Hybrid"] = record.hybrid or "Bing Hybrid"
+                return output
+        else:
+            return None
 
 # -----------------------------------------------------------------------------
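Each single-record layer class serialises its settings to a dict, which is then emitted as a `S3.gis.*` JavaScript assignment for inclusion in the HTML page. A minimal standalone sketch of that serialisation step (not Sahana code; the function and sample values are illustrative):

```python
import json

def as_javascript(js_array, layer_dict):
    # Serialise the layer dict and prefix it with the JS variable name,
    # mirroring how the layer classes build e.g. "S3.gis.Bing = {...}"
    return "%s = %s\n" % (js_array, json.dumps(layer_dict, sort_keys=True))

js = as_javascript("S3.gis.Bing", {"ApiKey": "XYZ", "Road": "Bing Roads"})
```

Client-side code can then read `S3.gis.Bing.ApiKey` directly once the snippet is included in the page.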
-class Google(OneLayer):
+class GoogleLayer(SingleRecordLayer):
     """
         Google Layers/Tools from Catalogue
     """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_google"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-        debug = current.session.s3.debug
-        if not record:
-            # There is only ever 1 layer
-            record = db(table.id > 0).select().first()
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.Google"
-        self.record = record
-        self._refresh()
-
-        if record:
-            if record.mapmaker_enabled or record.mapmakerhybrid_enabled:
-                # Need to use v2 API
-                # http://code.google.com/p/gmaps-api-issues/issues/detail?id=2349
-                self.scripts = ["http://maps.google.com/maps?file=api&v=2&key=%s" % self.apikey]
-            else:
-                # v3 API
-                self.scripts = ["http://maps.google.com/maps/api/js?v=3.2&sensor=false"]
-                if debug and record.streetview_enabled:
-                    self.scripts.append("scripts/gis/gxp/widgets/GoogleStreetViewPanel.js")
-            if record.earth_enabled:
-                self.scripts.append("http://www.google.com/jsapi?key=%s" % self.apikey)
-                self.scripts.append(SCRIPT("google && google.load('earth', '1');", _type="text/javascript"))
-                if debug:
-                    self.scripts.append("scripts/gis/gxp/widgets/GoogleEarthPanel.js")
-        else:
-            self.scripts = []
-
-    def as_dict(self):
-        gis = self.gis
-        T = current.T
-        record = self.record
-        apikey = self.apikey
-
-        if not apikey and (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
-            # Cannot display Google layers if we have no valid API Key
-            return None
-
-        config = gis.get_config()
-        if Projection(gis, id=config.projection_id).epsg != 900913:
-            # Cannot display Google layers unless we're using the
-            # Spherical Mercator Projection
-            if record.earth_enabled:
-                # But the Google Earth panel can still be enabled
-                output = {
-                    "Earth": str(T("Switch to 3D"))
-                }
-                return output
-            else:
-                return None
-
-        # Mandatory attributes
-        #"ApiKey": self.apikey
-        output = {
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if record.satellite_enabled:
-            output["Satellite"] = record.satellite or "Google Satellite"
-        if record.maps_enabled:
-            output["Maps"] = record.maps or "Google Maps"
-        if record.hybrid_enabled:
-            output["Hybrid"] = record.hybrid or "Google Hybrid"
-        if record.mapmaker_enabled:
-            output["MapMaker"] = record.mapmaker or "Google MapMaker"
-        if record.mapmakerhybrid_enabled:
-            output["MapMakerHybrid"] = record.mapmakerhybrid or "Google MapMaker Hybrid"
-        if record.earth_enabled:
-            output["Earth"] = str(T("Switch to 3D"))
-        if record.streetview_enabled and not (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
-            # Streetview doesn't work with v2 API
-            output["StreetviewButton"] = str(T("Click where you want to open Streetview"))
-            output["StreetviewTitle"] = str(T("Street View"))
-
-        return output
+    table_name = "gis_layer_google"
+    js_array = "S3.gis.Google"
+
+    def __init__(self, gis):
+        super(GoogleLayer, self).__init__(gis)
+        record = self.record
+        if record is not None:
+            debug = current.session.s3.debug
+            add_script = self.scripts.append
+            if record.mapmaker_enabled or record.mapmakerhybrid_enabled:
+                # Need to use v2 API
+                # http://code.google.com/p/gmaps-api-issues/issues/detail?id=2349
+                add_script("http://maps.google.com/maps?file=api&v=2&key=%s" % self.apikey)
+            else:
+                # v3 API
+                add_script("http://maps.google.com/maps/api/js?v=3.2&sensor=false")
+                if debug and record.streetview_enabled:
+                    add_script("scripts/gis/gxp/widgets/GoogleStreetViewPanel.js")
+            if record.earth_enabled:
+                add_script("http://www.google.com/jsapi?key=%s" % self.apikey)
+                add_script(SCRIPT("google && google.load('earth', '1');", _type="text/javascript"))
+                if debug:
+                    add_script("scripts/gis/gxp/widgets/GoogleEarthPanel.js")
+
+    def as_dict(self):
+        gis = self.gis
+        T = current.T
+        record = self.record
+        if record is not None:
+            config = gis.get_config()
+            if Projection(gis, id=config.projection_id).epsg != 900913:
+                if record.earth_enabled:
+                    # But the Google Earth panel can still be enabled
+                    return {
+                        "Earth": str(T("Switch to 3D"))
+                    }
+                else:
+                    raise Exception("Cannot display Google layers unless we're using the Spherical Mercator Projection")
+
+            # Mandatory attributes
+            #"ApiKey": self.apikey
+            output = {
+            }
+
+            # Attributes which are defaulted client-side if not set
+            if record.satellite_enabled:
+                output["Satellite"] = record.satellite or "Google Satellite"
+            if record.maps_enabled:
+                output["Maps"] = record.maps or "Google Maps"
+            if record.hybrid_enabled:
+                output["Hybrid"] = record.hybrid or "Google Hybrid"
+            if record.mapmaker_enabled:
+                output["MapMaker"] = record.mapmaker or "Google MapMaker"
+            if record.mapmakerhybrid_enabled:
+                output["MapMakerHybrid"] = record.mapmakerhybrid or "Google MapMaker Hybrid"
+            if record.earth_enabled:
+                output["Earth"] = str(T("Switch to 3D"))
+            if record.streetview_enabled and not (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
+                # Streetview doesn't work with v2 API
+                output["StreetviewButton"] = str(T("Click where you want to open Streetview"))
+                output["StreetviewTitle"] = str(T("Street View"))
+
+            return output
+        else:
+            return None
 
 # -----------------------------------------------------------------------------
-class Yahoo(OneLayer):
+class YahooLayer(SingleRecordLayer):
     """
         Yahoo Layers from Catalogue
 
         NB This will stop working on 13 September 2011
         http://developer.yahoo.com/blogs/ydn/posts/2011/06/yahoo-maps-apis-service-closure-announcement-new-maps-offerings-coming-soon/
     """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_yahoo"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-        if not record:
-            # There is only ever 1 layer
-            record = db(table.id > 0).select().first()
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.Yahoo"
-        self.record = record
-        self._refresh()
-        if record:
-            self.scripts = ["http://api.maps.yahoo.com/ajaxymap?v=3.8&appid=%s" % self.apikey]
-        else:
-            self.scripts = []
+    js_array = "S3.gis.Yahoo"
+    table_name = "gis_layer_yahoo"
+
+    def __init__(self, gis):
+        super(YahooLayer, self).__init__(gis)
+        if self.record:
+            self.scripts.append("http://api.maps.yahoo.com/ajaxymap?v=3.8&appid=%s" % self.apikey)
+        config = gis.get_config()
+        if Projection(gis, id=config.projection_id).epsg != 900913:
+            raise Exception("Cannot display Yahoo layers unless we're using the Spherical Mercator Projection")
 
     def as_dict(self):
-        gis = self.gis
         record = self.record
-        apikey = self.apikey
-
-        if not apikey:
-            # Cannot display Yahoo layers if we have no valid API Key
-            return None
-
-        config = gis.get_config()
-        if Projection(gis, id=config.projection_id).epsg != 900913:
-            # Cannot display Yahoo layers unless we're using the
-            # Spherical Mercator Projection
-            return None
-
-        # Mandatory attributes
-        #"ApiKey": self.apikey
-        output = {
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if record.satellite_enabled:
-            output["Satellite"] = record.satellite or "Yahoo Satellite"
-        if record.maps_enabled:
-            output["Maps"] = record.maps or "Yahoo Maps"
-        if record.hybrid_enabled:
-            output["Hybrid"] = record.hybrid or "Yahoo Hybrid"
-
-        return output
+        if record is not None:
+            # Mandatory attributes
+            #"ApiKey": self.apikey
+            output = {
+            }
+
+            # Attributes which are defaulted client-side if not set
+            if record.satellite_enabled:
+                output["Satellite"] = record.satellite or "Yahoo Satellite"
+            if record.maps_enabled:
+                output["Maps"] = record.maps or "Yahoo Maps"
+            if record.hybrid_enabled:
+                output["Hybrid"] = record.hybrid or "Yahoo Hybrid"
+
+            return output
+        else:
+            return None
+
+class MultiRecordLayer(Layer):
+    def __init__(self, gis):
+        super(MultiRecordLayer, self).__init__(gis)
+        self.sublayers = []
+        self.scripts = []
+
+        auth = gis.auth
+
+        layer_type_list = []
+        # Read the enabled Layers
+        for record in current.db(self.table.enabled == True).select():
+            # Check user is allowed to access the layer
+            role_required = record.role_required
+            if (not role_required) or auth.s3_has_role(role_required):
+                self.sublayers.append(self.SubLayer(gis, record))
+
+    def as_javascript(self):
+        """
+            Output the Layer as Javascript
+            - suitable for inclusion in the HTML page
+        """
+        sublayer_dicts = []
+        for sublayer in self.sublayers:
+            # Read the output dict for this sublayer
+            sublayer_dict = sublayer.as_dict()
+            if sublayer_dict:
+                # Add this layer to the list of layers for this layer type
+                sublayer_dicts.append(sublayer_dict)
+
+        if sublayer_dicts:
+            # Output the Layer Type as JSON
+            layer_type_json = json.dumps(sublayer_dicts,
+                                         sort_keys=True,
+                                         indent=4)
+            return "%s = %s\n" % (self.js_array, layer_type_json)
+        else:
+            return None
+
+    class SubLayer(object):
+        def __init__(self, gis, record):
+            # Ensure all attributes available (even if Null)
+            self.gis = gis
+            self.__dict__.update(record)
+            del record
+            self.safe_name = re.sub('[\\"]', "", self.name)
+
+            if hasattr(self, "marker_id"):
+                self.marker = Marker(gis, self.marker_id)
+            if hasattr(self, "projection_id"):
+                self.projection = Projection(gis, self.projection_id)
+
+        def setup_clustering(self, output):
+            gis = self.gis
+            cluster_distance = gis.cluster_distance
+            cluster_threshold = gis.cluster_threshold
+            if self.cluster_distance != cluster_distance:
+                output["cluster_distance"] = self.cluster_distance
+            if self.cluster_threshold != cluster_threshold:
+                output["cluster_threshold"] = self.cluster_threshold
+
+        def setup_visibility_and_opacity(self, output):
+            if not self.visible:
+                output["visibility"] = False
+            if self.opacity != 1:
+                output["opacity"] = "%.1f" % self.opacity
+
+        def add_attributes_if_not_default(self, output, **values_and_defaults):
+            # could also write values in debug mode, to check if defaults ignored.
+            # could also check values are not being overwritten.
+            for key, (value, defaults) in values_and_defaults.iteritems():
+                if value not in defaults:
+                    output[key] = value
+
+        #def set_marker(self):
+        #    " Set the Marker for the Layer "
+        #    gis = self.gis
+        #    self.marker = Marker(gis, self.marker_id)
+
+        #def set_projection(self):
+        #    " Set the Projection for the Layer "
+        #    gis = self.gis
+        #    self.projection = Projection(gis, self.projection_id)
 
 # -----------------------------------------------------------------------------
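The `SubLayer.add_attributes_if_not_default` helper introduced here only emits an attribute when its value differs from every listed default, keeping the generated JSON minimal. A standalone Python 3 sketch of the same logic (the module itself is Python 2, hence `iteritems` there):

```python
def add_attributes_if_not_default(output, **values_and_defaults):
    # Each keyword maps a key to (value, tuple_of_defaults); the key is
    # only written into output when the value is not one of the defaults.
    for key, (value, defaults) in values_and_defaults.items():
        if value not in defaults:
            output[key] = value

output = {}
add_attributes_if_not_default(
    output,
    waypoints=(False, (True,)),  # differs from default, so emitted
    tracks=(True, (True,)),      # equals default, so omitted
)
```

The client-side code supplies the defaults, so omitting matching values costs nothing but shrinks the page.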
-class FeatureLayer(Layer):
+class FeatureLayer(MultiRecordLayer):
     """ Feature Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_feature"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.layers_features"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
-    def as_dict(self):
-        gis = self.gis
-        cluster_distance = gis.cluster_distance
-        cluster_threshold = gis.cluster_threshold
-        record = self.record
-        marker = self.marker
-
-        auth = gis.auth
-        request = current.request
-        deployment_settings = gis.deployment_settings
-
-        if record.module not in deployment_settings.modules:
-            # Module is disabled
-            return
-        if not auth.permission(c=record.module, f=record.resource):
-            # User has no permission to this resource (in ACL)
-            return
-
-        name_safe = self._name_safe(record.name)
-
-        url = "%s.geojson?layer=%i" % (URL(record.module, record.resource),
-                                       record.id)
-        if record.filter:
-            url = "%s&%s" % (url, record.filter)
-
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": url,
-            "marker_image": marker.image,
-            "marker_height": marker.height,
-            "marker_width": marker.width,
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if not record.visible:
-            output["visibility"] = False
-        if record.opacity != 1:
-            output["opacity"] = "%.1f" % record.opacity
-        if record.cluster_distance != cluster_distance:
-            output["cluster_distance"] = record.cluster_distance
-        if record.cluster_threshold != cluster_threshold:
-            output["cluster_threshold"] = record.cluster_threshold
-
-        return output
+    table_name = "gis_layer_feature"
+    js_array = "S3.gis.layers_features"
+
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def __init__(self, gis, record):
+            record_module = record.module
+            if record_module is not None:
+                if record_module not in gis.deployment_settings.modules:
+                    raise Exception("%s module is disabled" % record_module)
+                if not gis.auth.permission(c=record.module, f=record.resource):
+                    raise Exception("User has no permission to this resource (in ACL)")
+            else:
+                raise Exception("FeatureLayer Record '%s' has no module" % record.name)
+            super(FeatureLayer.SubLayer, self).__init__(gis, record)
+
+        def as_dict(self):
+            gis = self.gis
+
+            request = current.request
+            deployment_settings = gis.deployment_settings
+
+            url = "%s.geojson?layer=%i" % (URL(self.module, self.resource),
+                                           self.id)
+            if self.filter:
+                url = "%s&%s" % (url, self.filter)
+
+            # Mandatory attributes
+            output = {
+                "name": self.safe_name,
+                "url": url,
+            }
+            self.marker.add_attributes_to_output(output)
+            self.setup_visibility_and_opacity(output)
+            self.setup_clustering(output)
+
+            return output
 
 # -----------------------------------------------------------------------------
-class GeoJSONLayer(Layer):
+class GeoJSONLayer(MultiRecordLayer):
     """ GeoJSON Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_geojson"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.layers_geojson"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
-    def as_dict(self):
-        gis = self.gis
-        cluster_distance = gis.cluster_distance
-        cluster_threshold = gis.cluster_threshold
-        record = self.record
-        marker = self.marker
-        projection = self.projection
-
-        name_safe = self._name_safe(record.name)
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": record.url,
-            "marker_image": marker.image,
-            "marker_height": marker.height,
-            "marker_width": marker.width,
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if projection.epsg != 4326:
-            output["projection"] = projection.epsg
-        if not record.visible:
-            output["visibility"] = False
-        if record.opacity != 1:
-            output["opacity"] = "%.1f" % record.opacity
-        if record.cluster_distance != cluster_distance:
-            output["cluster_distance"] = record.cluster_distance
-        if record.cluster_threshold != cluster_threshold:
-            output["cluster_threshold"] = record.cluster_threshold
-
-        return output
+    table_name = "gis_layer_geojson"
+    js_array = "S3.gis.layers_geojson"
+
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def as_dict(self):
+            # Mandatory attributes
+            output = {
+                "name": self.safe_name,
+                "url": self.url,
+            }
+            self.marker.add_attributes_to_output(output)
+
+            # Attributes which are defaulted client-side if not set
+            projection = self.projection
+            if projection.epsg != 4326:
+                output["projection"] = projection.epsg
+            self.setup_visibility_and_opacity(output)
+            self.setup_clustering(output)
+
+            return output
 
 # -----------------------------------------------------------------------------
-class GeoRSSLayer(Layer):
+class GeoRSSLayer(MultiRecordLayer):
     """ GeoRSS Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_georss"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        self.cachetable = db.gis_cache
-        self.js_array = "S3.gis.layers_georss"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
-    def as_dict(self):
-        gis = self.gis
-        cluster_distance = gis.cluster_distance
-        cluster_threshold = gis.cluster_threshold
-        record = self.record
-        marker = self.marker
-
-        db = current.db
-        request = current.request
-        public_url = gis.public_url
-        cachetable = self.cachetable
-
-        url = record.url
-        # Check to see if we should Download layer to the cache
-        download = True
-        query = (cachetable.source == url)
-        cached = db(query).select(cachetable.modified_on,
-                                  limitby=(0, 1)).first()
-        refresh = record.refresh or 900 # 15 minutes set if we have no data (legacy DB)
-        if cached:
-            modified_on = cached.modified_on
-            cutoff = modified_on + timedelta(seconds=refresh)
-            if request.utcnow < cutoff:
-                download = False
-        if download:
-            # Download layer to the Cache
-            # @ToDo: Call directly without going via HTTP
-            #        s3mgr = current.manager
-            # @ToDo: Make this async by using Celery (also use this for the refresh time)
-            fields = ""
-            if record.data:
-                fields = "&data_field=%s" % record.data
-            if record.image:
-                fields = "%s&image_field=%s" % (fields, record.image)
-            _url = "%s%s/update.georss?fetchurl=%s%s" % (public_url,
-                                                         URL(c="gis", f="cache_feed"),
-                                                         url,
-                                                         fields)
-            try:
-                # @ToDo: Need to commit to not have DB locked with SQLite?
-                fetch(_url)
-                if cached:
-                    # Clear old records which are no longer active
-                    query = (cachetable.source == url) & \
-                            (cachetable.modified_on < cutoff)
-                    db(query).delete()
-            except:
-                # Feed down
-                if cached:
-                    # Use cached copy
-                    # Should we Update timestamp to prevent every
-                    # subsequent request attempting the download?
-                    #query = (cachetable.source == url)
-                    #db(query).update(modified_on=request.utcnow)
-                    pass
-                else:
-                    # No cached copy available - skip layer
-                    return
-
-        name_safe = self._name_safe(record.name)
-
-        # Pass the GeoJSON URL to the client
-        # Filter to the source of this feed
-        url = "%s.geojson?cache.source=%s" % (URL(c="gis", f="cache_feed"),
-                                              url)
-
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": url,
-            "marker_image": marker.image,
-            "marker_height": marker.height,
-            "marker_width": marker.width,
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if record.refresh != 900:
-            output["refresh"] = record.refresh
-        if not record.visible:
-            output["visibility"] = False
-        if record.opacity != 1:
-            output["opacity"] = "%.1f" % record.opacity
-        if record.cluster_distance != cluster_distance:
-            output["cluster_distance"] = record.cluster_distance
-        if record.cluster_threshold != cluster_threshold:
-            output["cluster_threshold"] = record.cluster_threshold
-
-        return output
+    table_name = "gis_layer_georss"
+    js_array = "S3.gis.layers_georss"
+
+    def __init__(self, gis):
+        super(GeoRSSLayer, self).__init__(gis)
+        GeoRSSLayer.SubLayer.cachetable = current.db.gis_cache
+
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def as_dict(self):
+            gis = self.gis
+
+            db = current.db
+            request = current.request
+            public_url = gis.public_url
+            cachetable = self.cachetable
+
+            url = self.url
+            # Check to see if we should Download layer to the cache
+            download = True
+            query = (cachetable.source == url)
+            existing_cached_copy = db(query).select(cachetable.modified_on,
+                                                    limitby=(0, 1)).first()
+            refresh = self.refresh or 900 # 15 minutes set if we have no data (legacy DB)
+            if existing_cached_copy:
+                modified_on = existing_cached_copy.modified_on
+                cutoff = modified_on + timedelta(seconds=refresh)
+                if request.utcnow < cutoff:
+                    download = False
+            if download:
+                # Download layer to the Cache
+                # @ToDo: Call directly without going via HTTP
+                #        s3mgr = current.manager
+                # @ToDo: Make this async by using Celery (also use this for the refresh time)
+                fields = ""
+                if self.data:
+                    fields = "&data_field=%s" % self.data
+                if self.image:
+                    fields = "%s&image_field=%s" % (fields, self.image)
+                _url = "%s%s/update.georss?fetchurl=%s%s" % (public_url,
+                                                             URL(c="gis", f="cache_feed"),
+                                                             url,
+                                                             fields)
+                try:
+                    # @ToDo: Need to commit to not have DB locked with SQLite?
+                    fetch(_url)
+                    if existing_cached_copy:
+                        # Clear old records which are no longer active
+                        query = (cachetable.source == url) & \
+                                (cachetable.modified_on < cutoff)
+                        db(query).delete()
+                except:
+                    # Feed down
+                    if existing_cached_copy:
+                        # Use cached copy
+                        # Should we Update timestamp to prevent every
+                        # subsequent request attempting the download?
+                        #query = (cachetable.source == url)
+                        #db(query).update(modified_on=request.utcnow)
+                        pass
+                    else:
+                        raise Exception("No cached copy available - skip layer")
+
+            name_safe = self.safe_name
+
+            # Pass the GeoJSON URL to the client
+            # Filter to the source of this feed
+            url = "%s.geojson?cache.source=%s" % (URL(c="gis", f="cache_feed"),
+                                                  url)
+
+            # Mandatory attributes
+            output = {
+                "name": name_safe,
+                "url": url,
+            }
+            self.marker.add_attributes_to_output(output)
+
+            # Attributes which are defaulted client-side if not set
+            if self.refresh != 900:
+                output["refresh"] = self.refresh
+            self.setup_visibility_and_opacity(output)
+            self.setup_clustering(output)
+
+            return output
 
 # -----------------------------------------------------------------------------
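The cache-freshness test used by the GeoRSS (and KML) layers re-downloads a feed only once `refresh` seconds have elapsed since the cached copy was last modified. A standalone sketch of that cutoff logic (timestamps are illustrative):

```python
from datetime import datetime, timedelta

def should_download(now, modified_on, refresh=900):
    # A cached feed is stale once `refresh` seconds (default 15 minutes)
    # have passed since it was last written; mirrors the
    # "if request.utcnow < cutoff: download = False" check above.
    cutoff = modified_on + timedelta(seconds=refresh)
    return now >= cutoff

modified = datetime(2011, 9, 1, 12, 0, 0)
fresh = should_download(datetime(2011, 9, 1, 12, 5, 0), modified)   # within 15 min
stale = should_download(datetime(2011, 9, 1, 12, 30, 0), modified)  # past cutoff
```

Setting `refresh` per record lets slow-moving feeds be fetched far less often than the 15-minute default.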
-class GPXLayer(Layer):
+class GPXLayer(MultiRecordLayer):
     """ GPX Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_gpx"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.layers_gpx"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
-    def as_dict(self):
-        gis = self.gis
-        cluster_distance = gis.cluster_distance
-        cluster_threshold = gis.cluster_threshold
-        record = self.record
-        marker = self.marker
-
-        name_safe = self._name_safe(record.name)
-
-        request = current.request
-        url = URL(c="default", f="download",
-                  args=record.track)
-
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": url,
-            "marker_image": marker.image,
-            "marker_height": marker.height,
-            "marker_width": marker.width,
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if not record.waypoints:
-            output["waypoints"] = False
-        if not record.tracks:
-            output["tracks"] = False
-        if not record.routes:
-            output["routes"] = False
-        if not record.visible:
-            output["visibility"] = False
-        if record.opacity != 1:
-            output["opacity"] = "%.1f" % record.opacity
-        if record.cluster_distance != cluster_distance:
-            output["cluster_distance"] = record.cluster_distance
-        if record.cluster_threshold != cluster_threshold:
-            output["cluster_threshold"] = record.cluster_threshold
-
-        return output
-
-    def refresh(self):
-        " Refresh the attributes of the Layer "
-        gis = self.gis
-        request = current.request
-        record = self.record
-        table = self.table
-        if "marker_id" in table:
-            self.set_marker()
-        if "projection_id" in table:
-            self.set_projection()
-        if record:
-            self.url = "%s/%s" % (URL(c="default", f="download"),
-                                  record.track)
-        else:
-            self.url = None
+    table_name = "gis_layer_gpx"
+    js_array = "S3.gis.layers_gpx"
+
+    def __init__(self, gis):
+        super(GPXLayer, self).__init__(gis)
+
+#        if record:
+#            self.url = "%s/%s" % (URL(c="default", f="download"),
+#                                  record.track)
+#        else:
+#            self.url = None
+
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def as_dict(self):
+            gis = self.gis
+            request = current.request
+
+            url = URL(c="default", f="download",
+                      args=self.track)
+
+            # Mandatory attributes
+            output = {
+                "name": self.safe_name,
+                "url": url,
+            }
+            self.marker.add_attributes_to_output(output)
+            self.add_attributes_if_not_default(
+                output,
+                waypoints = (self.waypoints, (True,)),
+                tracks = (self.tracks, (True,)),
+                routes = (self.routes, (True,)),
+            )
+            self.setup_visibility_and_opacity(output)
+            self.setup_clustering(output)
+            return output
 
 # -----------------------------------------------------------------------------
-class KMLLayer(Layer):
+class KMLLayer(MultiRecordLayer):
     """ KML Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_kml"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        # @ToDo: Migrate to gis_cache
-        self.cachetable = db.gis_cache2
-        self.js_array = "S3.gis.layers_kml"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
+    table_name = "gis_layer_kml"
+    js_array = "S3.gis.layers_kml"
+
+    def __init__(self, gis):
+        super(KMLLayer, self).__init__(gis)
+
+        # Set up the KML cache, should be done once per request
         # Can we cache downloaded KML feeds?
         # Needed for unzipping & filtering as well
         # @ToDo: Should we move this folder to static to speed up access to cached content?
@@ -4676,263 +4521,172 @@
         else:
             try:
                 os.mkdir(cachepath)
+            except OSError, os_error:
+                self.gis.debug(
+                    "GIS: KML layers cannot be cached: %s %s" % (
+                        cachepath,
+                        os_error
+                    )
+                )
+                cacheable = False
+            else:
                 cacheable = True
-            except:
-                cacheable = False
-        self.cacheable = cacheable
-        self.cachepath = cachepath
+        # @ToDo: Migrate to gis_cache
+        KMLLayer.cachetable = current.db.gis_cache2
+        KMLLayer.cacheable = cacheable
+        KMLLayer.cachepath = cachepath
 
-    def as_dict(self):
-        gis = self.gis
-        cluster_distance = gis.cluster_distance
-        cluster_threshold = gis.cluster_threshold
-        record = self.record
-        marker = self.marker
-
-        T = current.T
-        db = current.db
-        request = current.request
-        response = current.response
-        public_url = gis.public_url
-        cacheable = self.cacheable
-        cachepath = self.cachepath
-        cachetable = self.cachetable
-
-        name = record.name
-        name_safe = self._name_safe(record.name)
-
-        if cacheable:
-            _name = urllib2.quote(name)
-            _name = _name.replace("%", "_")
-            filename = "%s.file.%s.kml" % (cachetable._tablename,
-                                           _name)
-
-            # Should we download a fresh copy of the source file?
-            download = True
-            query = (cachetable.name == name)
-            cached = db(query).select(cachetable.modified_on,
-                                      limitby=(0, 1)).first()
-            refresh = record.refresh or 900 # 15 minutes set if we have no data (legacy DB)
-            if cached:
-                modified_on = cached.modified_on
-                cutoff = modified_on + timedelta(seconds=refresh)
-                if request.utcnow < cutoff:
-                    download = False
-
-            if download:
-                # Download file
-                if response.s3.tasks_active():
-                    # Async call
-                    db.task_scheduled.insert(name="download_kml_%s" % uuid.uuid4(),
-                                             func="download_kml",
-                                             args=json.dumps([record.id,
-                                                              filename]))
-                else:
-                    # Sync call
-                    gis.download_kml(record.id, filename)
-                    if cached:
-                        db(query).update(modified_on=request.utcnow)
-                    else:
-                        cachetable.insert(name=name, file=filename)
-
-            url = URL(c="default", f="download",
-                      args=[filename])
-        else:
-            # No caching possible (e.g. GAE), display file direct from remote (using Proxy)
-            # (Requires OpenLayers.Layer.KML to be available)
-            url = record.url
-
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": url,
-            "marker_image": marker.image,
-            "marker_height": marker.height,
-            "marker_width": marker.width,
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if record.title and record.title != "name":
-            output["title"] = record.title
-        if record.body and record.body != "description":
-            output["body"] = record.body
-        if record.refresh != 900:
-            output["refresh"] = record.refresh
-        if not record.visible:
-            output["visibility"] = False
-        if record.opacity != 1:
-            output["opacity"] = "%.1f" % record.opacity
-        if record.cluster_distance != cluster_distance:
-            output["cluster_distance"] = record.cluster_distance
-        if record.cluster_threshold != cluster_threshold:
-            output["cluster_threshold"] = record.cluster_threshold
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def as_dict(self):
+            gis = self.gis
+
+            T = current.T
+            db = current.db
+            request = current.request
+            response = current.response
+            public_url = gis.public_url
+
+            cachetable = KMLLayer.cachetable
+            cacheable = KMLLayer.cacheable
+            cachepath = KMLLayer.cachepath
+
+            name = self.name
+            if cacheable:
+                _name = urllib2.quote(name)
+                _name = _name.replace("%", "_")
+                filename = "%s.file.%s.kml" % (cachetable._tablename,
+                                               _name)
+
+                # Should we download a fresh copy of the source file?
+                download = True
+                query = (cachetable.name == name)
+                cached = db(query).select(cachetable.modified_on,
+                                          limitby=(0, 1)).first()
+                refresh = self.refresh or 900 # 15 minutes set if we have no data (legacy DB)
+                if cached:
+                    modified_on = cached.modified_on
+                    cutoff = modified_on + timedelta(seconds=refresh)
+                    if request.utcnow < cutoff:
+                        download = False
+
+                if download:
+                    # Download file
+                    if response.s3.tasks_active():
+                        # Async call
+                        db.task_scheduled.insert(name="download_kml_%s" % uuid.uuid4(),
+                                                 func="download_kml",
+                                                 args=json.dumps([self.id,
+                                                                  filename]))
+                    else:
+                        # Sync call
+                        gis.download_kml(self.id, filename)
+                        if cached:
+                            db(query).update(modified_on=request.utcnow)
+                        else:
+                            cachetable.insert(name=name, file=filename)
+
+                url = URL(r=request, c="default", f="download",
+                          args=[filename])
+            else:
+                # No caching possible (e.g. GAE), display file direct from remote (using Proxy)
+                # (Requires OpenLayers.Layer.KML to be available)
+                url = self.url
+
+            output = dict(
+                name = self.safe_name,
+                url = url,
+            )
+            self.add_attributes_if_not_default(
+                output,
+                title = (self.title, ("name", None, "")),
47694603 body = (self.body, ("description", None)),
4770 return output4604 refresh = (self.refresh, (900,)),
4605 )
4606 self.setup_visibility_and_opacity(output)
4607 self.setup_clustering(output)
4608 self.marker.add_attributes_to_output(output)
4609 return output
47714610
 # -----------------------------------------------------------------------------
-class TMSLayer(Layer):
-    """ TMS Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_tms"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.layers_tms"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
-    def as_dict(self):
-        #gis = self.gis
-        record = self.record
-
-        name_safe = self._name_safe(record.name)
-
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": record.url,
-            "layername": record.layername
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if record.url2:
-            output["url2"] = record.url2
-        if record.url3:
-            output["url3"] = record.url3
-        if record.img_format != "png":
-            output["format"] = record.img_format
-        if record.zoom_levels != 9:
-            output["zoomLevels"] = record.zoom_levels
-        if record.attribution:
-            output["attribution"] = record.attribution
-
-        return output
+class TMSLayer(MultiRecordLayer):
+    """ TMS Layer from Catalogue """
+    table_name = "gis_layer_tms"
+    js_array = "S3.gis.layers_tms"
+
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def as_dict(self):
+            output = {
+                "name": self.safe_name,
+                "url": self.url,
+                "layername": self.layername
+            }
+            self.add_attributes_if_not_default(
+                output,
+                url2 = (self.url2, (None,)),
+                url3 = (self.url3, (None,)),
+                format = (self.img_format, ("png", None)),
+                zoomLevels = (self.zoom_levels, (9,)),
+                attribution = (self.attribution, (None,)),
+            )
+            return output
 
 # -----------------------------------------------------------------------------
-class WFSLayer(Layer):
-    """ WFS Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_wfs"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.layers_wfs"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
-    def as_dict(self):
-        gis = self.gis
-        cluster_distance = gis.cluster_distance
-        cluster_threshold = gis.cluster_threshold
-        record = self.record
-        projection = self.projection
-
-        name_safe = self._name_safe(record.name)
-
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": record.url,
-            "title": record.title,
-            "featureType": record.featureType,
-            "featureNS": record.featureNS,
-            "schema": record.wfs_schema,
-            #editable
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if record.version != "1.1.0":
-            output["version"] = record.version
-        if record.geometryName != "the_geom":
-            output["geometryName"] = record.geometryName
-        if record.style_field:
-            output["styleField"] = record.style_field
-        if record.style_values and record.style_values != "{}":
-            output["styleValues"] = record.style_values
-        if projection.epsg != 4326:
-            output["projection"] = projection.epsg
-        if not record.visible:
-            output["visibility"] = False
-        if record.opacity != 1:
-            output["opacity"] = "%.1f" % record.opacity
-        if record.cluster_distance != cluster_distance:
-            output["cluster_distance"] = record.cluster_distance
-        if record.cluster_threshold != cluster_threshold:
-            output["cluster_threshold"] = record.cluster_threshold
-
-        return output
+class WFSLayer(MultiRecordLayer):
+    """ WFS Layer from Catalogue """
+    table_name = "gis_layer_wfs"
+    js_array = "S3.gis.layers_wfs"
+
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def as_dict(self):
+            output = dict(
+                name = self.safe_name,
+                url = self.url,
+                title = self.title,
+                featureType = self.featureType,
+                featureNS = self.featureNS,
+                schema = self.wfs_schema,
+            )
+            self.add_attributes_if_not_default(
+                output,
+                version = (self.version, ("1.1.0",)),
+                geometryName = (self.geometryName, ("the_geom",)),
+                styleField = (self.style_field, (None,)),
+                styleValues = (self.style_values, ("{}", None)),
+                projection = (self.projection.epsg, (4326,)),
+                #editable
+            )
+            self.setup_visibility_and_opacity(output)
+            self.setup_clustering(output)
+            return output
 
 # -----------------------------------------------------------------------------
-class WMSLayer(Layer):
-    """ WMS Layer from Catalogue """
-    def __init__(self, gis, record=None):
-        db = current.db
-        tablename = "gis_layer_wms"
-        try:
-            table = db[tablename]
-        except:
-            current.manager.load(tablename)
-            table = db[tablename]
-
-        self.gis = gis
-        self.table = table
-        self.js_array = "S3.gis.layers_wms"
-        self.record = record
-        self._refresh()
-        self.scripts = []
-
-    def as_dict(self):
-        #gis = self.gis
-        record = self.record
-
-        name_safe = self._name_safe(record.name)
-
-        # Mandatory attributes
-        output = {
-            "name": name_safe,
-            "url": record.url,
-            "layers": record.layers
-        }
-
-        # Attributes which are defaulted client-side if not set
-        if not record.visible:
-            output["visibility"] = False
-        if record.opacity != 1:
-            output["opacity"] = "%.1f" % record.opacity
-        if not record.transparent:
-            output["transparent"] = False
-        if record.version != "1.1.1":
-            output["version"] = record.version
-        if record.img_format != "image/png":
-            output["format"] = record.img_format
-        if record.map:
-            output["map"] = record.map
-        if record.style:
-            output["style"] = record.style
-        if record.bgcolor:
-            output["bgcolor"] = record.bgcolor
-        if record.tiled:
-            output["tiled"] = True
-        if record.buffer:
-            output["buffer"] = record.buffer
-        if record.base:
-            output["base"] = True
-
-        return output
+class WMSLayer(MultiRecordLayer):
+    """ WMS Layer from Catalogue """
+    js_array = "S3.gis.layers_wms"
+    table_name = "gis_layer_wms"
+
+    class SubLayer(MultiRecordLayer.SubLayer):
+        def as_dict(self):
+            output = dict(
+                name = self.safe_name,
+                url = self.url,
+                layers = self.layers
+            )
+            self.add_attributes_if_not_default(
+                output,
+                transparent = (self.transparent, (True,)),
+                version = (self.version, ("1.1.1",)),
+                format = (self.img_format, ("image/png",)),
+                map = (self.map, (None,)),
+                buffer = (self.buffer, (0,)),
+                base = (self.base, (False,)),
+                style = (self.style, (None,)),
+                bgcolor = (self.bgcolor, (None,)),
+                tiled = (self.tiled, (False, )),
+            )
+            self.setup_visibility_and_opacity(output)
+            return output
 
 # =============================================================================
 class S3MAP(S3Method):
@@ -5041,4 +4795,3 @@
         return page
 
 # END =========================================================================
-
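The refactored `SubLayer` classes above all funnel their optional settings through an `add_attributes_if_not_default` helper: each keyword maps a value to a tuple of client-side defaults, and the value is only serialised when it differs from every default. A minimal sketch of that pattern (the real helper lives in modules/s3/s3gis.py; the layer values here are illustrative):

```python
def add_attributes_if_not_default(output, **attributes):
    # Each keyword argument is (value, tuple_of_defaults); copy the value
    # into the output dict only when it is not one of the defaults, so the
    # generated S3.gis.layers_* JSON stays small.
    for name, (value, defaults) in attributes.items():
        if value not in defaults:
            output[name] = value

# Mandatory attributes first, then the conditional ones
output = {"name": "Nepal", "url": "http://example.com/x.kml"}
add_attributes_if_not_default(
    output,
    title = ("name", ("name", None, "")),  # matches a default -> omitted
    refresh = (300, (900,)),               # non-default -> included
)
```

After this call `output` carries `refresh` but no `title`, mirroring the old per-attribute `if record.refresh != 900:` checks in a single declarative call.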
=== added file 'modules/test_utils/AddedRole.py'
--- modules/test_utils/AddedRole.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/AddedRole.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,25 @@
+
+class AddedRole(object):
+    """Adds a role and removes it at the end of a test no matter what happens.
+
+    """
+    def __init__(self, session, role):
+        self.role = role
+        self.session = session
+
+    def __enter__(self):
+        roles = self.session.s3.roles
+        role = self.role
+        if not role in roles:
+            roles.append(role)
+
+    def __exit__(self, type, value, traceback):
+        session_s3_roles = self.session.s3.roles
+        roles = list(session_s3_roles)
+        for i in range(len(roles)):
+            session_s3_roles.pop(0)
+        add_role = session_s3_roles.append
+        added_role = self.role
+        # put back every role except the one this context manager added
+        for role in roles:
+            if role is not added_role:
+                add_role(role)
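The idea behind `AddedRole` is that cleanup in `__exit__` runs even when the test body raises. The same pattern in miniature, with a plain list standing in for `session.s3.roles` (a hedged Python 3 sketch, not the module above):

```python
class AddedRole:
    """Append a role for the duration of a test and remove it afterwards,
    even if the test fails with an exception."""
    def __init__(self, roles, role):
        self.roles = roles
        self.role = role

    def __enter__(self):
        if self.role not in self.roles:
            self.roles.append(self.role)

    def __exit__(self, type, value, traceback):
        # runs on success and on exception alike
        if self.role in self.roles:
            self.roles.remove(self.role)

roles = [1, 2]
try:
    with AddedRole(roles, 99):
        assert 99 in roles
        raise RuntimeError("simulated test failure")
except RuntimeError:
    pass
```

After the `with` block, `roles` is back to `[1, 2]` despite the exception.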
=== added file 'modules/test_utils/Change.py'
--- modules/test_utils/Change.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/Change.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,25 @@
+
+_UNDEFINED = object()
+
+class Change(object):
+    def __init__(self, target, changes):
+        self.changes = changes
+        self.target = target
+
+    def __enter__(self):
+        assert not hasattr(self, "originals")
+        self.originals = originals = {}
+        # store originals and set new values
+        for name, value in self.changes.iteritems():
+            originals[name] = getattr(self.target, name, _UNDEFINED)
+            setattr(self.target, name, value)
+
+    def __exit__(self, type, value, traceback):
+        # restore originals
+        for name, value in self.originals.iteritems():
+            if value is _UNDEFINED:
+                delattr(self.target, name)
+            else:
+                setattr(self.target, name, value)
+        del self.originals
+
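`Change` is a monkey-patching helper: it overrides attributes on entry and restores them (or deletes attributes that did not exist before) on exit. A self-contained Python 3 sketch of the same pattern, with an illustrative `Config` target:

```python
_UNDEFINED = object()  # sentinel: attribute was absent before the change

class Change:
    """Temporarily override attributes on a target, restoring them on exit."""
    def __init__(self, target, changes):
        self.target = target
        self.changes = changes

    def __enter__(self):
        self.originals = {}
        for name, value in self.changes.items():
            self.originals[name] = getattr(self.target, name, _UNDEFINED)
            setattr(self.target, name, value)

    def __exit__(self, type, value, traceback):
        for name, original in self.originals.items():
            if original is _UNDEFINED:
                delattr(self.target, name)   # attribute did not exist before
            else:
                setattr(self.target, name, original)

class Config(object):
    timeout = 5

with Change(Config, {"timeout": 1, "retries": 3}):
    assert Config.timeout == 1 and Config.retries == 3
```

Outside the block, `Config.timeout` is back to 5 and `Config.retries` is gone again.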
=== added file 'modules/test_utils/ExpectSessionWarning.py'
--- modules/test_utils/ExpectSessionWarning.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/ExpectSessionWarning.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,14 @@
+
+class ExpectSessionWarning(object):
+    def __init__(self, session, warning):
+        self.warning = warning
+        self.session = session
+
+    def __enter__(self):
+        session = self.session
+        warnings = []
+        self.warnings = session.warning = warnings
+
+    def __exit__(self, type, value, traceback):
+        if type is None:
+            assert self.warning in self.warnings
=== added file 'modules/test_utils/ExpectedException.py'
--- modules/test_utils/ExpectedException.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/ExpectedException.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,13 @@
+
+class ExpectedException(object):
+    def __init__(self, ExceptionClass):
+        self.ExceptionClass = ExceptionClass
+
+    def __enter__(self):
+        pass
+
+    def __exit__(self, type, value, traceback):
+        assert type is not None and issubclass(type, self.ExceptionClass), (
+            "%s not raised" % self.ExceptionClass.__name__
+        )
+        return True
+
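`ExpectedException` works like `unittest`'s `assertRaises` context manager: `__exit__` must fail when no exception (or the wrong one) was raised, and must return a true value to swallow the expected one. A standalone Python 3 sketch of that contract:

```python
class ExpectedException:
    """Fail unless the with-block raises the given exception type."""
    def __init__(self, ExceptionClass):
        self.ExceptionClass = ExceptionClass

    def __enter__(self):
        return self

    def __exit__(self, type, value, traceback):
        # type is None when the block completed without raising
        assert type is not None and issubclass(type, self.ExceptionClass), (
            "%s not raised" % self.ExceptionClass.__name__
        )
        return True  # swallow the expected exception

with ExpectedException(ZeroDivisionError):
    1 / 0  # raises as expected, so the context exits cleanly

raised = False
try:
    with ExpectedException(ValueError):
        pass  # nothing raised -> __exit__ should complain
except AssertionError:
    raised = True
```

Returning `True` from `__exit__` is what suppresses the expected exception; forgetting the `None` check would crash `issubclass` when the block succeeds.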
=== added file 'modules/test_utils/InsertedRecord.py'
--- modules/test_utils/InsertedRecord.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/InsertedRecord.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,19 @@
+
+from clear_table import clear_table
+
+class InsertedRecord(object):
+    """Inserts and commits a record and removes it at the end of
+    a test no matter what happens.
+
+    """
+    def __init__(self, db, table, data):
+        self.db = db
+        self.table = table
+        self.data = data
+
+    def __enter__(self):
+        self.table.insert(**self.data)
+        self.db.commit()
+
+    def __exit__(self, type, value, traceback):
+        clear_table(self.db, self.table)
=== added file 'modules/test_utils/Web2pyNosePlugin.py'
--- modules/test_utils/Web2pyNosePlugin.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/Web2pyNosePlugin.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,106 @@
+
+import nose
+import re
+import inspect
+import unittest
+from itertools import imap
+
+class Web2pyNosePlugin(nose.plugins.base.Plugin):
+    # see: http://somethingaboutorange.com/mrl/projects/nose/0.11.1/plugins/writing.html
+
+    """This plugin is designed to give the web2py environment to the tests.
+    """
+    score = 0
+    # always enabled, as this plugin can only
+    # be selected by running this script
+    enabled = True
+
+    def __init__(
+        self,
+        application_name,
+        environment,
+        directory_pattern,
+        test_folders
+    ):
+        super(Web2pyNosePlugin, self).__init__()
+        self.application_name = application_name
+        self.environment = environment
+        self.directory_pattern = directory_pattern
+        self.test_folders = test_folders
+
+    def options(self, parser, env):
+        """Register command line options"""
+        pass
+
+    def wantDirectory(self, dirname):
+        return bool(re.search(self.directory_pattern, dirname))
+
+    def wantFile(self, file_name):
+        return file_name.endswith(".py") and any(
+            imap(file_name.__contains__, self.test_folders)
+        )
+
+    def wantModule(self, module):
+        return False
+
+    def loadTestsFromName(self, file_name, discovered):
+        """Sets up the unit-testing environment.
+
+        This involves loading modules as if by web2py.
+        Also we must have a test database.
+
+        If testing controllers, tests need to set up the request themselves.
+
+        """
+        if not file_name.endswith(".py"):
+            return []
+
+        # Is it possible that the module could load
+        # other code that is using the original db?
+
+        test_globals = self.environment
+
+        module_globals = dict(self.environment)
+        # execfile is used because it doesn't create a module
+        # or load the module from sys.modules if it exists.
+        execfile(file_name, module_globals)
+
+        # we have to return something, otherwise nose
+        # will let others have a go, and they won't pass
+        # in the web2py environment, so we'll get errors
+        tests = []
+
+        for name, thing in module_globals.iteritems():
+            if (
+                # don't bother with globally imported things
+                name not in test_globals
+                # unless they have been overridden
+                or test_globals[name] is not thing
+            ):
+                if (
+                    isinstance(thing, type)
+                    and issubclass(thing, unittest.TestCase)
+                ):
+                    # look for test methods
+                    for member_name in dir(thing):
+                        if member_name.startswith("test"):
+                            if callable(getattr(thing, member_name)):
+                                tests.append(thing(member_name))
+                elif (
+                    name.startswith("test")
+                    or name.startswith("Test")
+                ):
+                    if inspect.isfunction(thing):
+                        # things coming from execfile have no module
+                        if thing.__module__ in ("__main__", None):
+                            tests.append(
+                                nose.case.FunctionTestCase(thing)
+                            )
+        return tests
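The core of `loadTestsFromName` is: execute a test file into a copy of the web2py environment (rather than importing it), then harvest `TestCase` subclasses and top-level `test_*` functions from the resulting namespace. A self-contained Python 3 sketch of that discovery step, using `exec` in place of `execfile` and an inline source string as a stand-in for a real test file:

```python
import types
import unittest

# Illustrative stand-in for the source of a file under tests/unit_tests/
source = '''
def test_addition():
    assert 1 + 1 == 2

class TestStrings(unittest.TestCase):
    def test_upper(self):
        self.assertEqual("kml".upper(), "KML")
'''

# Like execfile(): run the code in a fresh namespace without touching
# sys.modules, seeding it with the (here: minimal) environment.
module_globals = {"unittest": unittest, "__name__": "exec_module"}
exec(source, module_globals)

# Harvest TestCase methods and top-level test_* functions.
tests = []
for name, thing in module_globals.items():
    if isinstance(thing, type) and issubclass(thing, unittest.TestCase):
        for member_name in dir(thing):
            if member_name.startswith("test") and callable(getattr(thing, member_name)):
                tests.append(thing(member_name))
    elif name.startswith("test") and isinstance(thing, types.FunctionType):
        tests.append(unittest.FunctionTestCase(thing))

result = unittest.TestResult()
unittest.TestSuite(tests).run(result)
```

Both the standalone function and the `TestCase` method are found and run, which is why the plugin can feed ordinary-looking test files to nose without importing them as modules.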
=== modified file 'modules/test_utils/__init__.py'
--- modules/test_utils/__init__.py 2011-08-11 19:25:52 +0000
+++ modules/test_utils/__init__.py 2011-09-05 13:55:13 +0000
@@ -1,2 +1,12 @@
 
-from compare_lines import compare_lines
\ No newline at end of file
+from compare_lines import compare_lines
+from clear_table import clear_table
+from find_JSON_format_data_structure import *
+from Web2pyNosePlugin import Web2pyNosePlugin
+from assert_equal import *
+
+from InsertedRecord import *
+from AddedRole import *
+from ExpectedException import *
+from Change import *
+from ExpectSessionWarning import ExpectSessionWarning
=== added file 'modules/test_utils/assert_equal.py'
--- modules/test_utils/assert_equal.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/assert_equal.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,60 @@
+
+
+def assert_same_type(expected, actual):
+    assert isinstance(actual, type(expected)), "%s vs. %s" % (type(expected), type(actual))
+
+def assert_equal_sequence(expected, actual):
+    assert len(expected) == len(actual), "length should be %i, not %i:\n%s" % (
+        len(expected), len(actual), actual
+    )
+    for i in range(len(expected)):
+        try:
+            assert_equal(expected[i], actual[i])
+        except AssertionError, assertion_error:
+            raise AssertionError(
+                "[%i] %s" % (i, str(assertion_error))
+            )
+
+def assert_equal_set(expected, actual):
+    missing = expected.difference(actual)
+    assert not missing, "Missing: %s" % ", ".join(missing)
+
+    extra = actual.difference(expected)
+    assert not extra, "Extra: %s" % ", ".join(extra)
+
+def assert_equal_dict(expected, actual):
+    assert_equal_set(
+        expected = set(expected.keys()),
+        actual = set(actual.keys())
+    )
+    for key in expected.iterkeys():
+        try:
+            assert_equal(expected[key], actual[key])
+        except AssertionError, assertion_error:
+            raise AssertionError(
+                "[%s] %s" % (
+                    key,
+                    str(assertion_error),
+                )
+            )
+
+def assert_equal_value(expected, actual):
+    assert expected == actual, "%s != %s" % (expected, actual)
+
+_compare_procs = {
+    list: assert_equal_sequence,
+    int: assert_equal_value,
+    float: assert_equal_value,
+    str: assert_equal_value,
+    unicode: assert_equal_value,
+    dict: assert_equal_dict,
+    set: assert_equal_set,
+}
+
+def assert_equal(expected, actual):
+    assert_same_type(expected, actual)
+    compare_proc = _compare_procs.get(type(expected), assert_equal_value)
+    compare_proc(
+        expected,
+        actual
+    )
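The point of `assert_equal` over a bare `==` is that the nested comparison re-raises with the key or index where the mismatch occurred, so a failure inside a deep layer dict is easy to locate. A compressed Python 3 sketch of the same idea, threading the path through one recursive function instead of a dispatch table:

```python
def assert_equal(expected, actual, path=""):
    """Recursive equality check that reports where a mismatch occurred."""
    assert type(expected) == type(actual), "%s: %s vs. %s" % (
        path, type(expected), type(actual)
    )
    if isinstance(expected, dict):
        assert set(expected) == set(actual), "%s: keys differ" % path
        for key in expected:
            assert_equal(expected[key], actual[key], "%s[%r]" % (path, key))
    elif isinstance(expected, list):
        assert len(expected) == len(actual), "%s: length %i != %i" % (
            path, len(expected), len(actual)
        )
        for i, (e, a) in enumerate(zip(expected, actual)):
            assert_equal(e, a, "%s[%i]" % (path, i))
    else:
        assert expected == actual, "%s: %r != %r" % (path, expected, actual)

assert_equal({"a": [1, 2]}, {"a": [1, 2]})  # matches: no error

message = None
try:
    assert_equal({"a": [1, 2]}, {"a": [1, 3]})
except AssertionError as e:
    message = str(e)
```

The failure message pinpoints the mismatch as `['a'][1]: 2 != 3` rather than just "dicts differ".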
=== added file 'modules/test_utils/clear_table.py'
--- modules/test_utils/clear_table.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/clear_table.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,4 @@
+
+def clear_table(db, db_table):
+    db(db_table.id).delete()
+    db.commit()
=== added file 'modules/test_utils/find_JSON_format_data_structure.py'
--- modules/test_utils/find_JSON_format_data_structure.py 1970-01-01 00:00:00 +0000
+++ modules/test_utils/find_JSON_format_data_structure.py 2011-09-05 13:55:13 +0000
@@ -0,0 +1,54 @@
+
+import re
+from json.decoder import JSONDecoder
+
+__all__ = (
+    "not_found",
+    "cannot_parse_JSON",
+    "find_JSON_format_data_structure"
+)
+
+def not_found(name, string):
+    raise Exception(
+        u"Cannot find %s in %s" % (name, string)
+    )
+
+def cannot_parse_JSON(string):
+    raise Exception(
+        u"Cannot parse JSON: '%s'" % string
+    )
+
+def find_JSON_format_data_structure(
+    string,
+    name,
+    found,
+    not_found,
+    cannot_parse_JSON
+):
+    """Finds a named JSON-format data structure in the string.
+
+    The name can be any string.
+    The pattern "name = " will be looked for in the string,
+    and the data structure following it parsed and returned as a python
+    data structure.
+    """
+    try:
+        name_start = string.index(name)
+    except ValueError:
+        not_found(name, string)
+    else:
+        name_length = len(name)
+        name_end = name_start + name_length
+
+        _, remaining = re.Scanner([
+            (r"\s*=\s*", lambda scanner, token: None)
+        ]).scan(
+            string[name_end:]
+        )
+
+        try:
+            data, end_position = JSONDecoder().raw_decode(remaining)
+        except ValueError:
+            cannot_parse_JSON(remaining)
+        else:
+            found(data)
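This helper lets the GIS tests pull a structure like `S3.gis.layers_kml = [...]` straight out of generated JavaScript: find the name, skip the `=`, then let a raw JSON decode stop at the end of the value. A minimal Python 3 sketch of the same approach (the `script` string and function name here are illustrative):

```python
import json
import re

def find_json_assignment(text, name):
    """Locate "<name> = <json>" inside script text and return the parsed value."""
    start = text.index(name) + len(name)       # raises ValueError if absent
    match = re.match(r"\s*=\s*", text[start:]) # skip the assignment operator
    remaining = text[start + match.end():]
    # raw_decode parses one JSON value and ignores whatever follows it
    value, _end = json.JSONDecoder().raw_decode(remaining)
    return value

script = 'var x = 1; S3.gis.layers_kml = [{"name": "Nepal", "url": "a.kml"}]; next();'
layers = find_json_assignment(script, "S3.gis.layers_kml")
```

`raw_decode` is the key piece: unlike `json.loads`, it tolerates trailing JavaScript after the parsed value and reports where the value ended.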
=== modified file 'modules/test_utils/run.py'
--- modules/test_utils/run.py 2011-08-11 19:25:52 +0000
+++ modules/test_utils/run.py 2011-09-05 13:55:13 +0000
@@ -1,5 +1,8 @@
 #!python
 
+# capture web2py environment before doing anything else
+
+web2py_environment = dict(globals())
 
 __doc__ = """This script is run from the nose command in the
 application being tested:
@@ -8,249 +11,81 @@
 
 python2.6 ./applications/eden/tests/nose.py <nose arguments>
 
+web2py runs a file which:
+1. Sets up a plugin. This plugin registers itself so nose can use it.
+2. Runs nose programmatically giving it the plugin
+nose loads the tests via the plugin.
+when the plugin loads the tests, it injects the web2py environment.
+
 """
 
-########################################
-#
-# web2py runs a file which:
-# 1. Sets up a plugin. This plugin registers itself so nose can use it.
-# 2. Runs nose programmatically giving it the plugin
-#
-# nose loads the tests via the plugin.
-# when the plugin loads the tests, it injects the web2py environment.
-#
-########################################
-
-
-# @ToDo:
-# in particular, haven't checked that db is being replaced with the test_db
-# (work in progress)
-
-# using --import_models is OK for a single test, but for running a suite,
-# we probably need the test_env function to set up an environment
-
-# @ToDo: Provide an ignore_warns mode so that we can tackle ERRORs 1st
-# but FAILs often give us clues that help us fix ERRORs
-# fixing an error might itself cause a failure, but not be spotted until later.
-
-def use_test_db(db):
-    print "Creating test database..."
-    try:
-        test_db = use_test_db.db
-    except AttributeError:
-        # create test database by copying db
-        test_db_name = "sqlite://testing.sqlite"
-        print "Copying db tables into test database..."
-        test_db = DAL(test_db_name) # Name and location of the test DB file
-        # Copy tables!
-        for tablename in db.tables:
-            table_copy = []
-            for data in db[tablename]:
-                table_copy.append(
-                    copy.copy(data)
-                )
-            test_db.define_table(tablename, *table_copy)
-        use_test_db.db = test_db
-    return test_db
-
-import nose
-
-import copy
-def test_env(
-    imports,
-    application,
-    controller,
-    function,
-    folder,
-    globals,
-    create_test_db = use_test_db,
-    _module_root = "applications",
-):
-    """Sets up the unit-testing environment.
-
-    This involves loading modules as if by web2py.
-    Also we must make a test database.
-
-    """
-    from gluon.globals import Request
-    globals["Request"] = Request
-    request = Request()
-    request.application = application
-    request.controller = controller
-    request.function = function
-    request.folder = folder
-
-    globals["request"] = request
-
-    test_db = db #create_test_db(db)
-
-    import os
-    for import_path, names in imports:
-        #print import_path
-        #module = {"db": test_db}
-        #module.update(globals)
-        module = dict(globals)
-        path_components = [_module_root] + import_path.split(".")
-        file_path = os.path.join(*path_components)+".py"
-        # execfile is used because it doesn't create a module
-        # and doesn't load the module if it exists.
-        execfile(file_path, module)
-        if names is "*":
-            globals.update(module)
-        else:
-            for name in names:
-                globals[name] = module[name]
-
-# -----------------------
-
-import os.path
+import sys
+
+from types import ModuleType
+import nose
+import glob
+import os.path
+import os
+import copy
+from gluon.globals import Request
+import unittest
+
+
+def load_module(application_relative_module_path):
+    import os
+    web2py_relative_module_path = ".".join((
+        "applications", request.application, application_relative_module_path
+    ))
+    imported_module = __import__(web2py_relative_module_path)
+    for step in web2py_relative_module_path.split(".")[1:]:
+        imported_module = getattr(imported_module, step)
+    return imported_module
+
+web2py_environment["load_module"] = load_module
+web2py_env_module = ModuleType("web2py_env")
+web2py_env_module.__dict__.update(web2py_environment)
+sys.modules["web2py_env"] = web2py_env_module
 
 application_name = request.application
-
-model_files_pattern = os.path.join("applications",application_name,"models","*.py")
-import glob
-
-test_env(
-    globals = globals(),
-    application = application_name,
-    controller = "controller",
-    function = "function",
-    folder = "folder",
-    imports = [
-        (application_name+".models.%s" % (module_name[len(model_files_pattern)-4:-3]), "*")
-        for module_name in glob.glob(model_files_pattern)
-    ]
-)
-
-import sys
-log = sys.stderr.write
-
-import unittest
-from itertools import imap
+application_folder_path = os.path.join("applications",application_name)
+
+application = application_name
+controller = "controller"
+function = "function"
+folder = os.path.join(os.getcwd(), "applications", application_name)
+
+web2py_environment["Request"] = Request
+request = Request()
+request.application = application
+request.controller = controller
+request.function = function
+request.folder = folder
+
+web2py_environment["request"] = request
+current.request = request
+
+controller_configuration = OrderedDict()
+for controller_name in ["default"]+glob.glob(
+    os.path.join(application_folder_path, "controllers", "*.py")
+):
+    controller_configuration[controller_name] = Storage(
+        name_nice = controller_name,
+        description = controller_name,
+        restricted = False,
+        module_type = 0
+    )
+
+current.deployment_settings.modules = controller_configuration
 
 test_folders = set()
-
-class Web2pyNosePlugin(nose.plugins.base.Plugin):
-    # see: http://somethingaboutorange.com/mrl/projects/nose/0.11.1/plugins/writing.html
-
-    """This plugin is designed to give the web2py environment to the tests.
-    """
-    score = 0
-    # always enable as this plugin can only
-    # be selected by running this script
-    enabled = True
-
-    def __init__(
-        self,
-        application_name,
-        environment,
-        create_test_db,
-        directory_pattern
-    ):
-        super(Web2pyNosePlugin, self).__init__()
-        self.application_name = application_name
-        self.environment = dict(
-            db = db #create_test_db(db)
-        )
-        self.environment.update(environment)
-        self.directory_pattern = directory_pattern
-
-    def options(self, parser, env):
-        """Register command line options"""
-        return
-        parser.add_option(
-            "--web2py",
-            dest="web2py",
-            action="append",
-            metavar="ATTR",
-            help="Use web2py environment when loading tests"
-        )
-
-    def wantDirectory(self, dirname):
-        return bool(re.search(self.directory_pattern, dirname))
-
-    def wantFile(self, file_name):
-        return file_name.endswith(".py") and any(
-            imap(file_name.__contains__, test_folders)
-        )
-
-    def wantModule(self, module):
-        return False
-
-    def loadTestsFromName(self, file_name, discovered):
-        """Sets up the unit-testing environment.
-
-        This involves loading modules as if by web2py.
-        Also we must have a test database.
-
-        If testing controllers, tests need to set up the request themselves.
-
-        """
-        if file_name.endswith(".py"):
-#            log(file_name)
-
-            # assert 0, file_name
-            # stop
-
-            # Is it possible that the module could load
-            # other code that is using the original db?
-
-            test_globals = self.environment
-
-            # execfile is used because it doesn't create a module
-            # and doesn't load the module into sys.modules if it exists.
-            module_globals = dict(self.environment)
-            execfile(file_name, module_globals)
-
-            import inspect
-            # we have to return something, otherwise nose
-            # will let others have a go, and they won't pass
-            # in the web2py environment, so we'll get errors
-            tests = []
-
-            for name, thing in module_globals.iteritems():
-                if (
-                    # don't bother with globally imported things
-                    name not in test_globals
-                    # unless they have been overridden
-                    or test_globals[name] is not thing
-                ):
-                    if (
-                        isinstance(thing, type)
-                        and issubclass(thing, unittest.TestCase)
-                    ):
-                        # look for test methods
-                        for member_name in dir(thing):
-                            if member_name.startswith("test"):
-                                if callable(getattr(thing, member_name)):
-                                    tests.append(thing(member_name))
-                    elif (
-                        name.startswith("test")
-                        or name.startswith("Test")
-                    ):
-                        if inspect.isfunction(thing):
-                            # things coming from execfile have no module
-                            if thing.__module__ is None:
-                                tests.append(
-                                    nose.case.FunctionTestCase(thing)
-                                )
-            return tests
-        else:
-            return
-
-import re
-
-argv = [
-    #"--verbosity=2",
-    #"--debug=nose"
-]
+argv = []
 
 # folder in which tests are kept
 # non-option arguments (test paths) are made relative to this
-test_root = os.path.join("applications", application_name, "tests", "unit_tests")
+test_root = os.path.join(application_folder_path, "tests", "unit_tests")
+
+current_working_directory = os.getcwd()
 
 disallowed_options = {}
 disallowed_options["-w"] = disallowed_options["--where"] = (
@@ -275,10 +110,10 @@
         argv.append(arg)
     else:
         test_path = arg
-        test_fuller_path = os.path.join(test_root, test_path)
-        test_folders.add(test_fuller_path)
-        if not os.path.exists(test_fuller_path):
-            print "\n", test_fuller_path, "not found"
+        test_folder_fuller_path = os.path.join(test_root, test_path)
+        test_folders.add(test_folder_fuller_path)
+        if not os.path.exists(test_folder_fuller_path):
+            print "\n", test_folder_fuller_path, "not found"
             #sys.exit(1)
 
 # test paths in command line aren't passed, just added to test_folders
@@ -293,14 +128,15 @@
 
 sys.argv[1:] = argv
 
+test_utils = local_import("test_utils")
+
 nose.main(
 # seems at least this version of nose ignores passed in argv
 # argv = argv,
     addplugins = nose.plugins.PluginManager([
-        Web2pyNosePlugin(
+        test_utils.Web2pyNosePlugin(
             application_name,
-            globals(),
-            use_test_db,
+            web2py_environment,
             re.compile(
                 re.escape(os.path.sep).join(
                     (
@@ -311,7 +147,8 @@
                     "[^","]*)*)?)?)?$"
                 )
             )
-        )
+        ),
+        test_folders
     )
     ])
 )
 
=== added file 'private/prepopulate/default/tasks.cfg'
--- private/prepopulate/default/tasks.cfg 1970-01-01 00:00:00 +0000
+++ private/prepopulate/default/tasks.cfg 2011-09-05 13:55:13 +0000
@@ -0,0 +1,18 @@
1##########################################################################
2# Add a list of csv files to import into the system
3# the list of import files is a comma-separated list as follows:
4# "prefix","tablename","csv file name","stylesheet"
5#
6# The csv file is assumed to be in the same directory as this file
7# The style sheet is assumed to be in either of the following directories:
8# static/format/s3csv/"prefix"/
9# static/format/s3csv/
10#
11# For details on how to import data into the system see the following:
12# zzz_1st_run
13# s3Tools::S3BulkImporter
14##########################################################################
15"supply","catalog_item","DefaultItems.csv","supply_items.xsl"
16"supply","catalog_item","StandardItems.csv","supply_items.xsl"
17"hrm","skill","DefaultSkillList.csv","skill.xsl"
18"hrm","competency_rating",DefaultSkillCompetency.csv,competency_rating.xsl
0\ No newline at end of file19\ No newline at end of file
120
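Each non-comment line of tasks.cfg is a comma-separated record of four fields (note the last data line above leaves two fields unquoted, which a CSV reader tolerates). A sketch of parsing it with the stdlib, assuming the four-column layout described in the file's header comment; the function name is hypothetical:

```python
import csv

def parse_tasks_cfg(text):
    """Return [prefix, tablename, csv_file, stylesheet] rows,
    skipping blank lines and '#' comment lines."""
    lines = [line for line in text.splitlines()
             if line.strip() and not line.startswith("#")]
    # csv.reader handles both quoted and unquoted fields
    return list(csv.reader(lines))

sample = ('"supply","catalog_item","DefaultItems.csv","supply_items.xsl"\n'
          '"hrm","competency_rating",DefaultSkillCompetency.csv,competency_rating.xsl\n')
rows = parse_tasks_cfg(sample)
```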
=== removed file 'private/prepopulate/default/tasks.cfg'
--- private/prepopulate/default/tasks.cfg 2011-08-17 15:13:43 +0000
+++ private/prepopulate/default/tasks.cfg 1970-01-01 00:00:00 +0000
@@ -1,18 +0,0 @@
1##########################################################################
2# Add a list of csv file to import into the system
3# the list of import file sis a comma separated list as follows:
4# "prefix","tablename","csv file name","stylesheet"
5#
6# The csv file is assumed to be in the same directory as this file
7# The style sheet is assumed to be in either of the following directories:
8# static/format/s3csv/"prefix"/
9# static/format/s3csv/
10#
11# For details on how to import data into the system see the following:
12# zzz_1st_run
13# s3Tools::S3BulkImporter
14##########################################################################
15"supply","catalog_item","DefaultItems.csv","supply_items.xsl"
16"supply","catalog_item","StandardItems.csv","supply_items.xsl"
17"hrm","skill","DefaultSkillList.csv","skill.xsl"
18"hrm","competency_rating",DefaultSkillCompetency.csv,competency_rating.xsl
190
=== modified file 'static/scripts/S3/s3.gis.climate.js'
--- static/scripts/S3/s3.gis.climate.js 2011-06-15 09:47:54 +0000
+++ static/scripts/S3/s3.gis.climate.js 2011-09-05 13:55:13 +0000
@@ -9,154 +9,372 @@
9 }9 }
10}10}
1111
12
13ClimateDataMapPlugin = function (config) {12ClimateDataMapPlugin = function (config) {
14 var self = this // so no this-clobbering13 var plugin = this // let's be explicit!
15 self.data_type_option_names = config.data_type_option_names14 plugin.data_type_option_names = config.data_type_option_names
16 self.parameter_names = config.parameter_names15 plugin.parameter_names = config.parameter_names
17 self.projected_option_type_names = config.projected_option_type_names16 plugin.year_min = config.year_min
18 self.year_min = config.year_min 17 plugin.year_max = config.year_max
19 self.year_max = config.year_max18
2019 plugin.data_type_label = config.data_type_label
21 self.data_type_label = config.data_type_label20 plugin.overlay_data_URL = config.overlay_data_URL
22 self.projected_option_type_label = config.projected_option_type_label21 plugin.chart_URL = config.chart_URL
2322 delete config
24 self.setup = function () { 23
25 var graphic = new OpenLayers.Layer.Image(24 plugin.setup = function () {
26 'Test Data',25 var overlay_layer = plugin.overlay_layer = new OpenLayers.Layer.Vector(
27 '/eden/climate/climate_image_overlay',26 'Climate data map overlay',
28 new OpenLayers.Bounds(8900000, 3020000, 9850000, 3580000),27 {
29// new OpenLayers.Bounds(-180, -88.759, 180, 88.759),28 isBaseLayer:false,
30 new OpenLayers.Size(249, 139),29 }
31 {30 );
32 // numZoomLevels: 3,31 map.addLayer(overlay_layer);
33 isBaseLayer:false,32
34 opacity: 0.5,33 // selection
35 transparent:true34 OpenLayers.Feature.Vector.style['default']['strokeWidth'] = '2'
36 }35 var selectCtrl = new OpenLayers.Control.SelectFeature(
37 );36 overlay_layer,
38 graphic.events.on({37 {
39 loadstart: function() {38 clickout: true,
40 OpenLayers.Console.log("loadstart");39 toggle: false,
41 },40 multiple: false,
42 loadend: function() {41 hover: false,
43 OpenLayers.Console.log("loadend");42 toggleKey: 'altKey',
44 }43 multipleKey: 'shiftKey',
45 });44 box: true,
46 map.addLayer(graphic);45 onSelect: function (feature) {
46 feature.style.strokeColor = 'black'
47 feature.style.strokeDashstyle = 'dash'
48 overlay_layer.drawFeature(feature)
49 },
50 onUnselect: function (feature) {
51 feature.style.strokeColor = 'none'
52 overlay_layer.drawFeature(feature)
53 },
54 }
55 );
56
57 map.addControl(selectCtrl);
58
59 selectCtrl.activate();
47 }60 }
48 self.addToMapWindow = function (items) {61 plugin.addToMapWindow = function (items) {
49 function toggle_projected_options() {62 var combo_box_size = {
50 $('#projected-options').toggle(63 width: 120,
 51 $('#id_Projected').attr('checked') == 'checked'64 height: 25
65 }
66
67 function make_combo_box(
68 data,
69 fieldLabel,
70 hiddenName
71 ) {
72 var options = []
73 each(
74 data,
75 function (option) {
76 options.push([option, option])
77 }
52 )78 )
53 }79 var combo_box = new Ext.form.ComboBox({
54 var climate_data_type_options = [];80 fieldLabel: fieldLabel,
55 each(81 hiddenName: hiddenName,
56 self.data_type_option_names,82 store: new Ext.data.SimpleStore({
57 function (option_name) {83 fields: ['name', 'option'],
58 var radio_button = new Ext.form.Radio({84 data: options
59 name: "data-type",
60 id: "id_%s" % option_name,
61 boxLabel: option_name,
62 checked: option_name == self.data_type_option_names[0],
63 })
64 radio_button.on({
65 change: toggle_projected_options
66 })
67 climate_data_type_options.push(radio_button)
68 }
69 )
70 var projected_options = [];
71 each(
72 self.projected_option_type_names,
73 function (projected_option_type_name) {
74 projected_options.push(
75 new Ext.form.Radio({
76 name: "projected-option-type",
77 id: "id_%s" % projected_option_type_name,
78 boxLabel: projected_option_type_name,
79 })
80 )
81 }
82 )
83 var projected_options_widget = new Ext.form.FieldSet({
84 title: self.projected_option_type_label,
85 items: [
86 new Ext.form.CheckboxGroup({
87 items: projected_options,
88 xtype: 'checkboxgroup',
89 columns: 1
90 })
91 ]
92 })
93
94 var climate_data_type_options = new Ext.form.FieldSet({
95 title: self.data_type_label,
96 items: [
97 new Ext.form.RadioGroup({
98 items: climate_data_type_options,
99 columns: 1,
100 })
101 ]
102 })
103
104 var parameter_options = [];
105 each(
106 self.parameter_names,
107 function (parameter_name) {
108 var checkbox = new Ext.form.Checkbox({
109 name: parameter_name,
110 id: "id_%s" % parameter_name,
111 boxLabel: parameter_name,
112 })
113 parameter_options.push(checkbox)
114 }
115 )
116
117 var parameters_widget = new Ext.form.FieldSet({
118 title: "Parameters",
119 items: [
120 new Ext.form.CheckboxGroup({
121 items: parameter_options,
122 xtype: 'checkboxgroup',
123 columns: 1
124 })
125 ]
126 })
127
128 var period_widget = new Ext.form.FieldSet({
129 title: "Period",
130 items: [
131 new Ext.form.NumberField({
132 fieldLabel: "From",
133 minValue: self.year_min,
134 maxValue: self.year_max,
135 value: self.year_min
136 }),85 }),
137 new Ext.form.NumberField({86 displayField: 'name',
138 fieldLabel: "To",87 typeAhead: true,
139 minValue: self.year_min,88 mode: 'local',
140 maxValue: self.year_max,89 triggerAction: 'all',
141 value: self.year_max90 emptyText:'Choose...',
142 })91 selectOnFocus:true
143 ]92 })
144 })93 combo_box.setSize(combo_box_size)
94 return combo_box
95 }
96 var data_type_combo_box = make_combo_box(
97 plugin.data_type_option_names,
98 'Data type',
99 'data_type'
100 )
101
102 var variable_combo_box = make_combo_box(
103 plugin.parameter_names,
104 'Variable',
105 'parameter'
106 )
107
108 var statistic_combo_box = make_combo_box(
109 ['Minimum','Maximum','Average'],
110 'Aggregate values',
111 'statistic'
112 )
113
145 var climate_data_panel = new Ext.FormPanel({114 var climate_data_panel = new Ext.FormPanel({
146 id: 'climate_data_panel',115 id: 'climate_data_panel',
147 title: 'Climate data',116 title: 'Climate data map overlay',
148 collapsible: true,117 collapsible: true,
149 collapseMode: 'mini',118 collapseMode: 'mini',
150 items: [{119 items: [{
151 region: 'center',120 region: 'center',
152 items: [121 items: [
153 climate_data_type_options,122 new Ext.form.FieldSet({
154 projected_options_widget,123 title: 'Data set',
155 parameters_widget,124 items: [
156 period_widget125 data_type_combo_box,
126 variable_combo_box
127 ]
128 }),
129 new Ext.form.FieldSet({
130 title: 'Period',
131 items: [
132 new Ext.form.NumberField({
133 fieldLabel: 'From',
134 name: 'from_date',
135 minValue: plugin.year_min,
136 maxValue: plugin.year_max,
137 value: plugin.year_min
138 }),
139 new Ext.form.NumberField({
140 fieldLabel: 'To',
141 name: 'to_date',
142 minValue: plugin.year_min,
143 maxValue: plugin.year_max,
144 value: plugin.year_max,
145 size: combo_box_size
146 })
147 ]
148 }),
149 new Ext.form.FieldSet({
150 title: 'Map overlay colours',
151 items: [
152 statistic_combo_box,
153 ]
154 })
157 ]155 ]
158 }]156 }]
159 });157 });
158
159 var update_map_layer_button = new Ext.Button({
160 text: 'Update map layer',
161 disabled: true,
162 handler: function() {
163 plugin.overlay_layer.destroyFeatures()
164
165 // request new features
166 var form_values = climate_data_panel.getForm().getValues()
167
168 // add new features
169 $.ajax({
170 url: plugin.overlay_data_URL,
171 data: {
172 data_type: form_values.data_type,
173 statistic: form_values.statistic,
174 parameter: form_values.parameter,
175 from_date: form_values.from_date,
176 to_date: form_values.to_date
177 },
178 success: function(feature_data, status_code) {
179 function Vector(geometry, attributes, style) {
180 style.strokeColor= 'none'
181 style.fillOpacity= 0.8
182 style.strokeWidth = 1
183
184 return new OpenLayers.Feature.Vector(
185 geometry, attributes, style
186 )
187 }
188 function Polygon(components) {
189 return new OpenLayers.Geometry.Polygon(components)
190 }
191 function Point(lon, lat) {
192 var point = new OpenLayers.Geometry.Point(lat, lon)
193 return point.transform(
194 S3.gis.proj4326,
195 S3.gis.projection_current
196 )
197 }
198 function LinearRing(point_list) {
199 point_list.push(point_list[0])
200 return new OpenLayers.Geometry.LinearRing(point_list)
201 }
202 eval('var data = '+feature_data)
203 $('#id_key_min_value').html(data.min)
204 $('#id_key_max_value').html(data.max)
205 plugin.overlay_layer.addFeatures(data.features)
206 }
207 });
208 }
209 });
210
211 function enable_update_layer_button_if_form_complete(
212 box, record, index
213 ) {
214 if (
215 !!data_type_combo_box.getValue() &&
216 !!variable_combo_box.getValue() &&
217 !!statistic_combo_box.getValue()
218 ) {
219 update_map_layer_button.enable()
220 }
221 }
222 data_type_combo_box.on(
223 'change',
224 enable_update_layer_button_if_form_complete
225 );
226 variable_combo_box.on(
227 'change',
228 enable_update_layer_button_if_form_complete
229 );
230 statistic_combo_box.on(
231 'change',
232 enable_update_layer_button_if_form_complete
233 );
234 climate_data_panel.addButton(update_map_layer_button)
235
236 var show_chart_button = new Ext.Button({
237 text: 'Show chart',
238 disabled: true,
239 handler: function() {
240 // create URL
241 var place_ids = []
242 each(
243 plugin.overlay_layer.selectedFeatures,
244 function (feature) {
245 place_ids.push(feature.data.id)
246 }
247 )
248 var form_values = climate_data_panel.getForm().getValues(),
249 data_type = form_values.data_type,
250 parameter = form_values.parameter,
251 from_date = form_values.from_date,
252 to_date = form_values.to_date,
253 place_ids = place_ids;
254
255 var spec = JSON.stringify({
256 data_type: data_type,
257 parameter: parameter,
258 from_date: from_date,
259 to_date: to_date,
260 place_ids: place_ids
261 })
262
263 var chart_name = [
264 data_type, parameter,
265 'from', from_date,
266 'to', to_date,
267 'for', (
268 place_ids.length < 3?
269 'places: '+ place_ids:
270 place_ids.length+' places'
271 )
272 ].join(' ')
273
274 // get hold of a chart manager instance
275 if (!plugin.chart_window) {
276 var chart_window = plugin.chart_window = window.open(
277 'climate/chart_popup.html',
278 'chart',
279 'width=660,height=600,toolbar=0,resizable=0'
280 )
281 chart_window.onload = function () {
282 chart_window.chart_manager = new chart_window.ChartManager(plugin.chart_URL)
283 chart_window.chart_manager.addChartSpec(spec, chart_name)
284 }
285 chart_window.onbeforeunload = function () {
286 delete plugin.chart_window
287 }
288 } else {
289 // some duplication here:
290 plugin.chart_window.chart_manager.addChartSpec(spec, chart_name)
291 }
292
293 }
294 });
295
296
297 function enable_show_chart_button_if_data_and_variable_selected(
298 box, record, index
299 ) {
300 if (
301 !!data_type_combo_box.getValue() &&
302 !!variable_combo_box.getValue()
303 ) {
304 show_chart_button.enable()
305 }
306 }
307
308 data_type_combo_box.on(
309 'change',
310 enable_show_chart_button_if_data_and_variable_selected
311 );
312
313 variable_combo_box.on(
314 'change',
315 enable_show_chart_button_if_data_and_variable_selected
316 );
317
318
319
320 climate_data_panel.addButton(show_chart_button)
321
160 items.push(climate_data_panel)322 items.push(climate_data_panel)
323
324 var key_panel = new Ext.Panel({
325 id: 'key_panel',
326 title: 'Key',
327 collapsible: true,
328 collapseMode: 'mini',
329 items: [
330 {
331 layout: {
332 type: 'table',
333 columns: 3,
334 },
335 defaults: {
336 width: '100%',
337 height: 20,
338 style: 'margin: 10px'
339 },
The diff has been truncated for viewing.
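One note on the AJAX success handler in the overlay code above: it runs `eval('var data = ' + feature_data)` to turn the response text into an object, which executes whatever the server returns. Parsing the response strictly as JSON (`JSON.parse` in the browser) avoids that. The same trade-off, sketched in Python with illustrative sample data:

```python
import json

# Sample payload in the shape the overlay handler expects
feature_data = '{"min": 0.5, "max": 9.25, "features": []}'

# Unsafe pattern (the analogue of eval('var data = ' + feature_data)):
# eval(feature_data) would execute arbitrary expressions from the response.

# Safe: json.loads accepts only JSON literals, never code.
data = json.loads(feature_data)
```

With a strict parser, a compromised or buggy endpoint can at worst produce a parse error rather than run code in the client.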