Merge lp:~mike-amy/sahana-eden/climate into lp:sahana-eden

Proposed by Mike Amy
Status: Superseded
Proposed branch: lp:~mike-amy/sahana-eden/climate
Merge into: lp:sahana-eden
Diff against target: 7765 lines (+3996/-2732)
55 files modified
controllers/climate.py (+159/-96)
deployment-templates/models/000_config.py (+1/-1)
models/03_gis.py (+2/-0)
models/climate.py (+5/-0)
modules/ClimateDataPortal/MapPlugin.py (+430/-0)
modules/ClimateDataPortal/__init__.py (+201/-0)
modules/ClimateDataPortal/import_NetCDF_readings.py (+131/-0)
modules/ClimateDataPortal/import_stations.py (+53/-0)
modules/ClimateDataPortal/import_tabbed_readings.py (+154/-0)
modules/s3/s3gis.py (+784/-1031)
modules/test_utils/AddedRole.py (+25/-0)
modules/test_utils/Change.py (+25/-0)
modules/test_utils/ExpectSessionWarning.py (+14/-0)
modules/test_utils/ExpectedException.py (+13/-0)
modules/test_utils/InsertedRecord.py (+19/-0)
modules/test_utils/Web2pyNosePlugin.py (+106/-0)
modules/test_utils/__init__.py (+11/-1)
modules/test_utils/assert_equal.py (+60/-0)
modules/test_utils/clear_table.py (+4/-0)
modules/test_utils/find_JSON_format_data_structure.py (+54/-0)
modules/test_utils/run.py (+76/-239)
private/prepopulate/default/tasks.cfg (+0/-18)
static/scripts/S3/s3.gis.climate.js (+352/-134)
tests/climate/__init__.py (+101/-0)
tests/nose.py (+2/-2)
tests/unit_tests/gis/basic_map.html (+0/-91)
tests/unit_tests/gis/bing.html (+0/-62)
tests/unit_tests/gis/feature_queries.html (+0/-52)
tests/unit_tests/gis/google.html (+0/-67)
tests/unit_tests/gis/map_with_layers.html (+0/-150)
tests/unit_tests/gis/s3gis.py (+0/-417)
tests/unit_tests/gis/testgis.cmd (+0/-8)
tests/unit_tests/gis/true_code_paths.html (+0/-302)
tests/unit_tests/gis/yahoo.html (+0/-61)
tests/unit_tests/modules/s3/s3gis/BingLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/CommonScripts.py (+47/-0)
tests/unit_tests/modules/s3/s3gis/FeatureLayer.py (+25/-0)
tests/unit_tests/modules/s3/s3gis/FeatureQueries.py (+44/-0)
tests/unit_tests/modules/s3/s3gis/GPXLayer.py (+29/-0)
tests/unit_tests/modules/s3/s3gis/GeoJSONLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/GeoRSSLayer.py (+74/-0)
tests/unit_tests/modules/s3/s3gis/GoogleLayer.py (+84/-0)
tests/unit_tests/modules/s3/s3gis/KMLLayer.py (+54/-0)
tests/unit_tests/modules/s3/s3gis/LayerFailures.py (+117/-0)
tests/unit_tests/modules/s3/s3gis/OpenStreetMap.py (+31/-0)
tests/unit_tests/modules/s3/s3gis/TMSLayer.py (+30/-0)
tests/unit_tests/modules/s3/s3gis/TrueCodePaths.py (+307/-0)
tests/unit_tests/modules/s3/s3gis/UserInterface.py (+7/-0)
tests/unit_tests/modules/s3/s3gis/WFSLayer.py (+31/-0)
tests/unit_tests/modules/s3/s3gis/WMSLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/YahooLayer.py (+48/-0)
tests/unit_tests/modules/s3/s3gis/__init__.py (+68/-0)
tests/unit_tests/modules/s3/s3rest.py (+6/-0)
tests/unit_tests/modules/test_utils/find_JSON_format_data_structure.py (+52/-0)
views/climate/chart_popup.html (+76/-0)
To merge this branch: bzr merge lp:~mike-amy/sahana-eden/climate
Reviewer: Fran Boon
Status: Needs Fixing
Review via email: mp+74043@code.launchpad.net

This proposal has been superseded by a proposal from 2011-09-05.

Description of the change

Added a NetCDF importer for climate data and improved the performance of overlay layer generation.
Fixed tests and updated them to use the "current" global variable.
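The "current" global mentioned here is web2py's thread-local holder for request-scoped objects. A minimal self-contained sketch of the pattern (in a real app you would `from gluon import current`; the attribute and lookup names below are illustrative):

```python
# Sketch of the web2py "current" pattern: request-scoped globals (db, T,
# request) hang off one shared object instead of being threaded through
# every call. `Current` here stands in for gluon's real implementation.
class Current(object):
    """Holder for request-scoped globals."""

current = Current()

def set_up_request(db):
    # A model file would normally do: current.db = db
    current.db = db

def lookup_station_count():
    # Library code no longer needs db passed in explicitly:
    db = current.db
    return db["climate_station_count"]

set_up_request({"climate_station_count": 3})
```
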

Revision history for this message
Fran Boon (flavour) wrote :

views/climate/chart_popup.html
Please {{include jquery.html}} instead of hardcoding the jquery version.
You are already a point version out & we'll shortly go up to the newly released 1.6.3...

This should be reverted:
=== removed file 'private/prepopulate/default/tasks.cfg'

Can remove the r=request from:
+# self.url = "%s/%s" % (URL(r=request, c="default", f="download"),

I also see a very large number of single quotes in the Python code & some double quotes in the javascript....I'm happy to clean up the odd one that slips through, but this number seems excessive for me.

review: Needs Fixing
lp:~mike-amy/sahana-eden/climate updated
2308. By Mike Amy

fixed merge problems, cheers

2309. By Mike Amy

merged from trunk

2310. By Mike Amy

reverted references to self.debug when session isn't available and improved cache folder detection for KMLLayers

2311. By Mike Amy

merged from trunk

2312. By Mike Amy

added module config for climate to 000_config.py

2313. By Mike Amy

added parameter to the netcdf importer to pick which parameter is to be imported from the .nc file

2314. By Mike Amy

Quick import option

2315. By Mike Amy

Report status when loading image

2316. By Mike Amy

minor formatting

2317. By Mike Amy

month filter widget

2318. By Mike Amy

pre-demo commit so I can revert

2319. By Mike Amy

check in a fix before merging from trunk

2320. By Mike Amy

merged from trunk, trying to get postgres (or anything) working.

2321. By Mike Amy

merged from trunk

2322. By Mike Amy

Fix for the complaints about the missing relation 'scheduler_task'

2323. By Mike Amy

Map working again after moving the climate models out of their module.

2324. By Mike Amy

can add parameter tables whilst server is running

2325. By Mike Amy

Updated importing stations script.

2326. By Mike Amy

importing observed readings script updated

2327. By Mike Amy

Updated NetCDF (Gridded data) importer script

2328. By Mike Amy

added python script for easier running of scripts with web2py

2329. By Mike Amy

run.py uses script paths relative from itself

2330. By Mike Amy

Tabbed data importer script updated for dynamic database tables

2331. By Mike Amy

Tabbed data importer script updated for dynamic database tables

2332. By Mike Amy

Made station range parameters more explicit in tabbed data import script

2333. By Mike Amy

Typo fix

2334. By Mike Amy

map overlay updated for dynamic tables

2335. By Mike Amy

Split out climate data result caching code

2336. By Mike Amy

aggregation/statistic name configuration moved under control of the map plugin

2337. By Mike Amy

aggregation/statistic name configuration moved under control of the map plugin

2338. By Mike Amy

Chart code updated to accept but ignore month filter and aggregation name

2339. By Mike Amy

Chart is shown on loading climate data portal page

2340. By Mike Amy

Chart is shown on loading climate data portal page

2341. By Mike Amy

Chart is shown on loading climate data portal page

2342. By Mike Amy

updated commands for removal of type field from the sample tables

2343. By Mike Amy

reverted the feeding of web2py's import_from_csv code, as it doesn't make things any faster, which is pretty lame.

2344. By Mike Amy

Use raw SQL inserts to speed up the import of the NetCDF data.
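The switch from per-row DAL inserts to raw SQL can be sketched like this; sqlite3 stands in for the real Postgres/Eden database, and the table and column names are illustrative:

```python
import sqlite3

# Batched raw SQL insert, as opposed to one ORM round-trip per reading.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE rainfall_mm (place_id INTEGER, time_period INTEGER, value REAL)"
)

# Fake NetCDF readings: 3 places x 12 months
readings = [(place, month, 1.5 * month) for place in range(3) for month in range(12)]

# One prepared statement executed over the whole batch:
conn.executemany(
    "INSERT INTO rainfall_mm (place_id, time_period, value) VALUES (?, ?, ?)",
    readings,
)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM rainfall_mm").fetchone()[0]
```
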

2345. By Mike Amy

Don't need to delete web2py's table definitions, let it do so itself.

2346. By Mike Amy

Added multi-column uniqueness constraint on the sample table definitions table

2347. By Mike Amy

DSL module, UI key scale changes.

2348. By Mike Amy

Popups show place info and observation data

2349. By Mike Amy

Note about styling the popups (they'll probably need it).

2350. By Mike Amy

Packed UI together and added comparison interface. Layer tree collapses to leave room.

2351. By Mike Amy

Freeform and comparison query UI

2352. By Mike Amy

Changed year and month to combo box.

2353. By Mike Amy

Fixed a bug in the stringification of ToDate/FromDate and fixed the UI so that the dates get sent properly again

2354. By Mike Amy

Fixed the caching which wasn't using the str(expression), fixed some stringification problems. Expressions are added to the data for verification.

2355. By Mike Amy

Comparisons and free-form queries work. Values come back with meaningful units.

2356. By Mike Amy

Freeform query updates show the position of syntax errors

2357. By Mike Amy

DSL binary operators work with scalars as well as R data.frames

2358. By Mike Amy

DSL binary operators work with scalars as well as R data.frames

2359. By Mike Amy

numbers can be expressed as displacements, e.g. delta mm of rainfall

2360. By Mike Amy

Numbers don't have to be positive, unless they are marked as such.

2361. By Mike Amy

DSL Error reporting

2362. By Mike Amy

Charts now work from queries. Had to disable year-matching in queries to get sensible years.

2363. By Mike Amy

Fixed a weirdness in postgres where modulus is negative. Month filter now works for the map.
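The "weirdness" is that PostgreSQL's `%` truncates toward zero, so the result takes the dividend's sign (`-2 % 12` is `-2` in SQL, unlike Python where it is `10`). A sketch of the usual portable fix, with a helper emulating SQL's behaviour:

```python
def sql_mod(a, b):
    # Emulate PostgreSQL's %: truncate the quotient toward zero, so the
    # remainder keeps the dividend's sign.
    return a - int(a / b) * b

def month_of(time_period):
    # Fold any sign back into the range 0..11, the standard
    # ((n % 12) + 12) % 12 idiom that also works in SQL.
    return (sql_mod(time_period, 12) + 12) % 12
```
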

2364. By Mike Amy

Chart image size control

2365. By Mike Amy

Changed URL for place data to cope with missing trailing slash e.g. <host>/eden/climate

2366. By Mike Amy

Moved R process creation inside MapPlugin.__init__

2367. By Mike Amy

Firefox fixes

2368. By Mike Amy

rpy2 missing reference fix

2369. By Mike Amy

fixed missing import, show exact colour scale, charts understand monthly aggregation, charts display legends

2370. By Mike Amy

basic data purchase management screen

2371. By Mike Amy

Chart and data purchase popup URLs are explicitly passed.

2372. By Mike Amy

More discernible colour range, fixed incorrect colour problem

2373. By Mike Amy

More discernible colour range

2374. By Mike Amy

Graph date fixes, projected data importer, key scale changes.

2375. By Mike Amy

merged Michael's changes to add purchasing system, moved colour scale out of panel, above map

2376. By Mike Amy

Moved the freeform query to an editable map legend. Renamed FromDate, ToDate as From/To for easier reading.

2377. By Mike Amy

Added filter box widget, added error highlighting and error tooltips, reformatted climate controller file for clarity. Fixed a bug in the parsing of units (trailing space upsets pattern matcher).

2378. By Mike Amy

fixed a bug regarding incorrect end of year if month unspecified, which caused values for 1960-1960 to come out wrong

2379. By Mike Amy

Graphs: improved axis labels, show ° Celsius, legend are laid out better. Fixed a month filtering problem to do with incorrect month offset and postgres modulo weirdness with negative numbers.

2380. By Mike Amy

Legend wraps lines and lays itself out well.

2381. By Mike Amy

Added detailed linear regression (best-fit) line information to the legend.

2382. By Mike Amy

Moved legend below the graph which seems more common a placement. Moving to the right side seems more common, but is much more difficult.

2383. By Mike Amy

Better error reporting, accepts ints before AST nodes in expressions, better explanation of affine number error, map can be full window, query, filter and full window mode can be controlled by request.vars

2384. By Mike Amy

Added script for server-side rendering and screenshots via window.print(). The full window map removes unnecessary widgets and generally makes itself more printable.

2385. By Mike Amy

Printable Map PNG image downloads successfully

2386. By Mike Amy

Added observation station marker layer

2387. By Mike Amy

Added more space on to the graph legends, added a note about the screenshot script, started on KML shape file handling

2388. By Mike Amy

Added naive region detection for places. Detection is slow but subsequent filtering is fast

2389. By Mike Amy

Nepal districts KML load (hardcoded), moved place methods into new place class

2390. By Mike Amy

Removed unused place-space-matching algorithm code (asynchronous testing solves the UI lockup).

2391. By Mike Amy

within() filter function accepts multiple region names in an OR-like relationship

2392. By Mike Amy

Implemented a point simplification algorithm for Vector layers which makes them render much faster by removing points that make less than a pixel difference. Implemented an algorithm to speed up the point-in-region detection by 2.5 orders of magnitude. Vector layers run much smoother.
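The point simplification idea can be sketched as follows; this is an illustrative tolerance-based decimation, not the branch's actual code:

```python
def simplify(points, tolerance):
    """Drop consecutive points that move less than `tolerance` (e.g. one
    pixel at the current zoom) from the last kept point: a cheap,
    order-preserving cousin of Douglas-Peucker."""
    if not points:
        return []
    kept = [points[0]]
    for x, y in points[1:]:
        kx, ky = kept[-1]
        if abs(x - kx) >= tolerance or abs(y - ky) >= tolerance:
            kept.append((x, y))
    # Always keep the final point so the shape closes correctly.
    if kept[-1] != points[-1]:
        kept.append(points[-1])
    return kept
```
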

2393. By Mike Amy

Changes to the places file (previously 1.6MB/250KB compressed) JSON format to allow zlib compression down to 145KB/6KB compressed.
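A sketch of the two-part idea: restructure the JSON so it compresses well (e.g. integer coordinate deltas instead of repeated floats), then zlib-compress it for transfer. The field names below are illustrative, not the branch's actual format:

```python
import json
import zlib

places = {
    "units_per_degree": 1000,
    # First point absolute, then integer millidegree deltas: small
    # repetitive integers compress far better than long float literals.
    "points": [[28500, 84100], [3, -2], [1, 4], [-2, 1]],
}

# Compact separators drop the whitespace json.dumps adds by default.
raw = json.dumps(places, separators=(",", ":")).encode("utf-8")
packed = zlib.compress(raw, 9)

# The round trip is lossless.
restored = json.loads(zlib.decompress(packed).decode("utf-8"))
```
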

2394. By Mike Amy

ignore places that have been filtered out by being outside of Nepal

2395. By Mike Amy

NetCDF imports generate a CSV file suitable for postgres' COPY command which is faster than INSERTS. Expression grid sizes are returned for simple expressions. The Map overlay uses these grid sizes to draw the correct size squares. Colours have been changed as per DHM request.
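The CSV-for-COPY step can be sketched like this; the table and column names are illustrative, and the psycopg2 call is shown only as a comment:

```python
import csv
import io

def write_copy_csv(readings, out):
    # Write (place_id, time_period, value) rows in a form PostgreSQL's
    # COPY ... WITH CSV can bulk-load, which is faster than INSERTs.
    writer = csv.writer(out)
    for place_id, time_period, value in readings:
        writer.writerow([place_id, time_period, "%.6f" % value])

buf = io.StringIO()
write_copy_csv([(1, 612, 12.5), (2, 612, 0.0)], buf)
buf.seek(0)
# Then, with psycopg2, something along the lines of:
#   cursor.copy_expert(
#       "COPY rainfall_mm (place_id, time_period, value) FROM STDIN WITH CSV",
#       buf)
```
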

2396. By Mike Amy

R quietly imports libraries we need (trying to avoid hangs due to GIL mishandling in R's print back via python)

2397. By Mike Amy

escape non-alphanumeric characters in climate data table names for the parser regular expression

2398. By Mike Amy

Locating places in spaces now depends upon places and spaces being loaded rather than arbitrary timing.

2399. By Mike Amy

Added PreviousDecember handling, fixed error handling. removed some dead code in the controller. Added error handling for when linear regressions are not possible. Added grid size arg to add_table command.

2400. By Mike Amy

Added tooltips, fixed a bug that caused no data when no months were specified (came from PreviousDecember feature), fixed a bug that was stopping the webpage loading the initial filter and query, if given.

2401. By Mike Amy

Fixed off-by-one error on the colour map length which was causing 0 values to not be shown
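The shape of this off-by-one can be sketched with an illustrative value-to-colour mapping (not the branch's actual code): scaling by the number of colours lets the maximum land on index `n`, and naive fixes for that can push the minimum out of range instead; clamping keeps both endpoints drawable.

```python
def colour_index(value, value_min, value_max, n_colours):
    """Map value onto 0..n_colours-1 without losing either endpoint."""
    if value_max == value_min:
        # Degenerate range: everything gets the first colour.
        return 0
    position = (value - value_min) / float(value_max - value_min)
    # Without the clamp, value == value_max would yield index n_colours.
    return min(int(position * n_colours), n_colours - 1)
```
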

2402. By Mike Amy

Fixed a bug that was stopping the reading of the initial expression, fixed a bug that allowed spaces in the filter box to be interpreted as a filter expression.

2403. By Mike Amy

Fixed the markers to show station data and the display of values when the range of values is zero

2404. By Mike Amy

Better error explanations, in particular connection errors.

2405. By Mike Amy

added CSV download controller, changed printable map image to use os.system, cache the images and removed the .png file extension addition (cache already does it), started on real time database integration. index.html has less in it.

2406. By Mike Amy

Changed daily data importer. Hide daily data sets except for download.

2407. By Mike Amy

Implemented CSV download of purchased data

2408. By Mike Amy

Trailing space on downloaded CSV file

2409. By Mike Amy

changed value precision to 6 d.p.

2410. By Mike Amy

Shape files are not visible by default

2411. By Mike Amy

Fixed a bug which caused an error in the SQL when no months are selected for annual aggregation. Display a message confirming successful response with no data if no data is returned.

2412. By Mike Amy

Downloaded purchased data does not show days in monthly data. Removed the obsolete buy data button and template.

2413. By Mike Amy

Fixed a floating point rounding error in the point-in-region detection code (taken from OpenLayers). The shape file layer only becomes invisible once loaded.
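The point-in-region test is even-odd ray casting, in the spirit of OpenLayers' `Polygon.containsPoint`; a sketch of the standard algorithm (the rounding of the crossing's x-intercept is where floating-point error can flip the answer for points near an edge):

```python
def point_in_polygon(x, y, polygon):
    """Even-odd ray casting: cast a ray to the left of (x, y) and count
    edge crossings; an odd count means the point is inside."""
    inside = False
    j = len(polygon) - 1
    for i in range(len(polygon)):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Edge straddles the horizontal line through y?
        if (yi > y) != (yj > y):
            # x-intercept of the edge at height y; precision-sensitive.
            x_cross = (xj - xi) * (y - yi) / (yj - yi) + xi
            if x < x_cross:
                inside = not inside
        j = i
    return inside
```
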

2414. By Mike Amy

Added region labels

2415. By Mike Amy

Show labels for regions.

2416. By Mike Amy

Hover delays

2417. By Mike Amy

Fixed broken place deselection

2418. By Mike Amy

Added Nepal Gov logo for putting all over the place

2419. By Mike Amy

Nepal Gov logo shows on the map overlay and on the charts as a watermark

2420. By Mike Amy

changed the chart popup window input to a text area to accommodate more text

2421. By Mike Amy

All places are listed in the chart legend, in order.

2422. By Mike Amy

Added user manual

2423. By Mike Amy

merged from trunk

2424. By Mike Amy

Chart labels are editable

2425. By Mike Amy

Years for which no data exist are greyed out but still selectable in the year comboboxes

2426. By Mike Amy

Years for which no data exist are greyed out but still selectable in the year comboboxes

2427. By Mike Amy

Removed side menu and created a climate menu in the top menu bar

2428. By Mike Amy

worked around a problem in openlayers where dragging the select feature control interferes with focussing on custom controls

2429. By Mike Amy

Fixed a performance problem where zooming in and out slowed everything down. Point samples are now shown as circles

2430. By Mike Amy

workaround for some kind of (new) f.p. precision loss in the striping algorithm.

2431. By Mike Amy

Added a quick region filter combo box.

2432. By Mike Amy

Added ability to specify zoom & coords via request parameters

2433. By Mike Amy

Typo in grid sizing code

2434. By Mike Amy

removed unnecessary call to map.updateSize()

2435. By Mike Amy

Purchase field renamings

2436. By Mike Amy

climate/purchase shows the username, changed purpose field to 'Receipt number / Student ID / other notes'

2437. By Mike Amy

Made prices editable

2438. By Mike Amy

Added download as CSV feature for map overlay data

2439. By Mike Amy

Added user manual

2440. By Mike Amy

Added model descriptions pdf

2441. By Mike Amy

Improvements to chart popup window.

2442. By Mike Amy

Previous commit was botched by bzr (committed multiple files when only one was selected):

Previous commit messages, per file:

climate.js
Fixed colour key range. Fixed error reporting.
Can download csv data. Ratios come out as percentages.

import_NetCDF_readings.py
Fixed floating point rounding errors affecting place matching.

MapPlugin, Cache.py:
Made caching filepaths application dependent.
Grid size detection and error reporting.
csv time series data function.

2443. By Mike Amy

mm -> precipitation mm

2444. By Mike Amy

allow units mm and precipitation mm

2445. By Mike Amy

NetCDF importer changes

2446. By Mike Amy

ignore values for places not on the map (was affecting the limits)

2447. By Mike Amy

Fixed bug that stopped the printable image showing

2448. By Mike Amy

detection of xvfb

2449. By Mike Amy

detection of xvfb

2450. By Mike Amy

proper xvfb command formatting

2451. By Mike Amy

mid-merge of changes from competition branch

2452. By Mike Amy

mid-merge of changes from competition branch

2453. By Mike Amy

mid-merge of changes from competition branch

2454. By Mike Amy

committing to be able to revert

Unmerged revisions

2454. By Mike Amy

committing to be able to revert

2453. By Mike Amy

mid-merge of changes from competition branch

2452. By Mike Amy

mid-merge of changes from competition branch

2451. By Mike Amy

mid-merge of changes from competition branch

2450. By Mike Amy

proper xvfb command formatting

2449. By Mike Amy

detection of xvfb

2448. By Mike Amy

detection of xvfb

2447. By Mike Amy

Fixed bug that stopped the printable image showing

2446. By Mike Amy

ignore values for places not on the map (was affecting the limits)

2445. By Mike Amy

NetCDF importer changes

Preview Diff

1=== modified file 'controllers/climate.py'
2--- controllers/climate.py 2011-08-06 18:24:53 +0000
3+++ controllers/climate.py 2011-09-05 13:55:13 +0000
4@@ -2,92 +2,14 @@
5
6 module = "climate"
7
8-class ClimateDataMapPlugin(object):
9- def __init__(self,
10- data_type_option_names,
11- parameter_names,
12- projected_option_type_names,
13- year_min,
14- year_max
15- ):
16- self.data_type_option_names = data_type_option_names
17- self.parameter_names = parameter_names
18- self.projected_option_type_names = projected_option_type_names
19- self.year_min = year_min
20- self.year_max = year_max
21-
22- def extend_gis_map(self, add_javascript, add_configuration):
23- add_javascript("scripts/S3/s3.gis.climate.js")
24- add_configuration(
25- SCRIPT(
26- "\n".join((
27- "registerPlugin(",
28- " new ClimateDataMapPlugin("+
29- json.dumps(
30- dict(
31- self.__dict__,
32- data_type_label = str(T("Data Type")),
33- projected_option_type_label = str(T("Projection Type"))
34- ),
35- indent = 4
36- )+
37- ")",
38- ")",
39- ))
40- )
41- )
42-
43- def add_html(self, html):
44- statistics_widget = FIELDSET(
45- LEGEND("Statistics"),
46- UL(
47- _style="list-style:none",
48- *(
49- LI(
50- INPUT(
51- _type="radio",
52- _name="statistics",
53- _id="id_%s" % statistic,
54- ),
55- LABEL(
56- statistic,
57- _for="id_%s" % statistic,
58- )
59- )
60- for statistic in ["Mean", "Max", "Min"]
61- )
62- )
63- )
64-
65- html.append(
66- DIV(
67- FORM(
68- _id="controller",
69- *(
70- SCRIPT(
71- _type="text/javascript",
72- *["""
73- """]
74- ),
75- climate_data_type_widget,
76- parameters_widget,
77- statistics_widget,
78- period_widget
79- )
80- )
81- )
82- )
83-
84- def get_image_overlay(self, ):
85- from gluon.contenttype import contenttype
86- response.headers["Content-Type"] = contenttype(".png")
87- # @ToDo: Should be a file in static
88- return response.stream(open("/Users/mike/Desktop/red_wave.png"))
89-
90-climate_data_map_plugin = ClimateDataMapPlugin(
91- data_type_option_names = ["Observed", "Gridded", "Projected"],
92- parameter_names = ["Rainfall", "Temperature", "Wind", "Humidity", "Sunshine"],
93- projected_option_type_names = ["RC Model", "GC Model", "Scenario"],
94+ClimateDataPortal = local_import("ClimateDataPortal")
95+
96+sample_type_names = ClimateDataPortal.sample_codes.keys()
97+variable_names = ClimateDataPortal.tables.keys()
98+
99+map_plugin = ClimateDataPortal.MapPlugin(
100+ data_type_option_names = sample_type_names,
101+ parameter_names = variable_names,
102 year_max = datetime.date.today().year,
103 year_min = 1960,
104 )
105@@ -120,16 +42,16 @@
106 print_tool = {"url": print_service}
107 else:
108 print_tool = {}
109-
110+
111 map = gis.show_map(
112+ lat = 28.5,
113+ lon = 84.1,
114+ zoom = 7,
115 toolbar = False,
116- catalogue_toolbar=catalogue_toolbar, # T/F, top tabs toolbar
117+# catalogue_toolbar=catalogue_toolbar, # T/F, top tabs toolbar
118 wms_browser = wms_browser, # dict
119- catalogue_layers=catalogue_layers, # T/F
120- mouse_position = deployment_settings.get_gis_mouse_position(),
121- print_tool = print_tool,
122 plugins = [
123- climate_data_map_plugin
124+ map_plugin
125 ]
126 )
127
128@@ -138,7 +60,148 @@
129 module_name=module_name,
130 map=map
131 )
132-
133-def climate_image_overlay():
134- return climate_data_map_plugin.get_image_overlay()
135-
136+
137+month_names = dict(
138+ January=1,
139+ February=2,
140+ March=3,
141+ April=4,
142+ May=5,
143+ June=6,
144+ July=7,
145+ August=8,
146+ September=9,
147+ October=10,
148+ November=11,
149+ December=12
150+)
151+
152+for name, number in month_names.items():
153+ month_names[name[:3]] = number
154+for name, number in month_names.items():
155+ month_names[name.upper()] = number
156+for name, number in month_names.items():
157+ month_names[name.lower()] = number
158+
159+def convert_date(default_month):
160+ def converter(year_month):
161+ components = year_month.split("-")
162+ year = int(components[0])
163+ assert 1960 <= year, "year must be >= 1960"
164+
165+ try:
166+ month_value = components[1]
167+ except IndexError:
168+ month = default_month
169+ else:
170+ try:
171+ month = int(month_value)
172+ except TypeError:
173+ month = month_names[month_value]
174+
175+ assert 1 <= month <= 12, "month must be in range 1:12"
176+ return datetime.date(year, month, 1)
177+ return converter
178+
179+def one_of(options):
180+ def validator(choice):
181+ assert choice in options, "should be one of %s, not '%s'" % (
182+ options,
183+ choice
184+ )
185+ return choice
186+ return validator
187+
188+def climate_overlay_data():
189+ kwargs = dict(request.vars)
190+ kwargs["parameter"] = kwargs["parameter"].replace("+", " ")
191+
192+ arguments = {}
193+ errors = []
194+ for kwarg_name, converter in dict(
195+ data_type = one_of(sample_type_names),
196+ statistic = one_of(("Maximum", "Minimum", "Average")),
197+ parameter = one_of(variable_names),
198+ from_date = convert_date(default_month = 1),
199+ to_date = convert_date(default_month = 12),
200+ ).iteritems():
201+ try:
202+ value = kwargs.pop(kwarg_name)
203+ except KeyError:
204+ errors.append("%s missing" % kwarg_name)
205+ else:
206+ try:
207+ arguments[kwarg_name] = converter(value)
208+ except TypeError:
209+ errors.append("%s is wrong type" % kwarg_name)
210+ except AssertionError, assertion_error:
211+ errors.append("%s: %s" % (kwarg_name, assertion_error))
212+ if kwargs:
213+ errors.append("Unexpected arguments: %s" % kwargs.keys())
214+
215+ if errors:
216+ raise HTTP(500, "<br />".join(errors))
217+ else:
218+ import gluon.contenttype
219+ data_path = map_plugin.get_overlay_data(
220+ env = Storage(globals()),
221+ **arguments
222+ )
223+ return response.stream(
224+ open(data_path,"rb"),
225+ chunk_size=4096
226+ )
227+
228+def list_of(converter):
229+ def convert_list(choices):
230+ return map(converter, choices)
231+ return convert_list
232+
233+def climate_chart():
234+ kwargs = dict(request.vars)
235+ import simplejson as JSON
236+ specs = JSON.loads(kwargs.pop("spec"))
237+
238+ checked_specs = []
239+ for spec in specs:
240+ arguments = {}
241+ errors = []
242+ for name, converter in dict(
243+ data_type = one_of(sample_type_names),
244+ parameter = one_of(variable_names),
245+ from_date = convert_date(default_month = 1),
246+ to_date = convert_date(default_month = 12),
247+ place_ids = list_of(int)
248+ ).iteritems():
249+ try:
250+ value = spec.pop(name)
251+ except KeyError:
252+ errors.append("%s missing" % name)
253+ else:
254+ try:
255+ arguments[name] = converter(value)
256+ except TypeError:
257+ errors.append("%s is wrong type" % name)
258+ except AssertionError, assertion_error:
259+ errors.append("%s: %s" % (name, assertion_error))
260+ if spec:
261+ errors.append("Unexpected arguments: %s" % spec.keys())
262+ checked_specs.append(arguments)
263+
264+ if errors:
265+ raise HTTP(500, "<br />".join(errors))
266+ else:
267+ import gluon.contenttype
268+ response.headers["Content-Type"] = gluon.contenttype.contenttype(".png")
269+ data_image_file_path = map_plugin.render_plots(
270+ env = Storage(globals()),
271+ specs = checked_specs
272+ )
273+ return response.stream(
274+ open(data_image_file_path,"rb"),
275+ chunk_size=4096
276+ )
277+
278+def chart_popup():
279+ return {}
280+
281
282=== modified file 'deployment-templates/models/000_config.py'
283--- deployment-templates/models/000_config.py 2011-08-25 09:17:02 +0000
284+++ deployment-templates/models/000_config.py 2011-09-05 13:55:13 +0000
285@@ -249,7 +249,7 @@
286 strict_hierarchy = False,
287 # Should all specific locations (e.g. addresses, waypoints) be required to
288 # link to where they are in the location hierarchy?
289- location_parent_required = False,
290+ location_parent_required = False
291 )
292 # Set this if there will be multiple areas in which work is being done,
293 # and a menu to select among them is wanted. With this on, any map
294
295=== modified file 'models/03_gis.py'
296--- models/03_gis.py 2011-08-26 08:12:56 +0000
297+++ models/03_gis.py 2011-09-05 13:55:13 +0000
298@@ -1369,6 +1369,8 @@
299 # =============================================================================
300 def gis_map_tables():
301 """ Load the GIS Map Tables when needed """
302+ if "gis_layer_bing" in db.tables:
303+ return
304
305 # -------------------------------------------------------------------------
306 # GPS Waypoints
307
308=== added file 'models/climate.py'
309--- models/climate.py 1970-01-01 00:00:00 +0000
310+++ models/climate.py 2011-09-05 13:55:13 +0000
311@@ -0,0 +1,5 @@
312+# -*- coding: utf-8 -*-
313+
314+module = "climate"
315+if deployment_settings.has_module(module):
316+ local_import("ClimateDataPortal").define_models(env = Storage(globals()))
317
318=== added directory 'modules/ClimateDataPortal'
319=== added file 'modules/ClimateDataPortal/MapPlugin.py'
320--- modules/ClimateDataPortal/MapPlugin.py 1970-01-01 00:00:00 +0000
321+++ modules/ClimateDataPortal/MapPlugin.py 2011-09-05 13:55:13 +0000
322@@ -0,0 +1,430 @@
323+
324+# notes:
325+
326+# dependencies:
327+# R
328+
329+# create folder for cache:
330+# mkdir -p /tmp/climate_data_portal/images/recent/
331+# mkdir -p /tmp/climate_data_portal/images/older/
332+
333+MAX_CACHE_FOLDER_SIZE = 2**24 # 16 MiB
334+
335+class TwoStageCache(object):
336+ def __init__(self, folder, max_size):
337+ self.folder = folder
338+ self.max_size
339+
340+ def purge(self):
341+ pass
342+
343+ def retrieve(self, file_name, generate_if_not_found):
344+ pass
345+
346+import os, errno
347+
348+def mkdir_p(path):
349+ try:
350+ os.makedirs(path)
351+ except OSError as exc: # Python >2.5
352+ if exc.errno == errno.EEXIST:
353+ pass
354+ else: raise
355+
356+def define(env, place, tables, date_to_month_number, sample_codes, exports):
357+ # This starts an R interpreter.
358+ # As we are sharing it (restarting it every time is inefficient),
359+ # we have to be somewhat careful to make sure objects are garbage collected
360+ # better to just not stick anything in R's globals
361+ try:
362+ import rpy2.robjects as robjects
363+ except ImportError:
364+ import logging
365+ logging.getLogger().error(
366+"""R is required by the climate data portal to generate charts
367+
368+To install R: refer to:
369+http://cran.r-project.org/doc/manuals/R-admin.html
370+
371+
372+rpy2 is required to interact with python.
373+
374+To install rpy2, refer to:
375+http://rpy.sourceforge.net/rpy2/doc-dev/html/overview.html
376+""")
377+ raise
378+
379+ R = robjects.r
380+
381+ from rpy2.robjects.packages import importr
382+
383+ base = importr("base")
384+
385+ from math import fsum
386+ def average(values):
387+ "Safe float average"
388+ l = len(values)
389+ if l is 0:
390+ return None
391+ else:
392+ return fsum(values)/l
393+
394+ class Maximum(object):
395+ def __init__(self, column, add_query_term):
396+ self.value_max = value_max = column.max()
397+ add_query_term(value_max)
398+
399+ def __call__(self, row):
400+ return row._extra[self.value_max]
401+
402+ class Minimum(object):
403+ def __init__(self, column, add_query_term):
404+ self.value_min = value_min = column.min()
405+ add_query_term(value_min)
406+
407+ def __call__(self, row):
408+ return row._extra[self.value_min]
409+
410+ class Average(object):
411+ def __init__(self, column, add_query_term):
412+ self.value_sum = value_sum = column.sum()
413+ self.value_count = value_count = column.count()
414+ add_query_term((
415+ value_sum,
416+ value_count
417+ ))
418+
419+ def __call__(self, row):
420+ return row._extra[self.value_sum] / row._extra[self.value_count]
421+
422+ aggregators = {
423+ "Maximum": Maximum,
424+ "Minimum": Minimum,
425+ "Average": Average
426+ }
427+
428+ def get_cached_or_generated_file(cache_file_name, generate):
429+ from os.path import join, exists
430+ from os import stat, makedirs
431+ # this needs to become a setting
432+ climate_data_image_cache_path = join(
433+ "/tmp","climate_data_portal","images"
434+ )
435+ recent_cache = join(climate_data_image_cache_path, "recent")
436+ mkdir_p(recent_cache)
437+ older_cache = join(climate_data_image_cache_path, "older")
438+ mkdir_p(older_cache)
439+ recent_cache_path = join(recent_cache, cache_file_name)
440+ if not exists(recent_cache_path):
441+ older_cache_path = join(older_cache, cache_file_name)
442+ if exists(older_cache_path):
443+ # move the older cache to the recent folder
444+ rename(older_cache_path, recent_cache_path)
445+ else:
446+ generate(recent_cache_path)
447+ file_path = recent_cache_path
448+
449+ # update the folder size file (race condition?)
450+ folder_size_file_path = join(climate_data_image_cache_path, "size")
451+ folder_size_file = open(folder_size_file_path, "w+")
452+ folder_size_file_contents = folder_size_file.read()
453+ try:
454+ folder_size = int(folder_size_file_contents)
455+ except ValueError:
456+ folder_size = 0
457+ folder_size_file.seek(0)
458+ folder_size_file.truncate()
459+ folder_size += stat(file_path).st_size
460+ if folder_size > MAX_CACHE_FOLDER_SIZE:
461+ rmdir(older_cache)
462+
463+ folder_size_file.write(str(folder_size))
464+ folder_size_file.close()
465+ else:
466+ # use the existing cached image
467+ file_path = recent_cache_path
468+ return file_path
469+
470+ class MapPlugin(object):
471+ def __init__(
472+ self,
473+ data_type_option_names,
474+ parameter_names,
475+ year_min,
476+ year_max
477+ ):
478+ self.data_type_option_names = data_type_option_names
479+ self.parameter_names = parameter_names
480+ self.year_min = year_min
481+ self.year_max = year_max
482+
483+ def extend_gis_map(self, add_javascript, add_configuration):
484+ add_javascript("scripts/S3/s3.gis.climate.js")
485+ SCRIPT = env.SCRIPT
486+ T = env.T
487+ import json
488+
489+ add_configuration(
490+ SCRIPT(
491+ "\n".join((
492+ "registerPlugin(",
493+ " new ClimateDataMapPlugin("+
494+ json.dumps(
495+ dict(
496+ data_type_option_names = self.data_type_option_names,
497+ parameter_names = self.parameter_names,
498+ year_min = self.year_min,
499+ year_max = self.year_max,
500+ overlay_data_URL = "/%s/climate/climate_overlay_data" % (
501+ env.request.application
502+ ),
503+ chart_URL = "/%s/climate/climate_chart" % (
504+ env.request.application
505+ ),
506+ data_type_label = str(T("Data Type")),
507+ projected_option_type_label = str(
508+ T("Projection Type")
509+ )
510+ ),
511+ indent = 4
512+ )+
513+ ")",
514+ ")",
515+ ))
516+ )
517+ )
518+
519+
520+ def get_overlay_data(
521+ self,
522+ env,
523+ data_type,
524+ parameter,
525+ from_date,
526+ to_date,
527+ statistic
528+ ):
529+ from_month = date_to_month_number(from_date)
530+ to_month = date_to_month_number(to_date)
531+ def generate_map_overlay_data(file_path):
532+ # generate the new file in the recent folder
533+
534+ db = env.db
535+ sample_table_name, sample_table = tables[parameter]
536+ place = db.place
537+ #sample_table = db[sample_table_name]
538+
539+ query = [
540+ place.id,
541+ place.longitude,
542+ place.latitude,
543+ ]
544+ aggregator = aggregators[statistic](
545+ sample_table.value,
546+ query.append
547+ )
548+
549+ sample_rows = db(
550+ (sample_table.time_period >= from_month) &
551+ (sample_table.time_period <= to_month) &
552+ (sample_table.sample_type == sample_codes[data_type]) &
553+ (place.id == sample_table.place_id)
554+ ).select(
555+ *query,
556+ groupby=sample_table.place_id
557+ )
558+
559+ # map positions to data
560+ # find max and min value
561+ positions = {}
562+ aggregated_values = []
563+ for row in sample_rows:
564+ place = row.place
565+ aggregated_value = aggregator(row)
566+ aggregated_values.append(aggregated_value)
567+ positions[place.id] = (
568+ place.latitude,
569+ place.longitude,
570+ aggregated_value
571+ )
572+ max_aggregated_value = max(aggregated_values)
573+ min_aggregated_value = min(aggregated_values)
574+ aggregated_range = max_aggregated_value - min_aggregated_value
574+ if aggregated_range == 0:
574+ # all values identical: avoid dividing by zero when normalising
574+ aggregated_range = 1.0
575+
576+ data_lines = []
577+ write = data_lines.append
578+ from colorsys import hsv_to_rgb
579+ for id, (lat, lon, aggregated_value) in positions.iteritems():
580+ north = lat + 0.05
581+ south = lat - 0.05
582+ east = lon + 0.05
583+ west = lon - 0.05
584+ # only hue changes
585+ # hue range is from 2/3 (blue, low) to 0 (red, high)
586+ normalised_value = 1.0-((aggregated_value - min_aggregated_value) / aggregated_range)
587+ r,g,b = hsv_to_rgb(normalised_value *(2.0/3.0), 1.0, 1.0)
588+ hex_colour = "%02x%02x%02x" % (int(r*255), int(g*255), int(b*255))
589+ write(
590+ "Vector("
591+ "Polygon(["
592+ "LinearRing(["
593+ "Point(%(north)f,%(west)f),"
594+ "Point(%(north)f,%(east)f),"
595+ "Point(%(south)f,%(east)f),"
596+ "Point(%(south)f,%(west)f)"
597+ "])"
598+ "]),"
599+ "{"
600+ "value:%(aggregated_value)f,"
601+ "id:%(id)i"
602+ "},"
603+ "{"
604+ "fillColor:'#%(hex_colour)s'"
605+ "}"
606+ ")," % locals()
607+ )
608+ overlay_data_file = open(file_path, "w")
609+ write = overlay_data_file.write
610+ write("{")
611+ if max_aggregated_value < 10:
612+ float_format = "%0.2f"
613+ elif max_aggregated_value < 100:
614+ float_format = "%0.1f"
615+ elif max_aggregated_value < 10000:
616+ float_format = "%0.0f"
617+ else:
618+ float_format = "%0.2e"
619+ write("max:%s," % float_format % max_aggregated_value)
620+ write("min:%s," % float_format % min_aggregated_value)
621+ write("features:[")
622+ write("".join(data_lines))
623+ overlay_data_file.seek(-1, 1) # the next write overwrites the trailing ","
624+ write("]}")
625+ overlay_data_file.close()
626+
627+ return get_cached_or_generated_file(
628+ "_".join((
629+ statistic,
630+ data_type,
631+ parameter,
632+ str(from_month),
633+ str(to_month),
635+ )) + ".js",
636+ generate_map_overlay_data
637+ )
638+
639+ def render_plots(
640+ self,
641+ env,
642+ specs
643+ ):
644+ def generate_chart(file_path):
645+ def render_plot(
646+ data_type,
647+ parameter,
648+ from_date,
649+ to_date,
650+ place_ids
651+ ):
652+ from_month = date_to_month_number(from_date)
653+ to_month = date_to_month_number(to_date)
654+
655+ db = env.db
656+ sample_table_name, sample_table = tables[parameter]
657+ place = db.place
658+ #sample_table = db[sample_table_name]
659+ sample_rows = db(
660+ (sample_table.time_period >= from_month) &
661+ (sample_table.time_period <= to_month) &
662+ (sample_table.sample_type == sample_codes[data_type]) &
663+ (sample_table.place_id.belongs(place_ids))
664+ ).select(
665+ sample_table.value,
666+ sample_table.time_period,
667+ )
668+
669+ # coalesce values by time_period:
670+ aggregated_values = {}
671+ for sample_row in sample_rows:
672+ time_period = sample_row.time_period
673+ value = sample_row.value
674+ try:
675+ aggregated_values[time_period]
676+ except KeyError:
677+ aggregated_values[time_period] = value
678+ else:
679+ aggregated_values[time_period] += value
680+
681+ values = []
682+ time_periods = aggregated_values.keys()
683+ time_periods.sort()
684+ for time_period in time_periods:
685+ values.append(aggregated_values[time_period])
686+ return from_date, to_date, data_type, parameter, values
687+
688+ time_serieses = []
689+ c = R("c")
690+ for spec in specs:
691+ from_date, to_date, data_type, parameter, values = render_plot(**spec)
692+ time_serieses.append(
693+ R("ts")(
694+ robjects.FloatVector(values),
695+ start = c(from_date.year, from_date.month),
696+ end = c(to_date.year, to_date.month),
697+ frequency = 12
698+ )
699+ )
700+
701+ R("png(filename = '%s', width=640, height=480)" % file_path)
702+ plot_chart = R(
703+ "function (xlab, ylab, n, ...) {"
704+ "ts.plot(...,"
705+ "gpars=list(xlab=xlab, ylab=ylab, col=c(1:n))"
706+ ")"
707+ "}"
708+ )
709+
710+ plot_chart(
711+ "Date",
712+ "Combined %s %s" % (data_type, parameter),
713+ len(time_serieses),
714+ *time_serieses
715+ )
716+ R("dev.off()")
717+
718+ import md5
719+ import gluon.contrib.simplejson as JSON
720+
721+ import datetime
722+ def serialiseDate(obj):
723+ if isinstance(
724+ obj,
725+ (
726+ datetime.date,
727+ datetime.datetime,
728+ datetime.time
729+ )
730+ ):
731+ return obj.isoformat()[:19].replace("T"," ")
732+ raise TypeError("%r is not JSON serializable" % (obj,))
733+
734+ return get_cached_or_generated_file(
735+ "_".join((
736+ md5.md5(
737+ JSON.dumps(
738+ specs,
739+ sort_keys=True,
740+ default=serialiseDate
741+ )
742+ ).hexdigest(),
744+ )) + ".png",
745+ generate_chart
746+ )
747+
748+ exports.update(
749+ MapPlugin = MapPlugin
750+ )
751+
752+ del globals()["define"]
753
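Review note on MapPlugin.py: the recent/older cache rotation in `get_cached_or_generated_file` is easier to check against a standalone sketch. The version below is a simplification under stated assumptions — its `cache_root` and `max_size` arguments stand in for the hard-coded `/tmp` path, the `mkdir_p` helper, and `MAX_CACHE_FOLDER_SIZE` — not the portal's actual code:

```python
import os
import shutil

def get_cached_or_generated_file(cache_root, file_name, generate,
                                 max_size=10 * 1024 * 1024):
    # Two-tier cache: fresh files live in "recent"; survivors of the
    # last rotation live in "older".  When total size passes the
    # threshold, the "older" tier (least recently wanted files) is dropped.
    recent = os.path.join(cache_root, "recent")
    older = os.path.join(cache_root, "older")
    for folder in (recent, older):
        os.makedirs(folder, exist_ok=True)

    recent_path = os.path.join(recent, file_name)
    older_path = os.path.join(older, file_name)
    if not os.path.exists(recent_path):
        if os.path.exists(older_path):
            # promote a still-wanted file back into "recent"
            os.rename(older_path, recent_path)
        else:
            generate(recent_path)

    # crude size accounting: drop the whole "older" tier when too big
    total = sum(
        os.path.getsize(os.path.join(folder, name))
        for folder in (recent, older)
        for name in os.listdir(folder)
    )
    if total > max_size:
        shutil.rmtree(older)
        os.makedirs(older, exist_ok=True)
    return recent_path
```

A file requested twice is generated only once; a file untouched across a rotation migrates to "older" and is either promoted back or eventually discarded.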
754=== added file 'modules/ClimateDataPortal/__init__.py'
755--- modules/ClimateDataPortal/__init__.py 1970-01-01 00:00:00 +0000
756+++ modules/ClimateDataPortal/__init__.py 2011-09-05 13:55:13 +0000
757@@ -0,0 +1,201 @@
758+
759+"""
760+ Climate Data Module
761+
762+ @author: Mike Amy
763+"""
764+
765+# datasets are stored in actual tables
766+# - e.g. rainfall_mm
767+
768+# data collection points in dataset
769+# values at a point within a time range
770+
771+# e.g. observed temperature in Kathmandu between Feb 2006 - April 2007
772+
773+
774+sample_types = dict(
775+ O = "Observed",
776+ G = "Gridded",
777+
778+ r = "Projected (RC)",
779+ g = "Projected (GC)",
780+ s = "Scenario",
781+)
782+
783+sample_codes = {}
784+
785+import re
786+for code, name in sample_types.iteritems():
787+ globals()[re.sub(r"\W", "", name)] = code
788+ sample_codes[name] = code
789+
790+
791+# Until I figure out how to sanely import things from web2py,
792+# apply a prophylactic import method...
793+def define_models(env):
794+ """
795+ Define Climate Data models.
796+ """
797+ db = env.db
798+ Field = env.Field
799+
800+ def create_index(table_name, field_name):
801+ db.executesql(
802+ """
803+ CREATE INDEX IF NOT EXISTS
804+ "index_%(table_name)s__%(field_name)s"
805+ ON "%(table_name)s" ("%(field_name)s");
806+ """ % locals()
807+ )
808+
809+ place = db.define_table(
810+ "place",
811+ Field(
812+ "longitude",
813+ "double",
814+ notnull=True,
815+ required=True,
816+ ),
817+ Field(
818+ "latitude",
819+ "double",
820+ notnull=True,
821+ required=True,
822+ )
823+ )
824+
825+ # not all places are stations with elevations
826+ # as in the case of "gridded" data
827+ # a station can only be in one place
828+ observation_station = db.define_table(
829+ "observation_station",
830+ Field(
831+ "id",
832+ "id", # must be a place,
833+ notnull=True,
834+ required=True,
835+ ),
836+ Field(
837+ "name",
838+ "string",
839+ notnull=True,
840+ unique=False,
841+ required=True,
842+ ),
843+ Field(
844+ "elevation_metres",
845+ "integer"
846+ )
847+ )
848+
849+ def sample_table(name, value_type):
850+ table = db.define_table(
851+ name,
852+ Field(
853+ "sample_type",
854+ "string",
855+ length = 1,
856+ notnull=True,
857+ # necessary as web2py requires a default value even for
858+ # not null fields
859+ default="-1",
860+ required=True
861+ ),
862+ Field(
863+ "time_period",
864+ "integer",
865+ notnull=True,
866+ default=-1000,
867+ required=True
868+ ),
869+ Field(
870+ # this should become a GIS field
871+ "place_id",
872+ place,
873+ notnull=True,
874+ required=True
875+ ),
876+ Field(
877+ "value",
878+ value_type,
879+ notnull = True,
880+ required=True,
881+ ),
882+ )
883+
884+ create_index(name, "id")
885+ create_index(name, "sample_type")
886+ create_index(name, "time_period")
887+ create_index(name, "place_id")
888+
889+ return table
890+
891+ rainfall_mm = sample_table("climate_rainfall_mm", "double")
892+ min_temperature_celsius = sample_table("climate_min_temperature_celsius", "double")
893+ max_temperature_celsius = sample_table("climate_max_temperature_celsius", "double")
894+
895+ tables = {
896+ "Rainfall mm": ("climate_rainfall_mm", rainfall_mm),
897+ "Max Temperature C": ("climate_max_temperature_celsius", max_temperature_celsius),
898+ "Min Temperature C": ("climate_min_temperature_celsius", min_temperature_celsius),
899+ }
900+
901+ def year_month_to_month_number(year, month):
902+ """Time periods are integers representing months in years,
903+ from 1960 onwards.
904+
905+ e.g. 0 = Jan 1960, 1 = Feb 1960, 12 = Jan 1961
906+
907+ This function converts a year and month to a month number.
908+ """
909+ return ((year-1960) * 12) + (month-1)
910+
911+ def date_to_month_number(date):
912+ """This function converts a date to a month number.
913+
914+ See also year_month_to_month_number(year, month)
915+ """
916+ return year_month_to_month_number(date.year, date.month)
917+
918+# def month_number_to_date(month_number):
919+# ret
920+
921+ from .MapPlugin import define
922+ define(
923+ env,
924+ place,
925+ tables,
926+ date_to_month_number,
927+ sample_codes,
928+ globals()
929+ )
930+
931+ # exports:
932+ globals().update(
933+ sample_types = sample_types,
934+
935+ place = place,
936+ observation_station = observation_station,
937+
938+ tables = tables,
939+
940+ rainfall_mm = rainfall_mm,
941+ max_temperature_celsius = max_temperature_celsius,
942+ min_temperature_celsius = min_temperature_celsius,
943+
944+ date_to_month_number = date_to_month_number,
945+ year_month_to_month_number = year_month_to_month_number,
946+ )
947+
948+ def redefine_models(env):
949+ # avoid risking insidious aliasing bugs
950+ # by not defining things more than once
951+ env.db.update(
952+ climate_rainfall_mm = rainfall_mm,
953+ climate_max_temperature_celsius = max_temperature_celsius,
954+ climate_min_temperature_celsius = min_temperature_celsius,
955+ place = place,
956+ observation_station = observation_station,
957+ )
958+ globals()["define_models"] = redefine_models
959
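The month-number encoding used throughout `__init__.py` is small enough to verify in isolation. `year_month_to_month_number` is copied from the module; the inverse is hypothetical, since the module leaves `month_number_to_date` as a commented-out stub:

```python
def year_month_to_month_number(year, month):
    # months elapsed since January 1960:
    # 0 = Jan 1960, 1 = Feb 1960, 12 = Jan 1961
    return ((year - 1960) * 12) + (month - 1)

def month_number_to_year_month(month_number):
    # hypothetical inverse, not part of the module itself
    years_elapsed, month_index = divmod(month_number, 12)
    return 1960 + years_elapsed, month_index + 1
```

The two functions round-trip, so a `time_period` column value can always be mapped back to a calendar month.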
960=== added file 'modules/ClimateDataPortal/import_NetCDF_readings.py'
961--- modules/ClimateDataPortal/import_NetCDF_readings.py 1970-01-01 00:00:00 +0000
962+++ modules/ClimateDataPortal/import_NetCDF_readings.py 2011-09-05 13:55:13 +0000
963@@ -0,0 +1,131 @@
964+
965+ClimateDataPortal = local_import("ClimateDataPortal")
966+
967+
968+def get_or_create(dict, key, creator):
969+ try:
970+ value = dict[key]
971+ except KeyError:
972+ value = dict[key] = creator()
973+ return value
974+
975+def get_or_create_record(table, query):
976+ query_terms = []
977+ for key, value in query.iteritems():
978+ query_terms.append(getattr(table, key) == value)
979+ reduced_query = reduce(
980+ (lambda left, right: left & right),
981+ query_terms
982+ )
983+ records = db(reduced_query).select()
984+ count = len(records)
985+ assert count <= 1, "Multiple records for %s" % query
986+ if count == 0:
987+ # insert() returns the new record's id directly, not a row
987+ record_id = table.insert(**query)
988+ db.commit()
989+ else:
990+ record_id = records.first().id
991+ return record_id
992+
993+ def nearly(expected_float, actual_float):
994+ # use a symmetric relative tolerance: the naive form
994+ # (expected * 0.999) < actual < (expected * 1.001)
994+ # produces an empty range when expected_float is negative
994+ return abs(actual_float - expected_float) <= abs(expected_float) * 0.001
995+
996+def add_reading_if_none(
997+ database_table,
998+ sample_type,
999+ time_period,
1000+ place_id,
1001+ value
1002+):
1003+ records = db(
1004+ (database_table.sample_type == sample_type) &
1005+ (database_table.time_period == time_period) &
1006+ (database_table.place_id == place_id)
1007+ ).select(database_table.value, database_table.id)
1008+ count = len(records)
1009+ assert count <= 1
1010+ if count == 0:
1011+ database_table.insert(
1012+ sample_type = sample_type,
1013+ time_period = time_period,
1014+ place_id = place_id,
1015+ value = value
1016+ )
1017+ else:
1018+ existing = records.first()
1019+ assert nearly(existing.value, value), (existing.value, value, place_id)
1020+
1021+
1022+
1023+import datetime
1024+
1025+def import_climate_readings(
1026+ netcdf_file,
1027+ database_table,
1028+ add_reading,
1029+ start_time = datetime.date(1971,1,1),
1030+ # the float32 sentinel -99.9 read from NetCDF lands just below the
1030+ # float64 literal -99.9, hence the asymmetric window
1030+ is_undefined = lambda x: -99.900003 < x < -99.9
1031+):
1032+ """
1033+ Assumptions:
1034+ * there are no places
1035+ * the data is in order of places
1036+ """
1037+ variables = netcdf_file.variables
1038+
1039+ # create grid of places
1040+ place_ids = {}
1041+
1042+ def to_list(variable):
1043+ result = []
1044+ for i in range(len(variable)):
1045+ result.append(variable[i])
1046+ return result
1047+
1048+ def iter_pairs(list):
1049+ for index in range(len(list)):
1050+ yield index, list[index]
1051+
1052+ times = to_list(variables["time"])
1053+ lat = to_list(variables["lat"])
1054+ lon = to_list(variables["lon"])
1055+ for latitude in lat:
1056+ for longitude in lon:
1057+ record = get_or_create_record(
1058+ ClimateDataPortal.place,
1059+ dict(
1060+ longitude = longitude,
1061+ latitude = latitude
1062+ )
1063+ )
1064+ place_ids[(latitude, longitude)] = record
1065+ #print longitude, latitude, record
1066+
1067+ tt = variables["tt"]
1068+ print "up to:", len(times)
1069+ for time_index, time in iter_pairs(times):
1070+ print time_index
1071+ time_period = start_time+datetime.timedelta(hours=time)
1072+ for latitude_index, latitude in iter_pairs(lat):
1073+ for longitude_index, longitude in iter_pairs(lon):
1074+ value = tt[time_index][latitude_index][longitude_index]
1075+ if not is_undefined(value):
1076+ add_reading(
1077+ database_table = database_table,
1078+ sample_type = ClimateDataPortal.Gridded,
1079+ time_period = ClimateDataPortal.date_to_month_number(time_period),
1080+ place_id = place_ids[(latitude, longitude)],
1081+ value = value
1082+ )
1083+ db.commit()
1084+
1085+import sys
1086+
1087+from Scientific.IO import NetCDF
1088+
1089+file_name = sys.argv[1]
1090+import_climate_readings(
1091+ NetCDF.NetCDFFile(file_name),
1092+ ClimateDataPortal.min_temperature_celsius,
1093+ add_reading_if_none
1094+)
1095
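The NetCDF import above turns the file's `time` axis (hours since the dataset epoch) into the portal's month numbers via `start_time + timedelta(hours=time)` plus `date_to_month_number`. A minimal sketch of that conversion, assuming the script's default epoch of 1971-01-01:

```python
import datetime

# epoch assumed from the import script's default start_time argument
EPOCH = datetime.datetime(1971, 1, 1)

def hours_to_month_number(hours):
    # the NetCDF "time" variable counts hours from the dataset epoch;
    # month numbers count months from January 1960, as elsewhere in the portal
    when = EPOCH + datetime.timedelta(hours=hours)
    return (when.year - 1960) * 12 + (when.month - 1)
```

Hour 0 therefore maps to January 1971, which is month number 132 in the 1960-based scheme.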
1096=== added file 'modules/ClimateDataPortal/import_stations.py'
1097--- modules/ClimateDataPortal/import_stations.py 1970-01-01 00:00:00 +0000
1098+++ modules/ClimateDataPortal/import_stations.py 2011-09-05 13:55:13 +0000
1099@@ -0,0 +1,53 @@
1100+
1101+ClimateDataPortal = local_import("ClimateDataPortal")
1102+
1103+from decimal import Decimal
1104+
1105+def import_stations(file_name):
1106+ """
1107+ Expects a file containing lines of the form e.g.:
1108+226 JALESORE 1122 172 26.65 85.78
1109+275 PHIDIM (PANCHTH 1419 1205 27.15 87.75
1110+unused Station name <-id <-elev <-lat <-lon
1111+0123456789012345678901234567890123456789012345678901234567890123456789
1112+0 1 2 3 4 5 6
1113+ """
1114+ place = ClimateDataPortal.place
1115+ observation_station = ClimateDataPortal.observation_station
1116+ observation_station.truncate()
1117+ place.truncate()
1118+ db.commit()
1119+
1120+ for line in open(file_name, "r").readlines():
1121+ try:
1122+ place_id_text = line[27:33]
1123+ except IndexError:
1124+ continue
1125+ else:
1126+ try:
1127+ place_id = int(place_id_text)
1128+ except ValueError:
1129+ continue
1130+ else:
1131+ station_name = line[8:25].strip() # don't restrict if they add more
1132+ elevation_metres = int(line[37:43])
1133+
1134+ latitude = Decimal(line[47:53])
1135+ longitude = Decimal(line[57:63])
1136+
1137+ assert place.insert(
1138+ id = place_id,
1139+ longitude = longitude,
1140+ latitude = latitude
1141+ ) == place_id
1142+
1143+ station_id = observation_station.insert(
1144+ id = place_id,
1145+ name = station_name,
1146+ elevation_metres = elevation_metres
1147+ )
1148+ print place_id, station_name, latitude, longitude, elevation_metres
1149+ db.commit()
1150+
1151+import sys
1152+import_stations(sys.argv[1])
1153
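`import_stations.py` slices each fixed-width line by the column offsets documented in its docstring. A small parser isolating those offsets may make them easier to audit (the sample line in the test is padded by hand to match the documented columns; real files may differ):

```python
from decimal import Decimal

def parse_station_line(line):
    # column offsets as used by import_stations.py:
    # name 8:25, id 27:33, elevation 37:43, latitude 47:53, longitude 57:63
    return {
        "place_id": int(line[27:33]),
        "name": line[8:25].strip(),
        "elevation_metres": int(line[37:43]),
        "latitude": Decimal(line[47:53].strip()),
        "longitude": Decimal(line[57:63].strip()),
    }
```

Keeping the offsets in one place makes it obvious when a slice bound (such as the `line[57:623]` longitude slice) drifts out of line with the ruler in the docstring.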
1154=== added file 'modules/ClimateDataPortal/import_tabbed_readings.py'
1155--- modules/ClimateDataPortal/import_tabbed_readings.py 1970-01-01 00:00:00 +0000
1156+++ modules/ClimateDataPortal/import_tabbed_readings.py 2011-09-05 13:55:13 +0000
1157@@ -0,0 +1,154 @@
1158+
1159+ClimateDataPortal = local_import("ClimateDataPortal")
1160+
1161+from decimal import Decimal
1162+
1163+
1164+def get_or_create(dict, key, creator):
1165+ try:
1166+ value = dict[key]
1167+ except KeyError:
1168+ value = dict[key] = creator()
1169+ return value
1170+
1171+import os
1172+
1173+class Readings(object):
1174+ def __init__(
1175+ self,
1176+ database_table,
1177+ null_value,
1178+ maximum = None,
1179+ minimum = None
1180+ ):
1181+ self.database_table = database_table
1182+ db(database_table.sample_type == ClimateDataPortal.Observed).delete()
1183+ self.null_value = null_value
1184+ self.maximum = maximum
1185+ self.minimum = minimum
1186+
1187+ self.aggregated_values = {}
1188+
1189+ def add_reading(self, time_period, reading, out_of_range):
1190+ if reading != self.null_value:
1191+ if (
1192+ (self.minimum is not None and reading < self.minimum) or
1193+ (self.maximum is not None and reading > self.maximum)
1194+ ):
1195+ out_of_range(reading)
1196+ else:
1197+ readings = get_or_create(
1198+ self.aggregated_values,
1199+ time_period,
1200+ list
1201+ )
1202+ readings.append(reading)
1203+
1204+ def done(self, place_id):
1205+ for month_number, values in self.aggregated_values.iteritems():
1206+ self.database_table.insert(
1207+ sample_type = ClimateDataPortal.Observed,
1208+ time_period = month_number,
1209+ place_id = place_id,
1210+ value = sum(values) / len(values)
1211+ )
1212+
1213+import datetime
1214+
1215+
1216+def import_tabbed_readings(
1217+ folder_name,
1218+ variables = [],
1219+ place_ids = None
1220+):
1221+ """
1222+ Expects a folder containing files with name rtXXXX.txt
1223+
1224+ Each file contains lines of the form:
1225+1978\t1\t1\t0\t-99.9\t-99.9
1226+
1227+representing year, month, day, rainfall (mm), and minimum and maximum temperatures
1228+ """
1229+ observation_station = ClimateDataPortal.observation_station
1230+
1231+ null_value = Decimal("-99.9") # appears to be the sentinel for missing readings
1232+
1233+ for row in db(observation_station).select(observation_station.id):
1234+ place_id = row.id
1235+ if place_ids is not None:
1236+ # restrict to a range of place ids ("start:end"), to allow
1236+ # importing particular places
1237+ start_place, end_place = map(int, place_ids.split(":"))
1238+ assert start_place <= end_place
1239+ if place_id < start_place or place_id > end_place:
1240+ continue
1241+ print place_id
1242+
1243+ data_file_path = os.path.join(folder_name, "rt%04i.txt" % place_id)
1244+ if not os.path.exists(data_file_path):
1245+ print "%s not found" % data_file_path
1246+ else:
1247+ try:
1248+ for line in open(data_file_path, "r").readlines():
1249+ if line:
1250+ data = line.split()
1251+ if data:
1252+ try:
1253+ year = int(data[0])
1254+ month = int(data[1])
1255+ day = int(data[2])
1256+
1257+ time_period = ClimateDataPortal.year_month_to_month_number(year, month)
1258+
1259+ for variable, reading_data in zip(
1260+ variables,
1261+ data[3:6]
1262+ ):
1263+ def out_of_range(reading):
1264+ print "%s/%s/%s: %s out of range" % (
1265+ day, month, year, reading
1266+ )
1267+ reading = Decimal(reading_data)
1268+ variable.add_reading(
1269+ time_period,
1270+ reading,
1271+ out_of_range = out_of_range
1272+ )
1273+
1274+ except Exception, exception:
1275+ print exception
1276+ for variable in variables:
1277+ variable.done(place_id)
1278+ except:
1279+ print line
1280+ raise
1281+
1282+ db.commit()
1283+ else:
1284+ print "No stations!"
1285+
1286+import sys
1287+
1288+null_value = Decimal("-99.9")
1289+import_tabbed_readings(
1290+ folder_name = sys.argv[1],
1291+ variables = [
1292+ Readings(
1293+ ClimateDataPortal.rainfall_mm,
1294+ null_value = null_value,
1295+ minimum = 0,
1296+ ),
1297+ Readings(
1298+ database_table = ClimateDataPortal.min_temperature_celsius,
1299+ null_value = null_value,
1300+ minimum = -120,
1301+ maximum = 55
1302+ ),
1303+ Readings(
1304+ database_table = ClimateDataPortal.max_temperature_celsius,
1305+ null_value = null_value,
1306+ minimum = -120,
1307+ maximum = 55
1308+ ),
1309+ ],
1310+ # place_ids is expected as a single "start:end" string, not a list
1310+ place_ids = sys.argv[2] if len(sys.argv) > 2 else None
1311+)
1312
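`Readings.done` stores the mean of each month's accumulated daily values. Stripped of the database access, the aggregation reduces to something like:

```python
def monthly_means(daily_readings):
    # daily_readings: iterable of (month_number, value) pairs,
    # as produced by Readings.add_reading after null/range checks
    totals = {}
    for month_number, value in daily_readings:
        count_and_sum = totals.setdefault(month_number, [0, 0.0])
        count_and_sum[0] += 1
        count_and_sum[1] += value
    return {
        month_number: value_sum / count
        for month_number, (count, value_sum) in totals.items()
    }
```

Tracking a running count and sum per month avoids holding every daily reading in memory, unlike the list-per-month approach in `Readings.aggregated_values`.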
1313=== modified file 'modules/s3/s3gis.py'
1314--- modules/s3/s3gis.py 2011-08-30 06:22:20 +0000
1315+++ modules/s3/s3gis.py 2011-09-05 13:55:13 +0000
1316@@ -74,16 +74,11 @@
1317 Provide an easy, safe, systematic way of handling Debug output
1318 (print to stdout doesn't work with WSGI deployments)
1319 """
1320- try:
1321- output = "S3 Debug: %s" % str(message)
1322- if value:
1323- output += ": %s" % str(value)
1324- except:
1325- output = "S3 Debug: %s" % unicode(message)
1326- if value:
1327- output += ": %s" % unicode(value)
1328-
1329- print >> sys.stderr, output
1330+ # should be using python's built-in logging module
1331+ output = u"S3 Debug: %s" % unicode(message)
1332+ if value:
1333+ output += u": %s" % unicode(value)
1334+ sys.stderr.write(output + "\n")
1335
1336 SHAPELY = False
1337 try:
1338@@ -240,7 +235,6 @@
1339 """
1340
1341 def __init__(self):
1342-
1343 self.deployment_settings = current.deployment_settings
1344 self.public_url = current.deployment_settings.get_base_public_url()
1345 if not current.db is not None:
1346@@ -308,6 +302,18 @@
1347 else:
1348 return wkt
1349
1350+ def debug(self, message, value=None):
1351+ # should be using python's built-in logging module
1352+ session = current.session
1353+ if session.s3.debug:
1354+ raise Exception(message)
1355+ else:
1356+ output = u"S3 Debug: %s" % unicode(message)
1357+ if value:
1358+ output += u": %s" % unicode(value)
1359+ sys.stderr.write(output + "\n")
1360+ session.error = current.T(message)
1361+
1362 # -------------------------------------------------------------------------
1363 def download_kml(self, record_id, filename):
1364 """
1365@@ -329,11 +335,11 @@
1366 db = current.db
1367
1368 layer = KMLLayer(self)
1369+
1370 query = (layer.table.id == record_id)
1371 record = db(query).select(limitby=(0, 1)).first()
1372 url = record.url
1373
1374- layer.add_record(record)
1375 cachepath = layer.cachepath
1376 filepath = os.path.join(cachepath, filename)
1377
1378@@ -430,7 +436,7 @@
1379 myfile = zipfile.ZipFile(fp)
1380 try:
1381 file = myfile.read("doc.kml")
1382- except:
1383+ except: # Naked except!!
1384 file = myfile.read(myfile.infolist()[0].filename)
1385 myfile.close()
1386
1387@@ -534,7 +540,7 @@
1388 try:
1389 lon = features[0].lon
1390 simple = True
1391- except:
1392+ except AttributeError:
1393 simple = False
1394
1395 for feature in features:
1396@@ -547,7 +553,7 @@
1397 # A Join
1398 lon = feature.gis_location.lon
1399 lat = feature.gis_location.lat
1400- except:
1401+ except AttributeError:
1402 # Skip any rows without the necessary lat/lon fields
1403 continue
1404
1405@@ -988,7 +994,8 @@
1406 _marker = db.gis_marker
1407 _projection = db.gis_projection
1408 have_tables = _config and _projection
1409- except:
1410+ except Exception, exception:
1411+ self.debug(exception)
1412 have_tables = False
1413
1414 row = None
1415@@ -1005,16 +1012,13 @@
1416 if not row:
1417 if auth.is_logged_in():
1418 # Read personalised config, if available.
1419- try:
1420- query = (db.pr_person.uuid == auth.user.person_uuid) & \
1421- (_config.pe_id == db.pr_person.pe_id) & \
1422- (_marker.id == _config.marker_id) & \
1423- (_projection.id == _config.projection_id)
1424- row = db(query).select(limitby=(0, 1)).first()
1425- if row:
1426- config_id = row["gis_config"].id
1427- except:
1428- pass
1429+ query = (db.pr_person.uuid == auth.user.person_uuid) & \
1430+ (_config.pe_id == db.pr_person.pe_id) & \
1431+ (_marker.id == _config.marker_id) & \
1432+ (_projection.id == _config.projection_id)
1433+ row = db(query).select(limitby=(0, 1)).first()
1434+ if row:
1435+ config_id = row["gis_config"].id
1436 if not row:
1437 # No personal config or not logged in. Use site default.
1438 config_id = 1
1439@@ -1151,7 +1155,7 @@
1440 if level:
1441 try:
1442 return location_hierarchy[level]
1443- except:
1444+ except KeyError:
1445 return level
1446 else:
1447 return location_hierarchy
1448@@ -1200,7 +1204,8 @@
1449 if level:
1450 try:
1451 return all_levels[level]
1452- except:
1453+ except Exception:
1455 return level
1456 else:
1457 return all_levels
1458@@ -1475,7 +1480,7 @@
1459 represent = db(table.id == value).select(table.name,
1460 cache=cache,
1461 limitby=(0, 1)).first().name
1462- except:
1463+ except: # @ToDo: provide specific exception
1464 # Keep the default from earlier
1465 pass
1466
1467@@ -1515,24 +1520,21 @@
1468 lat_max = location.lat_max
1469
1470 else:
1471- s3_debug("Location searched within isn't a Polygon!")
1472- session.error = T("Location searched within isn't a Polygon!")
1473+ self.debug("Location searched within isn't a Polygon!")
1474 return None
1475- except:
1476+ except: # @ToDo: need specific exception
1477 wkt = location
1478 if (wkt.startswith("POLYGON") or wkt.startswith("MULTIPOLYGON")):
1479 # ok
1480 lon_min = None
1481 else:
1482- s3_debug("This isn't a Polygon!")
1483- session.error = T("This isn't a Polygon!")
1484+ self.debug("This isn't a Polygon!")
1485 return None
1486
1487 try:
1488 polygon = wkt_loads(wkt)
1489- except:
1490- s3_debug("Invalid Polygon!")
1491- session.error = T("Invalid Polygon!")
1492+ except: # @ToDo: need specific exception
1493+ self.debug("Invalid Polygon!")
1494 return None
1495
1496 table = db[tablename]
1497@@ -1540,8 +1542,7 @@
1498
1499 if "location_id" not in table.fields():
1500 # @ToDo: Add any special cases to be able to find the linked location
1501- s3_debug("This table doesn't have a location_id!")
1502- session.error = T("This table doesn't have a location_id!")
1503+ self.debug("This table doesn't have a location_id!")
1504 return None
1505
1506 query = (table.location_id == locations.id)
1507@@ -1573,7 +1574,10 @@
1508 # Save Record
1509 output.records.append(row)
1510 except shapely.geos.ReadingError:
1511- s3_debug("Error reading wkt of location with id", row.id)
1512+ self.debug(
1513+ "Error reading wkt of location with id",
1514+ value=row.id
1515+ )
1516 else:
1517 # 1st check for Features included within the bbox (faster)
1518 def in_bbox(row):
1519@@ -1599,7 +1603,10 @@
1520 # Save Record
1521 output.records.append(row)
1522 except shapely.geos.ReadingError:
1523- s3_debug("Error reading wkt of location with id", row.id)
1524+ self.debug(
1525+ "Error reading wkt of location with id",
1526+ value = row.id,
1527+ )
1528
1529 return output
1530
1531@@ -2075,38 +2082,41 @@
1532 current_row += 1
1533 try:
1534 name0 = row.pop("ADM0_NAME")
1535- except:
1536+ except KeyError:
1537 name0 = ""
1538 try:
1539 name1 = row.pop("ADM1_NAME")
1540- except:
1541+ except KeyError:
1542 name1 = ""
1543 try:
1544 name2 = row.pop("ADM2_NAME")
1545- except:
1546+ except KeyError:
1547 name2 = ""
1548 try:
1549 name3 = row.pop("ADM3_NAME")
1550- except:
1551+ except KeyError:
1552 name3 = ""
1553 try:
1554 name4 = row.pop("ADM4_NAME")
1555- except:
1556+ except KeyError:
1557 name4 = ""
1558 try:
1559 name5 = row.pop("ADM5_NAME")
1560- except:
1561+ except KeyError:
1562 name5 = ""
1563
1564 if not name5 and not name4 and not name3 and \
1565 not name2 and not name1:
1566 # We need a name! (L0's are already in DB)
1567- s3_debug("No name provided", current_row)
1568+ self.debug(
1569+ "No name provided",
1570+ current_row,
1571+ )
1572 continue
1573
1574 try:
1575 wkt = row.pop("WKT")
1576- except:
1577+ except KeyError:
1578 wkt = None
1579 try:
1580 lat = row.pop("LAT")
1581@@ -2115,21 +2125,17 @@
1582 lat = None
1583 lon = None
1584
1585- if domain:
1586- try:
1587- uuid = "%s/%s" % (domain,
1588- row.pop("UUID"))
1589- except:
1590- uuid = ""
1591+ try:
1592+ uuid = row.pop("UUID")
1593+ except KeyError:
1594+ uuid = ""
1595 else:
1596- try:
1597- uuid = row.pop("UUID")
1598- except:
1599- uuid = ""
1600+ if domain:
1601+ uuid = "%s/%s" % (domain, uuid)
1602
1603 try:
1604 code = row.pop("CODE")
1605- except:
1606+ except KeyError:
1607 code = ""
1608
1609 population = ""
1610@@ -2175,7 +2181,7 @@
1611 # Calculate Centroid & Bounds
1612 if wkt:
1613 try:
1614- # Valid WKT
1615+ # Valid WKT
1616 shape = wkt_loads(wkt)
1617 centroid_point = shape.centroid
1618 lon = centroid_point.x
1619@@ -2189,8 +2195,8 @@
1620 feature_type = 1 # Point
1621 else:
1622 feature_type = 3 # Polygon
1623- except:
1624- s3_debug("Invalid WKT", name)
1625+ except: # @ToDo: provide specific exception
1626+ self.debug("Invalid WKT", name)
1627 continue
1628 else:
1629 lon_min = lon_max = lon
1630@@ -2234,8 +2240,8 @@
1631 else:
1632 path += "%s/" % _parent.id
1633 else:
1634- s3_debug("Location", name)
1635- s3_debug("Parent cannot be found", parent)
1636+ self.debug("Location", name)
1637+ self.debug("Parent cannot be found", parent)
1638 parent = ""
1639
1640 # Check for duplicates
1641@@ -2245,8 +2251,8 @@
1642 duplicate = db(query).select(table.id, limitby=(0, 1)).first()
1643
1644 if duplicate:
1645- s3_debug("Location", name)
1646- s3_debug("Duplicate - updating...")
1647+ self.debug("Location", name)
1648+ self.debug("Duplicate - updating...")
1649 path += str(duplicate.id)
1650 # Update with any new information
1651 query = (table.id == duplicate.id)
1652@@ -2329,7 +2335,7 @@
1653 else:
1654 cached = False
1655 if not os.access(cachepath, os.W_OK):
1656- s3_debug("Folder not writable", cachepath)
1657+ self.debug("Folder not writable", cachepath)
1658 return
1659
1660 if not cached:
1661@@ -2338,11 +2344,11 @@
1662 f = fetch(url)
1663 except (urllib2.URLError,):
1664 e = sys.exc_info()[1]
1665- s3_debug("URL Error", e)
1666+ self.debug("URL Error", e)
1667 return
1668 except (urllib2.HTTPError,):
1669 e = sys.exc_info()[1]
1670- s3_debug("HTTP Error", e)
1671+ self.debug("HTTP Error", e)
1672 return
1673
1674 # Unzip File
1675@@ -2355,8 +2361,8 @@
1676 # For now, 2.5 users need to download/unzip manually to cache folder
1677 myfile.extract(filename, cachepath)
1678 myfile.close()
1679- except:
1680- s3_debug("Zipfile contents don't seem correct!")
1681+ except IOError:
1682+ self.debug("Zipfile contents don't seem correct!")
1683 myfile.close()
1684 return
1685
1686@@ -2472,7 +2478,7 @@
1687 # Should be just a single parent
1688 break
1689 except shapely.geos.ReadingError:
1690- s3_debug("Error reading wkt of location with id", row.id)
1691+ self.debug("Error reading wkt of location with id", row.id)
1692
1693 # Add entry to database
1694 table.insert(uuid=uuid,
1695@@ -2492,7 +2498,7 @@
1696 else:
1697 continue
1698
1699- s3_debug("All done!")
1700+ self.debug("All done!")
1701 return
1702
1703 # -------------------------------------------------------------------------
1704@@ -2735,7 +2741,7 @@
1705
1706 db = current.db
1707 in_bbox = self.query_features_by_bbox(*shape.bounds)
1708- has_wkt = (db.gis_location.wkt != None) & (db.gis_location.wkt != '')
1709+ has_wkt = (db.gis_location.wkt != None) & (db.gis_location.wkt != "")
1710
1711 for loc in db(in_bbox & has_wkt).select():
1712 try:
1713@@ -2743,7 +2749,7 @@
1714 if location_shape.intersects(shape):
1715 yield loc
1716 except shapely.geos.ReadingError:
1717- s3_debug("Error reading wkt of location with id", loc.id)
1718+ self.debug("Error reading wkt of location with id", loc.id)
1719
1720 # -------------------------------------------------------------------------
1721 def _get_features_by_latlon(self, lat, lon):
1722@@ -2799,7 +2805,7 @@
1723 try :
1724 shape = wkt_loads(location.wkt)
1725 except:
1726- s3_debug("Error reading WKT", location.wkt)
1727+ self.debug("Error reading WKT", location.wkt)
1728 continue
1729 bounds = shape.bounds
1730 table[location.id] = dict(
1731@@ -2950,7 +2956,12 @@
1732 map_width = width
1733 else:
1734 map_width = config.map_width
1735- if bbox and (-90 < bbox["max_lat"] < 90) and (-90 < bbox["min_lat"] < 90) and (-180 < bbox["max_lon"] < 180) and (-180 < bbox["min_lon"] < 180):
1736+ if (bbox
1737+ and (-90 < bbox["max_lat"] < 90)
1738+ and (-90 < bbox["min_lat"] < 90)
1739+ and (-180 < bbox["max_lon"] < 180)
1740+ and (-180 < bbox["min_lon"] < 180)
1741+ ):
1742 # We have sane Bounds provided, so we should use them
1743 pass
1744 else:
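The reformatted bounds check above only accepts caller-supplied bounding boxes whose edges all lie strictly inside valid latitude/longitude ranges; anything else falls back to the configured bounds. A minimal self-contained sketch of that predicate (the function name is illustrative, not part of the patch):

```python
def sane_bbox(bbox):
    """Accept a bbox dict only when every edge lies strictly inside
    valid lat/lon ranges, mirroring the check in show_map()."""
    return bool(
        bbox
        and (-90 < bbox["max_lat"] < 90)
        and (-90 < bbox["min_lat"] < 90)
        and (-180 < bbox["max_lon"] < 180)
        and (-180 < bbox["min_lon"] < 180)
    )

# A Nepal-ish bbox passes; a missing or out-of-range one is rejected.
ok = sane_bbox({"min_lat": 26.3, "max_lat": 30.4,
                "min_lon": 80.0, "max_lon": 88.2})
bad = sane_bbox({"min_lat": -91, "max_lat": 30.4,
                 "min_lon": 80.0, "max_lon": 88.2})
```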
1745@@ -2974,15 +2985,21 @@
1746 projection = config.epsg
1747
1748
1749- if projection != 900913 and projection != 4326:
1750+ if projection not in (900913, 4326):
1751 # Test for Valid Projection file in Proj4JS library
1752- projpath = os.path.join(request.folder, "static", "scripts", "gis", "proj4js", "lib", "defs", "EPSG%s.js" % projection)
1753+ projpath = os.path.join(
1754+ request.folder, "static", "scripts", "gis", "proj4js",
1755+ "lib", "defs", "EPSG%s.js" % projection
1756+ )
1757 try:
1758 f = open(projpath, "r")
1759 f.close()
1760 except:
1761- session.error = "%s /static/scripts/gis/proj4js/lib/defs" % T("Projection not supported - please add definition to")
1762- redirect(URL(c="gis", f="projection"))
1763+ session.error = "'%s' %s /static/scripts/gis/proj4js/lib/defs" % (
1764+ projection,
1765+ T("Projection not supported - please add definition to")
1766+ )
1767+ redirect(URL(r=request, c="gis", f="projection"))
1768
1769 units = config.units
1770 maxResolution = config.maxResolution
1771@@ -3055,11 +3072,12 @@
1772 #########
1773 # Scripts
1774 #########
1775+
1776 def add_javascript(script):
1777 if type(script) == SCRIPT:
1778 html.append(script)
1779 elif script.startswith("http"):
1780- html.append(
1781+ html.append(
1782 SCRIPT(_type="text/javascript",
1783 _src=script))
1784 else:
1785@@ -3069,7 +3087,7 @@
1786
1787 debug = session.s3.debug
1788 if debug:
1789- if projection != 900913 and projection != 4326:
1790+ if projection not in (900913, 4326):
1791 add_javascript("scripts/gis/proj4js/lib/proj4js-combined.js")
1792 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)
1793
1794@@ -3082,7 +3100,7 @@
1795 add_javascript("scripts/gis/usng2.js")
1796 add_javascript("scripts/gis/MP.js")
1797 else:
1798- if projection != 900913 and projection != 4326:
1799+ if projection not in (900913, 4326):
1800 add_javascript("scripts/gis/proj4js/lib/proj4js-compressed.js")
1801 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)
1802 add_javascript("scripts/gis/OpenLayers.js")
1803@@ -3184,20 +3202,20 @@
1804 # If we do come back to it, then it should be moved to static
1805 if print_tool:
1806 url = print_tool["url"]
1807- url+'' # check url can be concatenated with strings
1808+ url+"" # check url can be concatenated with strings
1809 if "title" in print_tool:
1810- mapTitle = str(print_tool["mapTitle"])
1811+ mapTitle = unicode(print_tool["mapTitle"])
1812 else:
1813- mapTitle = str(T("Map from Sahana Eden"))
1814+ mapTitle = unicode(T("Map from Sahana Eden"))
1815 if "subtitle" in print_tool:
1816- subTitle = str(print_tool["subTitle"])
1817+ subTitle = unicode(print_tool["subTitle"])
1818 else:
1819- subTitle = str(T("Printed from Sahana Eden"))
1820+ subTitle = unicode(T("Printed from Sahana Eden"))
1821 if session.auth:
1822- creator = session.auth.user.email
1823+ creator = unicode(session.auth.user.email)
1824 else:
1825 creator = ""
1826- print_tool1 = "".join(("""
1827+ print_tool1 = u"".join(("""
1828 if (typeof(printCapabilities) != 'undefined') {
1829 // info.json from script headers OK
1830 printProvider = new GeoExt.data.PrintProvider({
1831@@ -3221,7 +3239,7 @@
1832 // printProvider: printProvider
1833 //});
1834 // A layer to display the print page extent
1835- //var pageLayer = new OpenLayers.Layer.Vector('""", str(T("Print Extent")), """');
1836+ //var pageLayer = new OpenLayers.Layer.Vector('""", unicode(T("Print Extent")), """');
1837 //pageLayer.addFeatures(printPage.feature);
1838 //pageLayer.setVisibility(false);
1839 //map.addLayer(pageLayer);
1840@@ -3237,7 +3255,7 @@
1841 //});
1842 // The form with fields controlling the print output
1843 S3.gis.printFormPanel = new Ext.form.FormPanel({
1844- title: '""", str(T("Print Map")), """',
1845+ title: '""", unicode(T("Print Map")), """',
1846 rootVisible: false,
1847 split: true,
1848 autoScroll: true,
1849@@ -3250,7 +3268,7 @@
1850 defaults: {anchor: '100%%'},
1851 listeners: {
1852 'expand': function() {
1853- //if (null == mapPanel.map.getLayersByName('""", str(T("Print Extent")), """')[0]) {
1854+ //if (null == mapPanel.map.getLayersByName('""", unicode(T("Print Extent")), """')[0]) {
1855 // mapPanel.map.addLayer(pageLayer);
1856 //}
1857 if (null == mapPanel.plugins[0]) {
1858@@ -3278,7 +3296,7 @@
1859 xtype: 'textarea',
1860 name: 'comment',
1861 value: '',
1862- fieldLabel: '""", str(T("Comment")), """',
1863+ fieldLabel: '""", unicode(T("Comment")), """',
1864 plugins: new GeoExt.plugins.PrintPageField({
1865 printPage: printPage
1866 })
1867@@ -3286,7 +3304,7 @@
1868 xtype: 'combo',
1869 store: printProvider.layouts,
1870 displayField: 'name',
1871- fieldLabel: '""", str(T("Layout")), """',
1872+ fieldLabel: '""", unicode(T("Layout")), """',
1873 typeAhead: true,
1874 mode: 'local',
1875 triggerAction: 'all',
1876@@ -3297,7 +3315,7 @@
1877 xtype: 'combo',
1878 store: printProvider.dpis,
1879 displayField: 'name',
1880- fieldLabel: '""", str(T("Resolution")), """',
1881+ fieldLabel: '""", unicode(T("Resolution")), """',
1882 tpl: '<tpl for="."><div class="x-combo-list-item">{name} dpi</div></tpl>',
1883 typeAhead: true,
1884 mode: 'local',
1885@@ -3314,7 +3332,7 @@
1886 // xtype: 'combo',
1887 // store: printProvider.scales,
1888 // displayField: 'name',
1889- // fieldLabel: '""", str(T("Scale")), """',
1890+ // fieldLabel: '""", unicode(T("Scale")), """',
1891 // typeAhead: true,
1892 // mode: 'local',
1893 // triggerAction: 'all',
1894@@ -3324,13 +3342,13 @@
1895 //}, {
1896 // xtype: 'textfield',
1897 // name: 'rotation',
1898- // fieldLabel: '""", str(T("Rotation")), """',
1899+ // fieldLabel: '""", unicode(T("Rotation")), """',
1900 // plugins: new GeoExt.plugins.PrintPageField({
1901 // printPage: printPage
1902 // })
1903 }],
1904 buttons: [{
1905- text: '""", str(T("Create PDF")), """',
1906+ text: '""", unicode(T("Create PDF")), """',
1907 handler: function() {
1908 // the PrintExtent plugin is the mapPanel's 1st plugin
1909 //mapPanel.plugins[0].print();
1910@@ -3348,7 +3366,7 @@
1911 } else {
1912 // Display error diagnostic
1913 S3.gis.printFormPanel = new Ext.Panel ({
1914- title: '""", str(T("Print Map")), """',
1915+ title: '""", unicode(T("Print Map")), """',
1916 rootVisible: false,
1917 split: true,
1918 autoScroll: true,
1919@@ -3359,7 +3377,7 @@
1920 bodyStyle: 'padding:5px',
1921 labelAlign: 'top',
1922 defaults: {anchor: '100%'},
1923- html: '""", str(T("Printing disabled since server not accessible")), """: <BR />""", url, """'
1924+ html: '""", unicode(T("Printing disabled since server not accessible")), """: <BR />""", unicode(url), """'
1925 });
1926 }
1927 """))
1928@@ -3445,40 +3463,40 @@
1929 name_safe = re.sub("'", "", layer.name)
1930 if layer.url2:
1931 url2 = """,
1932- url2: '%s'""" % layer.url2
1933+ "url2": "%s\"""" % layer.url2
1934 else:
1935 url2 = ""
1936 if layer.url3:
1937 url3 = """,
1938- url3: '%s'""" % layer.url3
1939+ "url3": "%s\"""" % layer.url3
1940 else:
1941 url3 = ""
1942 if layer.base:
1943 base = ""
1944 else:
1945 base = """,
1946- isBaseLayer: false"""
1947+ "isBaseLayer": false"""
1948 if layer.visible:
1949 visibility = ""
1950 else:
1951 visibility = """,
1952- visibility: false"""
1953+ "visibility": false"""
1954 if layer.attribution:
1955 attribution = """,
1956- attribution: '%s'""" % layer.attribution
1957+ "attribution": %s""" % repr(layer.attribution)
1958 else:
1959 attribution = ""
1960 if layer.zoom_levels is not None and layer.zoom_levels != 19:
1961 zoomLevels = """,
1962- zoomLevels: %i""" % layer.zoom_levels
1963+ "zoomLevels": %i""" % layer.zoom_levels
1964 else:
1965 zoomLevels = ""
1966
1967 # Generate JS snippet to pass to static
1968 layers_osm += """
1969 S3.gis.layers_osm[%i] = {
1970- name: '%s',
1971- url1: '%s'%s%s%s%s%s%s
1972+ "name": "%s",
1973+ "url1": "%s"%s%s%s%s%s%s
1974 }
1975 """ % (counter,
1976 name_safe,
1977@@ -3490,6 +3508,7 @@
1978 attribution,
1979 zoomLevels)
1980
1981+
1982 # ---------------------------------------------------------------------
1983 # XYZ
1984 # @ToDo: Migrate to Class/Static
1985@@ -3681,7 +3700,7 @@
1986
1987 if "active" in layer and not layer["active"]:
1988 visibility = """,
1989- visibility: false"""
1990+ "visibility": false"""
1991 else:
1992 visibility = ""
1993
1994@@ -3712,32 +3731,33 @@
1995 marker_url = ""
1996 if marker_url:
1997 markerLayer = """,
1998- marker_url: '%s',
1999- marker_height: %i,
2000- marker_width: %i""" % (marker_url, marker_height, marker_width)
2001+ "marker_url": "%s",
2002+ "marker_height": %i,
2003+ "marker_width": %i""" % (marker_url, marker_height, marker_width)
2004
2005 if "opacity" in layer and layer["opacity"] != 1:
2006 opacity = """,
2007- opacity: %.1f""" % layer["opacity"]
2008+ "opacity": %.1f""" % layer["opacity"]
2009 else:
2010 opacity = ""
2011 if "cluster_distance" in layer and layer["cluster_distance"] != self.cluster_distance:
2012 cluster_distance = """,
2013- cluster_distance: %i""" % layer["cluster_distance"]
2014+ "cluster_distance": %i""" % layer["cluster_distance"]
2015 else:
2016 cluster_distance = ""
2017 if "cluster_threshold" in layer and layer["cluster_threshold"] != self.cluster_threshold:
2018 cluster_threshold = """,
2019- cluster_threshold: %i""" % layer["cluster_threshold"]
2020+ "cluster_threshold": %i""" % layer["cluster_threshold"]
2021 else:
2022 cluster_threshold = ""
2023
2024 # Generate JS snippet to pass to static
2025 layers_feature_queries += """
2026 S3.gis.layers_feature_queries[%i] = {
2027- name: '%s',
2028- url: '%s'%s%s%s%s%s
2029-}""" % (counter,
2030+ "name": "%s",
2031+ "url": "%s"%s%s%s%s%s
2032+}
2033+""" % (counter,
2034 name,
2035 url,
2036 visibility,
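The hunks above switch the generated `S3.gis.layers_feature_queries` snippet from bare single-quoted keys (`name: '%s'`) to double-quoted keys and values, so the object literal inside each assignment is also strict JSON. A sketch of why that matters (helper name and URL are hypothetical): the object part now round-trips through a JSON parser, which makes the emitted snippets testable.

```python
import json

def feature_query_snippet(counter, name, url, opacity=None):
    """Build one layers_feature_queries entry in the quoted-key style."""
    extra = ""
    if opacity is not None and opacity != 1:
        extra = ',\n "opacity": %.1f' % opacity
    return '''S3.gis.layers_feature_queries[%i] = {
 "name": "%s",
 "url": "%s"%s
}
''' % (counter, name, url, extra)

snippet = feature_query_snippet(0, "Incidents", "/gis/location.json", 0.7)
# The braces hold strict JSON, so the object part parses cleanly:
obj = json.loads(snippet.split("= ", 1)[1].rstrip())
```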
2037@@ -3745,36 +3765,44 @@
2038 opacity,
2039 cluster_distance,
2040 cluster_threshold)
2041-
2042+
2043 # ---------------------------------------------------------------------
2044 # Add Layers from the Catalogue
2045 # ---------------------------------------------------------------------
2046 layers_config = ""
2047 if catalogue_layers:
2048 for LayerType in [
2049- #OSMLayer,
2050- Bing,
2051- Google,
2052- Yahoo,
2053- TMSLayer,
2054- WMSLayer,
2055- FeatureLayer,
2056- GeoJSONLayer,
2057- GeoRSSLayer,
2058- GPXLayer,
2059- KMLLayer,
2060- WFSLayer
2061- ]:
2062- # Instantiate the Class
2063- layer = LayerType(self)
2064- layer_type_js = layer.as_javascript()
2065- if layer_type_js:
2066- # Add to the output JS
2067- layers_config = "".join((layers_config,
2068- layer_type_js))
2069- if layer.scripts:
2070- for script in layer.scripts:
2071- add_javascript(script)
2072+ #OSMLayer,
2073+ BingLayer,
2074+ GoogleLayer,
2075+ YahooLayer,
2076+ TMSLayer,
2077+ WMSLayer,
2078+ FeatureLayer,
2079+ GeoJSONLayer,
2080+ GeoRSSLayer,
2081+ GPXLayer,
2082+ KMLLayer,
2083+ WFSLayer
2084+ ]:
2085+ try:
2086+ # Instantiate the Class
2087+ layer = LayerType(self)
2088+ layer_type_js = layer.as_javascript()
2089+ if layer_type_js:
2090+ # Add to the output JS
2091+ layers_config = "".join((layers_config,
2092+ layer_type_js))
2093+ if layer.scripts:
2094+ for script in layer.scripts:
2095+ add_javascript(script)
2096+ except Exception, exception:
2097+ if debug:
2098+ raise
2099+ else:
2100+ session.warning.append(
2101+ LayerType.__name__ + " not shown due to error"
2102+ )
2103
2104 # -----------------------------------------------------------------
2105 # Coordinate Grid - only one possible
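The hunk above wraps each catalogue layer's instantiation in a try/except so that one broken layer type degrades to a session warning instead of aborting the whole map, while debug mode still propagates the original traceback. A minimal stand-alone sketch of that fail-soft pattern (all names here are illustrative, not Sahana Eden API):

```python
def render_layers(layer_types, debug=False):
    """Collect JS from each layer type; a failing type becomes a warning."""
    config, warnings = [], []
    for LayerType in layer_types:
        try:
            js = LayerType().as_javascript()
            if js:
                config.append(js)
        except Exception:
            if debug:
                raise  # surface the real traceback while developing
            warnings.append("%s not shown due to error" % LayerType.__name__)
    return "".join(config), warnings

class GoodLayer(object):
    def as_javascript(self):
        return "S3.gis.Good = []\n"

class BrokenLayer(object):
    def as_javascript(self):
        raise ValueError("bad record")

js, warnings = render_layers([GoodLayer, BrokenLayer])
```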
2106@@ -3848,6 +3876,7 @@
2107 "S3.gis.marker_default_width = %i;\n" % marker_default.width,
2108 osm_auth,
2109 layers_osm,
2110+ layers_feature_queries,
2111 _features,
2112 layers_config,
2113 # i18n Labels
2114@@ -3880,6 +3909,7 @@
2115 ))))
2116
2117 # Static Script
2118+
2119 if debug:
2120 add_javascript("scripts/S3/s3.gis.js")
2121 add_javascript("scripts/S3/s3.gis.layers.js")
2122@@ -3889,7 +3919,6 @@
2123
2124 # Dynamic Script (stuff which should, as far as possible, be moved to static)
2125 html.append(SCRIPT(layers_js + \
2126- #layers_xyz + \
2127 print_tool1))
2128
2129 # Set up map plugins
2130@@ -3904,8 +3933,7 @@
2131
2132 return html
2133
2134-
2135-# =============================================================================
2136+# -----------------------------------------------------------------------------
2137 class Marker(object):
2138 """ Represents a Map Marker """
2139 def __init__(self, gis, id=None):
2140@@ -3938,6 +3966,13 @@
2141 #self.url = URL(c="static", f="img",
2142 # args=["markers", marker.image])
2143
2144+ def add_attributes_to_output(self, output):
2145+ output.update(
2146+ marker_image = self.image,
2147+ marker_height = self.height,
2148+ marker_width = self.width,
2149+ )
2150+
2151 # -----------------------------------------------------------------------------
2152 class Projection(object):
2153 """ Represents a Map Projection """
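The new `Marker.add_attributes_to_output` method above pushes the marker's fields into a layer's output dict in place, so every layer type serialises markers the same way. A self-contained sketch of the idea (the `Marker` stand-in here takes plain values instead of a `gis` handle and record id):

```python
class Marker(object):
    """Minimal stand-in for the map-marker class in the patch."""
    def __init__(self, image, height, width):
        self.image, self.height, self.width = image, height, width

    def add_attributes_to_output(self, output):
        # Mirror of the added method: merge marker fields into the
        # layer's output dict in place.
        output.update(
            marker_image=self.image,
            marker_height=self.height,
            marker_width=self.width,
        )

output = {"name": "Shelters"}
Marker("shelter.png", 34, 20).add_attributes_to_output(output)
```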
2154@@ -3963,41 +3998,34 @@
2155 self.epsg = projection.epsg
2156
2157 # -----------------------------------------------------------------------------
2158+
2159+def config_dict(mandatory, defaulted):
2160+ d = dict(mandatory)
2161+ for key, (value, defaults) in defaulted.iteritems():
2162+ if value not in defaults:
2163+ d[key] = value
2164+ return d
2165+
2166+
2167+# The layer code only needs to:
2168+# - perform any database lookups to fetch extra data
2169+# - perform security checks
2170+
2171+# It then generates the appropriate JSON strings.
2172+
2173 class Layer(object):
2174 """
2175- Base Class for Layers
2176- Not meant to be instantiated direct
2177+ Abstract Base Class for Layers
2178 """
2179- def __init__(self, gis, record=None):
2180+ def __init__(self, gis):
2181+ self.gis = gis
2182 db = current.db
2183-
2184- self.gis = gis
2185- # This usually arrives later
2186- self.record = record
2187- # Ensure all attributes available (even if Null)
2188- self._refresh()
2189- self.scripts = []
2190-
2191- def add_record(self, record):
2192- """
2193- Update the record & refresh the attributes
2194- """
2195- if record:
2196- self.record = record
2197- self._refresh()
2198- else:
2199- return
2200-
2201- def as_dict(self):
2202- """
2203- Output the Layer as a Python dictionary
2204- - this is used to build a JSON of the overall dict of layers
2205- """
2206- record = self.record
2207- if record:
2208- return record
2209- else:
2210- return
2211+ try:
2212+ self.table = db[self.table_name]
2213+ except: # @ToDo: provide specific exception
2214+ current.manager.load(self.table_name)
2215+ self.table = db[self.table_name]
2216+
2217
2218 def as_json(self):
2219 """
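The `config_dict` helper added above merges mandatory settings with optional ones, emitting an optional key only when its value differs from all of its client-side defaults, which keeps the generated JS small. A Python 3 rendering (the patch itself targets Python 2, hence `iteritems`; the sample keys are hypothetical):

```python
def config_dict(mandatory, defaulted):
    """Merge mandatory settings with optional ones, dropping values
    that still equal one of their client-side defaults."""
    d = dict(mandatory)
    # iteritems() in the Python 2 original
    for key, (value, defaults) in defaulted.items():
        if value not in defaults:
            d[key] = value
    return d

# Only "opacity" differs from its defaults, so only it is emitted.
out = config_dict(
    {"name": "Hospitals", "url": "/gis/feature"},
    {"opacity": (0.5, (1, 1.0)),
     "cluster_distance": (20, (20,))},
)
```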
2220@@ -4009,663 +4037,480 @@
2221 else:
2222 return
2223
2224- def as_javascript(self):
2225- """
2226- Output the Layer as Javascript
2227- - suitable for inclusion in the HTML page
2228- """
2229- gis = self.gis
2230- auth = gis.auth
2231- db = current.db
2232- table = self.table
2233-
2234- layer_type_list = []
2235- # Read the enabled Layers
2236- records = db(table.enabled == True).select()
2237- for record in records:
2238- # Check user is allowed to access the layer
2239- role_required = record.role_required
2240- if (not role_required) or auth.s3_has_role(role_required):
2241- # Pass the record to the Class
2242- self.add_record(record)
2243- # Read the output dict for this layer
2244- layer_dict = self.as_dict()
2245- if layer_dict:
2246- # Add this layer to the list of layers for this layer type
2247- layer_type_list.append(layer_dict)
2248-
2249- if layer_type_list:
2250- # Output the Layer Type as JSON
2251- layer_type_json = json.dumps(layer_type_list,
2252- sort_keys=True,
2253- indent=4)
2254- layer_type_js = "".join(("%s = " % self.js_array,
2255- layer_type_json,
2256- "\n"))
2257- return layer_type_js
2258-
2259- def _name_safe(self, name):
2260- """
2261- Make the name safe for use in JSON
2262- i.e. any Unicode character allowed except for " & \
2263- """
2264- return re.sub('[\\"]', "", name)
2265-
2266- def _refresh(self):
2267- " Refresh the attributes of the Layer "
2268- table = self.table
2269- if "marker_id" in table:
2270- self.set_marker()
2271- if "projection_id" in table:
2272- self.set_projection()
2273-
2274- def set_marker(self):
2275- " Set the Marker for the Layer "
2276- gis = self.gis
2277- record = self.record
2278- if record:
2279- marker = Marker(gis, record.marker_id)
2280- self.marker = marker
2281- else:
2282- self.marker = None
2283-
2284- def set_projection(self):
2285- " Set the Projection for the Layer "
2286- gis = self.gis
2287- record = self.record
2288- if record:
2289- projection = Projection(gis, record.projection_id)
2290- self.projection = projection
2291- else:
2292- self.projection = None
2293-
2294-# -----------------------------------------------------------------------------
2295-class OneLayer(Layer):
2296- """
2297- Base Class for Layers with just a single record
2298- Not meant to be instantiated direct
2299- """
2300-
2301- def __init__(self, gis, record=None):
2302- db = current.db
2303- tablename = ""
2304- try:
2305- table = db[tablename]
2306- except:
2307- current.manager.load(tablename)
2308- table = db[tablename]
2309- if not record:
2310- # There is only ever 1 layer
2311- record = db(table.id > 0).select().first()
2312-
2313- self.gis = gis
2314- self.table = table
2315- self.js_array = "S3.gis.OneLayer"
2316- self.record = record
2317- self._refresh()
2318- self.scripts = []
2319-
2320- def as_javascript(self):
2321- """
2322- Output the Layer as Javascript
2323- - suitable for inclusion in the HTML page
2324- """
2325- auth = self.gis.auth
2326- record = self.record
2327- # Check Layer exists in the DB
2328- if not record:
2329- return None
2330- # Check Layer is enabled
2331- if not record.enabled:
2332- return None
2333- # Check user is allowed to access the Layer
2334- role_required = record.role_required
2335- if role_required and not auth.s3_has_role(role_required):
2336- return None
2337- # Read the output JSON for this layer
2338- layer_type_json = self.as_json()
2339- layer_type_js = "".join(("%s = " % self.js_array,
2340- layer_type_json,
2341- "\n"))
2342- return layer_type_js
2343-
2344- def _set_api_key(self):
2345- " Set the API Key for the Layer "
2346- record = self.record
2347- if record:
2348- self.apikey = record.apikey
2349- else:
2350- self.apikey = None
2351-
2352- def _refresh(self):
2353- " Refresh the attributes of the Layer "
2354- table = self.table
2355- if "apikey" in table:
2356- self._set_api_key()
2357-
2358-# -----------------------------------------------------------------------------
2359-class Bing(OneLayer):
2360- """ Bing Layers from Catalogue """
2361- def __init__(self, gis, record=None):
2362- db = current.db
2363- tablename = "gis_layer_bing"
2364- try:
2365- table = db[tablename]
2366- except:
2367- current.manager.load(tablename)
2368- table = db[tablename]
2369- if not record:
2370- # There is only ever 1 layer
2371- record = db(table.id > 0).select().first()
2372-
2373- self.gis = gis
2374- self.table = table
2375- self.js_array = "S3.gis.Bing"
2376- self.record = record
2377- self._refresh()
2378- self.scripts = []
2379-
2380+
2381+
2382+# -----------------------------------------------------------------------------
2383+class SingleRecordLayer(Layer):
2384+ """
2385+ Abstract Base Class for Layers with just a single record
2386+ """
2387+
2388+ def __init__(self, gis):
2389+ super(SingleRecordLayer, self).__init__(gis)
2390+ table = self.table
2391+ records = current.db(table.id > 0).select()
2392+ assert len(records) <= 1, (
2393+ "There should only ever be 0 or 1 %s" % self.__class__.__name__
2394+ )
2395+ self.record = None
2396+ record = records.first()
2397+ if record is not None:
2398+ if record.enabled:
2399+ role_required = record.role_required
2400+ if not role_required or self.gis.auth.s3_has_role(role_required):
2401+ self.record = record
2402+ # Refresh the attributes of the Layer
2403+ if "apikey" in table:
2404+ if record:
2405+ self.apikey = record.apikey
2406+ else:
2407+ self.apikey = None
2408+ self.gis = gis
2409+ self.scripts = []
2410+
2411+ def as_javascript(self):
2412+ """
2413+ Output the Layer as Javascript
2414+ - suitable for inclusion in the HTML page
2415+ """
2416+ if self.record:
2417+ if "apikey" in self.table and not self.apikey:
2418+ raise Exception("Cannot display a %s if we have no valid API Key" % self.__class__.__name__)
2419+ json = self.as_json()
2420+ if json:
2421+ return "%s = %s\n" % (
2422+ self.js_array,
2423+ json
2424+ )
2425+ else:
2426+ return None
2427+ else:
2428+ return None
2429+
2430+# -----------------------------------------------------------------------------
2431+class BingLayer(SingleRecordLayer):
2432+ """ Bing Layer from Catalogue """
2433+ table_name = "gis_layer_bing"
2434+ js_array = "S3.gis.Bing"
2435+
2436 def as_dict(self):
2437 gis = self.gis
2438 record = self.record
2439- apikey = self.apikey
2440-
2441- if not apikey:
2442- # Cannot display Bing layers if we have no valid API Key
2443- return None
2444-
2445- config = gis.get_config()
2446- if Projection(gis, id=config.projection_id).epsg != 900913:
2447- # Cannot display Bing layers unless we're using the
2448- # Spherical Mercator Projection
2449- return None
2450-
2451- # Mandatory attributes
2452- output = {
2453- "ApiKey": self.apikey
2454- }
2455-
2456- # Attributes which are defaulted client-side if not set
2457- if record.aerial_enabled:
2458- output["Aerial"] = record.aerial or "Bing Satellite"
2459- if record.road_enabled:
2460- output["Road"] = record.road or "Bing Roads"
2461- if record.hybrid_enabled:
2462- output["Hybrid"] = record.hybrid or "Bing Hybrid"
2463-
2464- return output
2465+ if record is not None:
2466+ config = self.gis.get_config()
2467+ if Projection(gis, id=config.projection_id).epsg != 900913:
2468+ raise Exception("Cannot display Bing layers unless we're using the Spherical Mercator Projection")
2469+ else:
2470+ # Mandatory attributes
2471+ output = {
2472+ "ApiKey": self.apikey
2473+ }
2474+
2475+ # Attributes which are defaulted client-side if not set
2476+ if record.aerial_enabled:
2477+ output["Aerial"] = record.aerial or "Bing Satellite"
2478+ if record.road_enabled:
2479+ output["Road"] = record.road or "Bing Roads"
2480+ if record.hybrid_enabled:
2481+ output["Hybrid"] = record.hybrid or "Bing Hybrid"
2482+ return output
2483+ else:
2484+ return None
2485
2486 # -----------------------------------------------------------------------------
2487-class Google(OneLayer):
2488+class GoogleLayer(SingleRecordLayer):
2489 """
2490 Google Layers/Tools from Catalogue
2491 """
2492- def __init__(self, gis, record=None):
2493- db = current.db
2494- tablename = "gis_layer_google"
2495- try:
2496- table = db[tablename]
2497- except:
2498- current.manager.load(tablename)
2499- table = db[tablename]
2500- debug = current.session.s3.debug
2501- if not record:
2502- # There is only ever 1 layer
2503- record = db(table.id > 0).select().first()
2504-
2505- self.gis = gis
2506- self.table = table
2507- self.js_array = "S3.gis.Google"
2508- self.record = record
2509- self._refresh()
2510-
2511- if record:
2512+ table_name = "gis_layer_google"
2513+ js_array = "S3.gis.Google"
2514+
2515+ def __init__(self, gis):
2516+ super(GoogleLayer, self).__init__(gis)
2517+ record = self.record
2518+ if record is not None:
2519+ debug = current.session.s3.debug
2520+ add_script = self.scripts.append
2521 if record.mapmaker_enabled or record.mapmakerhybrid_enabled:
2522 # Need to use v2 API
2523 # http://code.google.com/p/gmaps-api-issues/issues/detail?id=2349
2524- self.scripts = ["http://maps.google.com/maps?file=api&v=2&key=%s" % self.apikey]
2525+ add_script("http://maps.google.com/maps?file=api&v=2&key=%s" % self.apikey)
2526 else:
2527 # v3 API
2528- self.scripts = ["http://maps.google.com/maps/api/js?v=3.2&sensor=false"]
2529+ add_script("http://maps.google.com/maps/api/js?v=3.2&sensor=false")
2530 if debug and record.streetview_enabled:
2531- self.scripts.append("scripts/gis/gxp/widgets/GoogleStreetViewPanel.js")
2532+ add_script("scripts/gis/gxp/widgets/GoogleStreetViewPanel.js")
2533 if record.earth_enabled:
2534- self.scripts.append("http://www.google.com/jsapi?key=%s" % self.apikey)
2535- self.scripts.append(SCRIPT("google && google.load('earth', '1');", _type="text/javascript"))
2536+ add_script("http://www.google.com/jsapi?key=%s" % self.apikey)
2537+ add_script(SCRIPT("google && google.load('earth', '1');", _type="text/javascript"))
2538 if debug:
2539- self.scripts.append("scripts/gis/gxp/widgets/GoogleEarthPanel.js")
2540- else:
2541- self.scripts = []
2542+ add_script("scripts/gis/gxp/widgets/GoogleEarthPanel.js")
2543
2544 def as_dict(self):
2545 gis = self.gis
2546 T = current.T
2547 record = self.record
2548- apikey = self.apikey
2549-
2550- if not apikey and (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
2551- # Cannot display Google layers if we have no valid API Key
2552+ if record is not None:
2553+ config = gis.get_config()
2554+ if Projection(gis, id=config.projection_id).epsg != 900913:
2555+ if record.earth_enabled:
2556+ # But the Google Earth panel can still be enabled
2557+ return {
2558+ "Earth": str(T("Switch to 3D"))
2559+ }
2560+ else:
2561+ raise Exception("Cannot display Google layers unless we're using the Spherical Mercator Projection")
2562+
2563+
2564+ # Mandatory attributes
2565+ #"ApiKey": self.apikey
2566+ output = {
2567+ }
2568+
2569+ # Attributes which are defaulted client-side if not set
2570+ if record.satellite_enabled:
2571+ output["Satellite"] = record.satellite or "Google Satellite"
2572+ if record.maps_enabled:
2573+ output["Maps"] = record.maps or "Google Maps"
2574+ if record.hybrid_enabled:
2575+ output["Hybrid"] = record.hybrid or "Google Hybrid"
2576+ if record.mapmaker_enabled:
2577+ output["MapMaker"] = record.mapmaker or "Google MapMaker"
2578+ if record.mapmakerhybrid_enabled:
2579+ output["MapMakerHybrid"] = record.mapmakerhybrid or "Google MapMaker Hybrid"
2580+ if record.earth_enabled:
2581+ output["Earth"] = str(T("Switch to 3D"))
2582+ if record.streetview_enabled and not (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
2583+ # Streetview doesn't work with v2 API
2584+ output["StreetviewButton"] = str(T("Click where you want to open Streetview"))
2585+ output["StreetviewTitle"] = str(T("Street View"))
2586+
2587+ return output
2588+ else:
2589 return None
2590
2591- config = gis.get_config()
2592- if Projection(gis, id=config.projection_id).epsg != 900913:
2593- # Cannot display Google layers unless we're using the
2594- # Spherical Mercator Projection
2595- if record.earth_enabled:
2596- # But the Google Earth panel can still be enabled
2597- output = {
2598- "Earth": str(T("Switch to 3D"))
2599- }
2600- return output
2601- else:
2602- return None
2603-
2604- # Mandatory attributes
2605- #"ApiKey": self.apikey
2606- output = {
2607- }
2608-
2609- # Attributes which are defaulted client-side if not set
2610- if record.satellite_enabled:
2611- output["Satellite"] = record.satellite or "Google Satellite"
2612- if record.maps_enabled:
2613- output["Maps"] = record.maps or "Google Maps"
2614- if record.hybrid_enabled:
2615- output["Hybrid"] = record.hybrid or "Google Hybrid"
2616- if record.mapmaker_enabled:
2617- output["MapMaker"] = record.mapmaker or "Google MapMaker"
2618- if record.mapmakerhybrid_enabled:
2619- output["MapMakerHybrid"] = record.mapmakerhybrid or "Google MapMaker Hybrid"
2620- if record.earth_enabled:
2621- output["Earth"] = str(T("Switch to 3D"))
2622- if record.streetview_enabled and not (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
2623- # Streetview doesn't work with v2 API
2624- output["StreetviewButton"] = str(T("Click where you want to open Streetview"))
2625- output["StreetviewTitle"] = str(T("Street View"))
2626-
2627- return output
2628-
2629 # -----------------------------------------------------------------------------
2630-class Yahoo(OneLayer):
2631+class YahooLayer(SingleRecordLayer):
2632 """
2633 Yahoo Layers from Catalogue
2634
2635 NB This will stop working on 13 September 2011
2636 http://developer.yahoo.com/blogs/ydn/posts/2011/06/yahoo-maps-apis-service-closure-announcement-new-maps-offerings-coming-soon/
2637 """
2638- def __init__(self, gis, record=None):
2639- db = current.db
2640- tablename = "gis_layer_yahoo"
2641- try:
2642- table = db[tablename]
2643- except:
2644- current.manager.load(tablename)
2645- table = db[tablename]
2646- if not record:
2647- # There is only ever 1 layer
2648- record = db(table.id > 0).select().first()
2649-
2650- self.gis = gis
2651- self.table = table
2652- self.js_array = "S3.gis.Yahoo"
2653- self.record = record
2654- self._refresh()
2655- if record:
2656- self.scripts = ["http://api.maps.yahoo.com/ajaxymap?v=3.8&appid=%s" % self.apikey]
2657- else:
2658- self.scripts = []
2659+ js_array = "S3.gis.Yahoo"
2660+ table_name = "gis_layer_yahoo"
2661+
2662+ def __init__(self, gis):
2663+ super(YahooLayer, self).__init__(gis)
2664+ if self.record:
2665+ self.scripts.append("http://api.maps.yahoo.com/ajaxymap?v=3.8&appid=%s" % self.apikey)
2666+ config = gis.get_config()
2667+ if Projection(gis, id=config.projection_id).epsg != 900913:
2668+ raise Exception("Cannot display Yahoo layers unless we're using the Spherical Mercator Projection")
2669
2670 def as_dict(self):
2671- gis = self.gis
2672 record = self.record
2673- apikey = self.apikey
2674-
2675- if not apikey:
2676- # Cannot display Yahoo layers if we have no valid API Key
2677- return None
2678-
2679- config = gis.get_config()
2680- if Projection(gis, id=config.projection_id).epsg != 900913:
2681- # Cannot display Yahoo layers unless we're using the
2682- # Spherical Mercator Projection
2683- return None
2684-
2685- # Mandatory attributes
2686- #"ApiKey": self.apikey
2687- output = {
2688- }
2689-
2690- # Attributes which are defaulted client-side if not set
2691- if record.satellite_enabled:
2692- output["Satellite"] = record.satellite or "Yahoo Satellite"
2693- if record.maps_enabled:
2694- output["Maps"] = record.maps or "Yahoo Maps"
2695- if record.hybrid_enabled:
2696- output["Hybrid"] = record.hybrid or "Yahoo Hybrid"
2697-
2698- return output
2699+ if record is not None:
2700+ # Mandatory attributes
2701+ #"ApiKey": self.apikey
2702+ output = {
2703+ }
2704+
2705+ # Attributes which are defaulted client-side if not set
2706+ if record.satellite_enabled:
2707+ output["Satellite"] = record.satellite or "Yahoo Satellite"
2708+ if record.maps_enabled:
2709+ output["Maps"] = record.maps or "Yahoo Maps"
2710+ if record.hybrid_enabled:
2711+ output["Hybrid"] = record.hybrid or "Yahoo Hybrid"
2712+
2713+ return output
2714+ else:
2715+ return None
2716+
2717+class MultiRecordLayer(Layer):
2718+ def __init__(self, gis):
2719+ super(MultiRecordLayer, self).__init__(gis)
2720+ self.sublayers = []
2721+ self.scripts = []
2722+
2723+ auth = gis.auth
2724+
2725+ layer_type_list = []
2726+ # Read the enabled Layers
2727+ for record in current.db(self.table.enabled == True).select():
2728+ # Check user is allowed to access the layer
2729+ role_required = record.role_required
2730+ if (not role_required) or auth.s3_has_role(role_required):
2731+ self.sublayers.append(self.SubLayer(gis, record))
2732+
2733+ def as_javascript(self):
2734+ """
2735+ Output the Layer as Javascript
2736+ - suitable for inclusion in the HTML page
2737+ """
2738+ sublayer_dicts = []
2739+ for sublayer in self.sublayers:
2740+ # Read the output dict for this sublayer
2741+ sublayer_dict = sublayer.as_dict()
2742+ if sublayer_dict:
2743+ # Add this layer to the list of layers for this layer type
2744+ sublayer_dicts.append(sublayer_dict)
2745+
2746+ if sublayer_dicts:
2747+ # Output the Layer Type as JSON
2748+ layer_type_json = json.dumps(sublayer_dicts,
2749+ sort_keys=True,
2750+ indent=4)
2751+ return "%s = %s\n" % (self.js_array, layer_type_json)
2752+ else:
2753+ return None
2754+
2755+ class SubLayer(object):
2756+ def __init__(self, gis, record):
2757+ # Ensure all attributes available (even if Null)
2758+ self.gis = gis
2759+ self.__dict__.update(record)
2760+ del record
2761+ self.safe_name = re.sub('[\\"]', "", self.name)
2762+
2763+ if hasattr(self, "marker_id"):
2764+ self.marker = Marker(gis, self.marker_id)
2765+ if hasattr(self, "projection_id"):
2766+ self.projection = Projection(gis, self.projection_id)
2767+
2768+ def setup_clustering(self, output):
2769+ gis = self.gis
2770+ cluster_distance = gis.cluster_distance
2771+ cluster_threshold = gis.cluster_threshold
2772+ if self.cluster_distance != cluster_distance:
2773+ output["cluster_distance"] = self.cluster_distance
2774+ if self.cluster_threshold != cluster_threshold:
2775+ output["cluster_threshold"] = self.cluster_threshold
2776+
2777+ def setup_visibility_and_opacity(self, output):
2778+ if not self.visible:
2779+ output["visibility"] = False
2780+ if self.opacity != 1:
2781+ output["opacity"] = "%.1f" % self.opacity
2782+
2783+ def add_attributes_if_not_default(self, output, **values_and_defaults):
2784+ # could also write values in debug mode, to check if defaults ignored.
2785+ # could also check values are not being overwritten.
2786+ for key, (value, defaults) in values_and_defaults.iteritems():
2787+ if value not in defaults:
2788+ output[key] = value
2789+
2790+ #def set_marker(self):
2791+ # " Set the Marker for the Layer "
2792+ # gis = self.gis
2793+ # self.marker = Marker(gis, self.marker_id)
2794+
2795+ #def set_projection(self):
2796+ # " Set the Projection for the Layer "
2797+ # gis = self.gis
2798+ # self.projection = Projection(gis, self.projection_id)
2799
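The add_attributes_if_not_default helper above can be exercised standalone. The sketch below uses a simplified stand-in class (not the real s3gis SubLayer) and `.items()` rather than the Python 2 `.iteritems()` used in the patch, to show how values matching any client-side default are kept out of the output dict:

```python
# Minimal sketch of the "skip client-side defaults" pattern used by
# MultiRecordLayer.SubLayer.add_attributes_if_not_default.
class SubLayerSketch(object):
    def add_attributes_if_not_default(self, output, **values_and_defaults):
        # Each keyword maps to (value, tuple_of_defaults); a value is only
        # emitted when it differs from every default, so the client can fall
        # back to its own defaults and the serialized JSON stays small.
        for key, (value, defaults) in values_and_defaults.items():
            if value not in defaults:
                output[key] = value
        return output

sketch = SubLayerSketch()
output = {"name": "Test Layer"}
sketch.add_attributes_if_not_default(
    output,
    opacity=(0.5, (1,)),            # differs from default -> included
    version=("1.1.1", ("1.1.1",)),  # equals default -> omitted
)
```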
2800 # -----------------------------------------------------------------------------
2801-class FeatureLayer(Layer):
2802+class FeatureLayer(MultiRecordLayer):
2803 """ Feature Layer from Catalogue """
2804- def __init__(self, gis, record=None):
2805- db = current.db
2806- tablename = "gis_layer_feature"
2807- try:
2808- table = db[tablename]
2809- except:
2810- current.manager.load(tablename)
2811- table = db[tablename]
2812-
2813- self.gis = gis
2814- self.table = table
2815- self.js_array = "S3.gis.layers_features"
2816- self.record = record
2817- self._refresh()
2818- self.scripts = []
2819-
2820- def as_dict(self):
2821- gis = self.gis
2822- cluster_distance = gis.cluster_distance
2823- cluster_threshold = gis.cluster_threshold
2824- record = self.record
2825- marker = self.marker
2826-
2827- auth = gis.auth
2828- request = current.request
2829- deployment_settings = gis.deployment_settings
2830-
2831- if record.module not in deployment_settings.modules:
2832- # Module is disabled
2833- return
2834- if not auth.permission(c=record.module, f=record.resource):
2835- # User has no permission to this resource (in ACL)
2836- return
2837-
2838- name_safe = self._name_safe(record.name)
2839-
2840- url = "%s.geojson?layer=%i" % (URL(record.module,record.resource),
2841- record.id)
2842- if record.filter:
2843- url = "%s&%s" % (url, record.filter)
2844-
2845- # Mandatory attributes
2846- output = {
2847- "name": name_safe,
2848+ table_name = "gis_layer_feature"
2849+ js_array = "S3.gis.layers_features"
2850+
2851+ class SubLayer(MultiRecordLayer.SubLayer):
2852+ def __init__(self, gis, record):
2853+ record_module = record.module
2854+ if record_module is not None:
2855+ if record_module not in gis.deployment_settings.modules:
2856+ raise Exception("%s module is disabled" % record_module)
2857+ if not gis.auth.permission(c=record.module, f=record.resource):
2858+ raise Exception("User has no permission to this resource (in ACL)")
2859+ else:
2860+ raise Exception("FeatureLayer Record '%s' has no module" % record.name)
2861+ super(FeatureLayer.SubLayer, self).__init__(gis, record)
2862+
2863+ def as_dict(self):
2864+ gis = self.gis
2865+
2866+ request = current.request
2867+ deployment_settings = gis.deployment_settings
2868+
2869+ url = "%s.geojson?layer=%i" % (URL(self.module, self.resource),
2870+ self.id)
2871+ if self.filter:
2872+ url = "%s&%s" % (url, self.filter)
2873+
2874+ # Mandatory attributes
2875+ output = {
2876+ "name": self.safe_name,
2877 "url": url,
2878- "marker_image": marker.image,
2879- "marker_height": marker.height,
2880- "marker_width": marker.width,
2881 }
2882-
2883- # Attributes which are defaulted client-side if not set
2884- if not record.visible:
2885- output["visibility"] = False
2886- if record.opacity != 1:
2887- output["opacity"] = "%.1f" % record.opacity
2888- if record.cluster_distance != cluster_distance:
2889- output["cluster_distance"] = record.cluster_distance
2890- if record.cluster_threshold != cluster_threshold:
2891- output["cluster_threshold"] = record.cluster_threshold
2892-
2893- return output
2894+ self.marker.add_attributes_to_output(output)
2895+ self.setup_visibility_and_opacity(output)
2896+ self.setup_clustering(output)
2897+
2898+ return output
2899
2900 # -----------------------------------------------------------------------------
2901-class GeoJSONLayer(Layer):
2902+class GeoJSONLayer(MultiRecordLayer):
2903 """ GeoJSON Layer from Catalogue """
2904- def __init__(self, gis, record=None):
2905- db = current.db
2906- tablename = "gis_layer_geojson"
2907- try:
2908- table = db[tablename]
2909- except:
2910- current.manager.load(tablename)
2911- table = db[tablename]
2912-
2913- self.gis = gis
2914- self.table = table
2915- self.js_array = "S3.gis.layers_geojson"
2916- self.record = record
2917- self._refresh()
2918- self.scripts = []
2919-
2920- def as_dict(self):
2921- gis = self.gis
2922- cluster_distance = gis.cluster_distance
2923- cluster_threshold = gis.cluster_threshold
2924- record = self.record
2925- marker = self.marker
2926- projection = self.projection
2927-
2928- name_safe = self._name_safe(record.name)
2929- # Mandatory attributes
2930- output = {
2931- "name": name_safe,
2932- "url": record.url,
2933- "marker_image": marker.image,
2934- "marker_height": marker.height,
2935- "marker_width": marker.width,
2936+ table_name = "gis_layer_geojson"
2937+ js_array = "S3.gis.layers_geojson"
2938+
2939+ class SubLayer(MultiRecordLayer.SubLayer):
2940+ def as_dict(self):
2941+ # Mandatory attributes
2942+ output = {
2943+ "name": self.safe_name,
2944+ "url": self.url,
2945 }
2946-
2947- # Attributes which are defaulted client-side if not set
2948- if projection.epsg != 4326:
2949- output["projection"] = projection.epsg
2950- if not record.visible:
2951- output["visibility"] = False
2952- if record.opacity != 1:
2953- output["opacity"] = "%.1f" % record.opacity
2954- if record.cluster_distance != cluster_distance:
2955- output["cluster_distance"] = record.cluster_distance
2956- if record.cluster_threshold != cluster_threshold:
2957- output["cluster_threshold"] = record.cluster_threshold
2958-
2959- return output
2960+ self.marker.add_attributes_to_output(output)
2961+
2962+ # Attributes which are defaulted client-side if not set
2963+ projection = self.projection
2964+ if projection.epsg != 4326:
2965+ output["projection"] = projection.epsg
2966+ self.setup_visibility_and_opacity(output)
2967+ self.setup_clustering(output)
2968+
2969+ return output
2970
2971 # -----------------------------------------------------------------------------
2972-class GeoRSSLayer(Layer):
2973+class GeoRSSLayer(MultiRecordLayer):
2974 """ GeoRSS Layer from Catalogue """
2975- def __init__(self, gis, record=None):
2976- db = current.db
2977- tablename = "gis_layer_georss"
2978- try:
2979- table = db[tablename]
2980- except:
2981- current.manager.load(tablename)
2982- table = db[tablename]
2983-
2984- self.gis = gis
2985- self.table = table
2986- self.cachetable = db.gis_cache
2987- self.js_array = "S3.gis.layers_georss"
2988- self.record = record
2989- self._refresh()
2990- self.scripts = []
2991-
2992- def as_dict(self):
2993- gis = self.gis
2994- cluster_distance = gis.cluster_distance
2995- cluster_threshold = gis.cluster_threshold
2996- record = self.record
2997- marker = self.marker
2998-
2999- db = current.db
3000- request = current.request
3001- public_url = gis.public_url
3002- cachetable = self.cachetable
3003-
3004- url = record.url
3005- # Check to see if we should Download layer to the cache
3006- download = True
3007- query = (cachetable.source == url)
3008- cached = db(query).select(cachetable.modified_on,
3009- limitby=(0, 1)).first()
3010- refresh = record.refresh or 900 # 15 minutes set if we have no data (legacy DB)
3011- if cached:
3012- modified_on = cached.modified_on
3013- cutoff = modified_on + timedelta(seconds=refresh)
3014- if request.utcnow < cutoff:
3015- download = False
3016- if download:
3017- # Download layer to the Cache
3018- # @ToDo: Call directly without going via HTTP
3019- # s3mgr = current.manager
3020- # @ToDo: Make this async by using Celery (also use this for the refresh time)
3021- fields = ""
3022- if record.data:
3023- fields = "&data_field=%s" % record.data
3024- if record.image:
3025- fields = "%s&image_field=%s" % (fields, record.image)
3026- _url = "%s%s/update.georss?fetchurl=%s%s" % (public_url,
3027- URL(c="gis", f="cache_feed"),
3028- url,
3029- fields)
3030- try:
3031- # @ToDo: Need to commit to not have DB locked with SQLite?
3032- fetch(_url)
3033- if cached:
3034- # Clear old records which are no longer active
3035- query = (cachetable.source == url) & \
3036- (cachetable.modified_on < cutoff)
3037- db(query).delete()
3038- except:
3039- # Feed down
3040- if cached:
3041- # Use cached copy
3042- # Should we Update timestamp to prevent every
3043- # subsequent request attempting the download?
3044- #query = (cachetable.source == url)
3045- #db(query).update(modified_on=request.utcnow)
3046- pass
3047- else:
3048- # No cached copy available - skip layer
3049- return
3050-
3051- name_safe = self._name_safe(record.name)
3052-
3053- # Pass the GeoJSON URL to the client
3054- # Filter to the source of this feed
3055- url = "%s.geojson?cache.source=%s" % (URL(c="gis", f="cache_feed"),
3056- url)
3057-
3058- # Mandatory attributes
3059- output = {
3060- "name": name_safe,
3061- "url": url,
3062- "marker_image": marker.image,
3063- "marker_height": marker.height,
3064- "marker_width": marker.width,
3065- }
3066-
3067- # Attributes which are defaulted client-side if not set
3068- if record.refresh != 900:
3069- output["refresh"] = record.refresh
3070- if not record.visible:
3071- output["visibility"] = False
3072- if record.opacity != 1:
3073- output["opacity"] = "%.1f" % record.opacity
3074- if record.cluster_distance != cluster_distance:
3075- output["cluster_distance"] = record.cluster_distance
3076- if record.cluster_threshold != cluster_threshold:
3077- output["cluster_threshold"] = record.cluster_threshold
3078-
3079- return output
3080+ table_name = "gis_layer_georss"
3081+ js_array = "S3.gis.layers_georss"
3082+
3083+ def __init__(self, gis):
3084+ super(GeoRSSLayer, self).__init__(gis)
3085+ GeoRSSLayer.SubLayer.cachetable = current.db.gis_cache
3086+
3087+ class SubLayer(MultiRecordLayer.SubLayer):
3088+ def as_dict(self):
3089+ gis = self.gis
3090+
3091+ db = current.db
3092+ request = current.request
3093+ public_url = gis.public_url
3094+ cachetable = self.cachetable
3095+
3096+ url = self.url
3097+ # Check to see if we should Download layer to the cache
3098+ download = True
3099+ query = (cachetable.source == url)
3100+ existing_cached_copy = db(query).select(cachetable.modified_on,
3101+ limitby=(0, 1)).first()
3102+ refresh = self.refresh or 900 # 15 minutes set if we have no data (legacy DB)
3103+ if existing_cached_copy:
3104+ modified_on = existing_cached_copy.modified_on
3105+ cutoff = modified_on + timedelta(seconds=refresh)
3106+ if request.utcnow < cutoff:
3107+ download = False
3108+ if download:
3109+ # Download layer to the Cache
3110+ # @ToDo: Call directly without going via HTTP
3111+ # s3mgr = current.manager
3112+ # @ToDo: Make this async by using Celery (also use this for the refresh time)
3113+ fields = ""
3114+ if self.data:
3115+ fields = "&data_field=%s" % self.data
3116+ if self.image:
3117+ fields = "%s&image_field=%s" % (fields, self.image)
3118+ _url = "%s%s/update.georss?fetchurl=%s%s" % (public_url,
3119+ URL(c="gis", f="cache_feed"),
3120+ url,
3121+ fields)
3122+ try:
3123+ # @ToDo: Need to commit to not have DB locked with SQLite?
3124+ fetch(_url)
3125+ if existing_cached_copy:
3126+                    # Clear old records which are no longer active
3127+ query = (cachetable.source == url) & \
3128+ (cachetable.modified_on < cutoff)
3129+ db(query).delete()
3130+ except:
3131+ # Feed down
3132+ if existing_cached_copy:
3133+ # Use cached copy
3134+ # Should we Update timestamp to prevent every
3135+ # subsequent request attempting the download?
3136+ #query = (cachetable.source == url)
3137+ #db(query).update(modified_on=request.utcnow)
3138+ pass
3139+ else:
3140+ raise Exception("No cached copy available - skip layer")
3141+
3142+ name_safe = self.safe_name
3143+
3144+ # Pass the GeoJSON URL to the client
3145+ # Filter to the source of this feed
3146+ url = "%s.geojson?cache.source=%s" % (URL(c="gis", f="cache_feed"),
3147+ url)
3148+
3149+ # Mandatory attributes
3150+ output = {
3151+ "name": name_safe,
3152+ "url": url,
3153+ }
3154+ self.marker.add_attributes_to_output(output)
3155+
3156+ # Attributes which are defaulted client-side if not set
3157+ if self.refresh != 900:
3158+ output["refresh"] = self.refresh
3159+ self.setup_visibility_and_opacity(output)
3160+ self.setup_clustering(output)
3161+
3162+ return output
3163
3164 # -----------------------------------------------------------------------------
3165-class GPXLayer(Layer):
3166+class GPXLayer(MultiRecordLayer):
3167 """ GPX Layer from Catalogue """
3168- def __init__(self, gis, record=None):
3169- db = current.db
3170- tablename = "gis_layer_gpx"
3171- try:
3172- table = db[tablename]
3173- except:
3174- current.manager.load(tablename)
3175- table = db[tablename]
3176-
3177- self.gis = gis
3178- self.table = table
3179- self.js_array = "S3.gis.layers_gpx"
3180- self.record = record
3181- self._refresh()
3182- self.scripts = []
3183-
3184- def as_dict(self):
3185- gis = self.gis
3186- cluster_distance = gis.cluster_distance
3187- cluster_threshold = gis.cluster_threshold
3188- record = self.record
3189- marker = self.marker
3190-
3191- name_safe = self._name_safe(record.name)
3192-
3193- request = current.request
3194- url = URL(c="default", f="download",
3195- args=record.track)
3196-
3197- # Mandatory attributes
3198- output = {
3199- "name": name_safe,
3200- "url": url,
3201- "marker_image": marker.image,
3202- "marker_height": marker.height,
3203- "marker_width": marker.width,
3204- }
3205-
3206- # Attributes which are defaulted client-side if not set
3207- if not record.waypoints:
3208- output["waypoints"] = False
3209- if not record.tracks:
3210- output["tracks"] = False
3211- if not record.routes:
3212- output["routes"] = False
3213- if not record.visible:
3214- output["visibility"] = False
3215- if record.opacity != 1:
3216- output["opacity"] = "%.1f" % record.opacity
3217- if record.cluster_distance != cluster_distance:
3218- output["cluster_distance"] = record.cluster_distance
3219- if record.cluster_threshold != cluster_threshold:
3220- output["cluster_threshold"] = record.cluster_threshold
3221-
3222- return output
3223-
3224- def refresh(self):
3225- " Refresh the attributes of the Layer "
3226- gis = self.gis
3227- request = current.request
3228- record = self.record
3229- table = self.table
3230- if "marker_id" in table:
3231- self.set_marker()
3232- if "projection_id" in table:
3233- self.set_projection()
3234- if record:
3235- self.url = "%s/%s" % (URL(c="default", f="download"),
3236- record.track)
3237- else:
3238- self.url = None
3239+ table_name = "gis_layer_gpx"
3240+ js_array = "S3.gis.layers_gpx"
3241+
3242+ def __init__(self, gis):
3243+ super(GPXLayer, self).__init__(gis)
3244+
3245+# if record:
3246+# self.url = "%s/%s" % (URL(c="default", f="download"),
3247+# record.track)
3248+# else:
3249+# self.url = None
3250+
3251+ class SubLayer(MultiRecordLayer.SubLayer):
3252+ def as_dict(self):
3253+ gis = self.gis
3254+ request = current.request
3255+
3256+ url = URL(c="default", f="download",
3257+ args=self.track)
3258+
3259+ # Mandatory attributes
3260+ output = {
3261+ "name": self.safe_name,
3262+ "url": url,
3263+ }
3264+ self.marker.add_attributes_to_output(output)
3265+ self.add_attributes_if_not_default(
3266+ output,
3267+ waypoints = (self.waypoints, (True,)),
3268+ tracks = (self.tracks, (True,)),
3269+ routes = (self.routes, (True,)),
3270+ )
3271+ self.setup_visibility_and_opacity(output)
3272+ self.setup_clustering(output)
3273+ return output
3274
3275 # -----------------------------------------------------------------------------
3276-class KMLLayer(Layer):
3277+class KMLLayer(MultiRecordLayer):
3278 """ KML Layer from Catalogue """
3279- def __init__(self, gis, record=None):
3280- db = current.db
3281- tablename = "gis_layer_kml"
3282- try:
3283- table = db[tablename]
3284- except:
3285- current.manager.load(tablename)
3286- table = db[tablename]
3287-
3288- self.gis = gis
3289- self.table = table
3290- # @ToDo: Migrate to gis_cache
3291- self.cachetable = db.gis_cache2
3292- self.js_array = "S3.gis.layers_kml"
3293- self.record = record
3294- self._refresh()
3295- self.scripts = []
3296-
3297+ table_name = "gis_layer_kml"
3298+ js_array = "S3.gis.layers_kml"
3299+
3300+ def __init__(self, gis):
3301+ super(KMLLayer, self).__init__(gis)
3302+
3303+        # Set up the KML cache; this should be done once per request
3304 # Can we cache downloaded KML feeds?
3305 # Needed for unzipping & filtering as well
3306 # @ToDo: Should we move this folder to static to speed up access to cached content?
3307@@ -4676,263 +4521,172 @@
3308 else:
3309 try:
3310 os.mkdir(cachepath)
3311+ except OSError, os_error:
3312+ self.gis.debug(
3313+ "GIS: KML layers cannot be cached: %s %s" % (
3314+ cachepath,
3315+ os_error
3316+ )
3317+ )
3318+ cacheable = False
3319+ else:
3320 cacheable = True
3321- except:
3322- cacheable = False
3323- self.cacheable = cacheable
3324- self.cachepath = cachepath
3325-
3326- def as_dict(self):
3327- gis = self.gis
3328- cluster_distance = gis.cluster_distance
3329- cluster_threshold = gis.cluster_threshold
3330- record = self.record
3331- marker = self.marker
3332-
3333- T = current.T
3334- db = current.db
3335- request = current.request
3336- response = current.response
3337- public_url = gis.public_url
3338- cacheable = self.cacheable
3339- cachepath = self.cachepath
3340- cachetable = self.cachetable
3341-
3342- name = record.name
3343- name_safe = self._name_safe(record.name)
3344-
3345- if cacheable:
3346- _name = urllib2.quote(name)
3347- _name = _name.replace("%", "_")
3348- filename = "%s.file.%s.kml" % (cachetable._tablename,
3349- _name)
3350-
3351- # Should we download a fresh copy of the source file?
3352- download = True
3353- query = (cachetable.name == name)
3354- cached = db(query).select(cachetable.modified_on,
3355- limitby=(0, 1)).first()
3356- refresh = record.refresh or 900 # 15 minutes set if we have no data (legacy DB)
3357- if cached:
3358- modified_on = cached.modified_on
3359- cutoff = modified_on + timedelta(seconds=refresh)
3360- if request.utcnow < cutoff:
3361- download = False
3362-
3363- if download:
3364- # Download file
3365- if response.s3.tasks_active():
3366- # Async call
3367- db.task_scheduled.insert(name="download_kml_%s" % uuid.uuid4(),
3368- func="download_kml",
3369- args=json.dumps([record.id,
3370- filename]))
3371- else:
3372- # Sync call
3373- gis.download_kml(record.id, filename)
3374+ # @ToDo: Migrate to gis_cache
3375+ KMLLayer.cachetable = current.db.gis_cache2
3376+ KMLLayer.cacheable = cacheable
3377+ KMLLayer.cachepath = cachepath
3378+
3379+
3380+ class SubLayer(MultiRecordLayer.SubLayer):
3381+ def as_dict(self):
3382+ gis = self.gis
3383+
3384+ T = current.T
3385+ db = current.db
3386+ request = current.request
3387+ response = current.response
3388+ public_url = gis.public_url
3389+
3390+ cachetable = KMLLayer.cachetable
3391+ cacheable = KMLLayer.cacheable
3392+ cachepath = KMLLayer.cachepath
3393+
3394+ name = self.name
3395+ if cacheable:
3396+ _name = urllib2.quote(name)
3397+ _name = _name.replace("%", "_")
3398+ filename = "%s.file.%s.kml" % (cachetable._tablename,
3399+ _name)
3400+
3401+ # Should we download a fresh copy of the source file?
3402+ download = True
3403+ query = (cachetable.name == name)
3404+ cached = db(query).select(cachetable.modified_on,
3405+ limitby=(0, 1)).first()
3406+ refresh = self.refresh or 900 # 15 minutes set if we have no data (legacy DB)
3407 if cached:
3408- db(query).update(modified_on=request.utcnow)
3409- else:
3410- cachetable.insert(name=name, file=filename)
3411-
3412- url = URL(c="default", f="download",
3413- args=[filename])
3414- else:
3415- # No caching possible (e.g. GAE), display file direct from remote (using Proxy)
3416- # (Requires OpenLayers.Layer.KML to be available)
3417- url = record.url
3418-
3419- # Mandatory attributes
3420- output = {
3421- "name": name_safe,
3422- "url": url,
3423- "marker_image": marker.image,
3424- "marker_height": marker.height,
3425- "marker_width": marker.width,
3426- }
3427-
3428- # Attributes which are defaulted client-side if not set
3429- if record.title and record.title != "name":
3430- output["title"] = record.title
3431- if record.body and record.body != "description":
3432- output["body"] = record.body
3433- if record.refresh != 900:
3434- output["refresh"] = record.refresh
3435- if not record.visible:
3436- output["visibility"] = False
3437- if record.opacity != 1:
3438- output["opacity"] = "%.1f" % record.opacity
3439- if record.cluster_distance != cluster_distance:
3440- output["cluster_distance"] = record.cluster_distance
3441- if record.cluster_threshold != cluster_threshold:
3442- output["cluster_threshold"] = record.cluster_threshold
3443-
3444- return output
3445+ modified_on = cached.modified_on
3446+ cutoff = modified_on + timedelta(seconds=refresh)
3447+ if request.utcnow < cutoff:
3448+ download = False
3449+
3450+ if download:
3451+ # Download file
3452+ if response.s3.tasks_active():
3453+ # Async call
3454+ db.task_scheduled.insert(name="download_kml_%s" % uuid.uuid4(),
3455+ func="download_kml",
3456+                                                 args=json.dumps([self.id,
3457+ filename]))
3458+ else:
3459+ # Sync call
3460+ gis.download_kml(self.id, filename)
3461+ if cached:
3462+ db(query).update(modified_on=request.utcnow)
3463+ else:
3464+ cachetable.insert(name=name, file=filename)
3465+
3466+ url = URL(r=request, c="default", f="download",
3467+ args=[filename])
3468+ else:
3469+ # No caching possible (e.g. GAE), display file direct from remote (using Proxy)
3470+ # (Requires OpenLayers.Layer.KML to be available)
3471+ url = self.url
3472+
3473+ output = dict(
3474+ name = self.safe_name,
3475+ url = url,
3476+ )
3477+ self.add_attributes_if_not_default(
3478+ output,
3479+ title = (self.title, ("name", None, "")),
3480+ body = (self.body, ("description", None)),
3481+ refresh = (self.refresh, (900,)),
3482+ )
3483+ self.setup_visibility_and_opacity(output)
3484+ self.setup_clustering(output)
3485+ self.marker.add_attributes_to_output(output)
3486+ return output
3487
3488 # -----------------------------------------------------------------------------
3489-class TMSLayer(Layer):
3490+class TMSLayer(MultiRecordLayer):
3491 """ TMS Layer from Catalogue """
3492- def __init__(self, gis, record=None):
3493- db = current.db
3494- tablename = "gis_layer_tms"
3495- try:
3496- table = db[tablename]
3497- except:
3498- current.manager.load(tablename)
3499- table = db[tablename]
3500-
3501- self.gis = gis
3502- self.table = table
3503- self.js_array = "S3.gis.layers_tms"
3504- self.record = record
3505- self._refresh()
3506- self.scripts = []
3507-
3508- def as_dict(self):
3509- #gis = self.gis
3510- record = self.record
3511-
3512- name_safe = self._name_safe(record.name)
3513-
3514- # Mandatory attributes
3515- output = {
3516- "name": name_safe,
3517- "url": record.url,
3518- "layername": record.layername
3519- }
3520-
3521- # Attributes which are defaulted client-side if not set
3522- if record.url2:
3523- output["url2"] = record.url2
3524- if record.url3:
3525- output["url3"] = record.url3
3526- if record.img_format != "png":
3527- output["format"] = record.img_format
3528- if record.zoom_levels != 9:
3529- output["zoomLevels"] = record.zoom_levels
3530- if record.attribution:
3531- output["attribution"] = record.attribution
3532-
3533- return output
3534+ table_name = "gis_layer_tms"
3535+ js_array = "S3.gis.layers_tms"
3536+
3537+ class SubLayer(MultiRecordLayer.SubLayer):
3538+ def as_dict(self):
3539+ output = {
3540+ "name": self.safe_name,
3541+ "url": self.url,
3542+ "layername": self.layername
3543+ }
3544+ self.add_attributes_if_not_default(
3545+ output,
3546+ url2 = (self.url2, (None,)),
3547+ url3 = (self.url3, (None,)),
3548+ format = (self.img_format, ("png", None)),
3549+ zoomLevels = (self.zoom_levels, (9,)),
3550+ attribution = (self.attribution, (None,)),
3551+ )
3552+ return output
3553
3554 # -----------------------------------------------------------------------------
3555-class WFSLayer(Layer):
3556+class WFSLayer(MultiRecordLayer):
3557 """ WFS Layer from Catalogue """
3558- def __init__(self, gis, record=None):
3559- db = current.db
3560- tablename = "gis_layer_wfs"
3561- try:
3562- table = db[tablename]
3563- except:
3564- current.manager.load(tablename)
3565- table = db[tablename]
3566-
3567- self.gis = gis
3568- self.table = table
3569- self.js_array = "S3.gis.layers_wfs"
3570- self.record = record
3571- self._refresh()
3572- self.scripts = []
3573-
3574- def as_dict(self):
3575- gis = self.gis
3576- cluster_distance = gis.cluster_distance
3577- cluster_threshold = gis.cluster_threshold
3578- record = self.record
3579- projection = self.projection
3580-
3581- name_safe = self._name_safe(record.name)
3582-
3583- # Mandatory attributes
3584- output = {
3585- "name": name_safe,
3586- "url": record.url,
3587- "title": record.title,
3588- "featureType": record.featureType,
3589- "featureNS": record.featureNS,
3590- "schema": record.wfs_schema,
3591+ table_name = "gis_layer_wfs"
3592+ js_array = "S3.gis.layers_wfs"
3593+
3594+ class SubLayer(MultiRecordLayer.SubLayer):
3595+ def as_dict(self):
3596+ output = dict(
3597+ name = self.safe_name,
3598+ url = self.url,
3599+ title = self.title,
3600+ featureType = self.featureType,
3601+ featureNS = self.featureNS,
3602+ schema = self.wfs_schema,
3603+ )
3604+ self.add_attributes_if_not_default(
3605+ output,
3606+ version = (self.version, ("1.1.0",)),
3607+ geometryName = (self.geometryName, ("the_geom",)),
3608+ styleField = (self.style_field, (None,)),
3609+ styleValues = (self.style_values, ("{}", None)),
3610+ projection = (self.projection.epsg, (4326,)),
3611 #editable
3612- }
3613-
3614- # Attributes which are defaulted client-side if not set
3615- if record.version != "1.1.0":
3616- output["version"] = record.version
3617- if record.geometryName != "the_geom":
3618- output["geometryName"] = record.geometryName
3619- if record.style_field:
3620- output["styleField"] = record.style_field
3621- if record.style_values and record.style_values != "{}":
3622- output["styleValues"] = record.style_values
3623- if projection.epsg != 4326:
3624- output["projection"] = projection.epsg
3625- if not record.visible:
3626- output["visibility"] = False
3627- if record.opacity != 1:
3628- output["opacity"] = "%.1f" % record.opacity
3629- if record.cluster_distance != cluster_distance:
3630- output["cluster_distance"] = record.cluster_distance
3631- if record.cluster_threshold != cluster_threshold:
3632- output["cluster_threshold"] = record.cluster_threshold
3633-
3634- return output
3635+ )
3636+ self.setup_visibility_and_opacity(output)
3637+ self.setup_clustering(output)
3638+ return output
3639
3640 # -----------------------------------------------------------------------------
3641-class WMSLayer(Layer):
3642+class WMSLayer(MultiRecordLayer):
3643 """ WMS Layer from Catalogue """
3644- def __init__(self, gis, record=None):
3645- db = current.db
3646- tablename = "gis_layer_wms"
3647- try:
3648- table = db[tablename]
3649- except:
3650- current.manager.load(tablename)
3651- table = db[tablename]
3652-
3653- self.gis = gis
3654- self.table = table
3655- self.js_array = "S3.gis.layers_wms"
3656- self.record = record
3657- self._refresh()
3658- self.scripts = []
3659-
3660- def as_dict(self):
3661- #gis = self.gis
3662- record = self.record
3663-
3664- name_safe = self._name_safe(record.name)
3665-
3666- # Mandatory attributes
3667- output = {
3668- "name": name_safe,
3669- "url": record.url,
3670- "layers": record.layers
3671- }
3672-
3673- # Attributes which are defaulted client-side if not set
3674- if not record.visible:
3675- output["visibility"] = False
3676- if record.opacity != 1:
3677- output["opacity"] = "%.1f" % record.opacity
3678- if not record.transparent:
3679- output["transparent"] = False
3680- if record.version != "1.1.1":
3681- output["version"] = record.version
3682- if record.img_format != "image/png":
3683- output["format"] = record.img_format
3684- if record.map:
3685- output["map"] = record.map
3686- if record.style:
3687- output["style"] = record.style
3688- if record.bgcolor:
3689- output["bgcolor"] = record.bgcolor
3690- if record.tiled:
3691- output["tiled"] = True
3692- if record.buffer:
3693- output["buffer"] = record.buffer
3694- if record.base:
3695- output["base"] = True
3696-
3697- return output
3698+ js_array = "S3.gis.layers_wms"
3699+ table_name = "gis_layer_wms"
3700+
3701+ class SubLayer(MultiRecordLayer.SubLayer):
3702+ def as_dict(self):
3703+ output = dict(
3704+ name = self.safe_name,
3705+ url = self.url,
3706+ layers = self.layers
3707+ )
3708+ self.add_attributes_if_not_default(
3709+ output,
3710+ transparent = (self.transparent, (True,)),
3711+ version = (self.version, ("1.1.1",)),
3712+ format = (self.img_format, ("image/png",)),
3713+ map = (self.map, (None,)),
3714+ buffer = (self.buffer, (0,)),
3715+ base = (self.base, (False,)),
3716+ style = (self.style, (None,)),
3717+ bgcolor = (self.bgcolor, (None,)),
3718+ tiled = (self.tiled, (False, )),
3719+ )
3720+ self.setup_visibility_and_opacity(output)
3721+ return output
3722
3723 # =============================================================================
3724 class S3MAP(S3Method):
3725@@ -5041,4 +4795,3 @@
3726 return page
3727
3728 # END =========================================================================
3729-
3730
3731=== added file 'modules/test_utils/AddedRole.py'
3732--- modules/test_utils/AddedRole.py 1970-01-01 00:00:00 +0000
3733+++ modules/test_utils/AddedRole.py 2011-09-05 13:55:13 +0000
3734@@ -0,0 +1,25 @@
3735+
3736+class AddedRole(object):
3737+ """Adds a role and removes it at the end of a test no matter what happens.
3738+
3739+ """
3740+ def __init__(self, session, role):
3741+ self.role = role
3742+ self.session = session
3743+
3744+ def __enter__(self):
3745+ roles = self.session.s3.roles
3746+ role = self.role
3747+        if role not in roles:
3748+ roles.append(role)
3749+
3750+ def __exit__(self, type, value, traceback):
3751+ session_s3_roles = self.session.s3.roles
3752+ roles = list(session_s3_roles)
3753+ for i in range(len(roles)):
3754+ session_s3_roles.pop(0)
3755+ add_role = session_s3_roles.append
3756+        own_role = self.role
3757+        for role in roles:
3758+            if role is not own_role:
3759+                add_role(role)
3760
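The enter/exit cleanup pattern that `AddedRole` relies on can be sketched outside web2py. This is a hedged Python 3 re-sketch, not the module above: a plain list stands in for `session.s3.roles`, and the restore step simply removes the added role rather than rebuilding the whole list:

```python
class AddedRole(object):
    """Add a role to a shared role list on entry and remove it again
    on exit, even if the guarded test body raises."""

    def __init__(self, roles, role):
        self.roles = roles  # stand-in for session.s3.roles
        self.role = role

    def __enter__(self):
        if self.role not in self.roles:
            self.roles.append(self.role)

    def __exit__(self, exc_type, exc_value, traceback):
        # remove every copy of the role we may have added
        while self.role in self.roles:
            self.roles.remove(self.role)


roles = ["authenticated"]
try:
    with AddedRole(roles, "admin"):
        assert "admin" in roles
        raise RuntimeError("simulated test failure")
except RuntimeError:
    pass
# cleanup ran despite the exception
assert roles == ["authenticated"]
```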
3761=== added file 'modules/test_utils/Change.py'
3762--- modules/test_utils/Change.py 1970-01-01 00:00:00 +0000
3763+++ modules/test_utils/Change.py 2011-09-05 13:55:13 +0000
3764@@ -0,0 +1,25 @@
3765+
3766+_UNDEFINED = object()
3767+
3768+class Change(object):
3769+ def __init__(self, target, changes):
3770+ self.changes = changes
3771+ self.target = target
3772+
3773+ def __enter__(self):
3774+ assert not hasattr(self, "originals")
3775+ self.originals = originals = {}
3776+ # store originals and set new values
3777+ for name, value in self.changes.iteritems():
3778+ originals[name] = getattr(self.target, name, _UNDEFINED)
3779+ setattr(self.target, name, value)
3780+
3781+ def __exit__(self, type, value, traceback):
3782+ # restore originals
3783+ for name, value in self.originals.iteritems():
3784+ if value is _UNDEFINED:
3785+                delattr(self.target, name)
3786+ else:
3787+ setattr(self.target, name, value)
3788+ del self.originals
3789+
3790
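`Change` follows the same store-then-restore pattern for attribute patching. A self-contained Python 3 sketch (`dict.items` in place of `iteritems`, and a hypothetical `Settings` object as the target) shows both restoring overridden attributes and deleting ones that did not exist before the block:

```python
_UNDEFINED = object()  # sentinel: attribute was absent before patching

class Change(object):
    """Temporarily set attributes on `target`, restoring (or deleting)
    them when the with-block exits."""

    def __init__(self, target, changes):
        self.target = target
        self.changes = changes

    def __enter__(self):
        self.originals = {}
        for name, value in self.changes.items():
            self.originals[name] = getattr(self.target, name, _UNDEFINED)
            setattr(self.target, name, value)

    def __exit__(self, exc_type, exc_value, traceback):
        for name, value in self.originals.items():
            if value is _UNDEFINED:
                # attribute did not exist before: remove it again
                delattr(self.target, name)
            else:
                setattr(self.target, name, value)
        del self.originals


class Settings(object):
    debug = False

settings = Settings()
with Change(settings, {"debug": True, "extra": 1}):
    assert settings.debug is True and settings.extra == 1
# restored afterwards; 'extra' is gone again
assert settings.debug is False
assert not hasattr(settings, "extra")
```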
3791=== added file 'modules/test_utils/ExpectSessionWarning.py'
3792--- modules/test_utils/ExpectSessionWarning.py 1970-01-01 00:00:00 +0000
3793+++ modules/test_utils/ExpectSessionWarning.py 2011-09-05 13:55:13 +0000
3794@@ -0,0 +1,14 @@
3795+
3796+class ExpectSessionWarning(object):
3797+ def __init__(self, session, warning):
3798+ self.warning = warning
3799+ self.session = session
3800+
3801+ def __enter__(self):
3802+ session = self.session
3803+ warnings = []
3804+ self.warnings = session.warning = warnings
3805+
3806+ def __exit__(self, type, value, traceback):
3807+ if type is None:
3808+ assert self.warning in self.warnings
3809
3810=== added file 'modules/test_utils/ExpectedException.py'
3811--- modules/test_utils/ExpectedException.py 1970-01-01 00:00:00 +0000
3812+++ modules/test_utils/ExpectedException.py 2011-09-05 13:55:13 +0000
3813@@ -0,0 +1,13 @@
3814+
3815+class ExpectedException(object):
3816+ def __init__(self, ExceptionClass):
3817+ self.ExceptionClass = ExceptionClass
3818+
3819+ def __enter__(self):
3820+ pass
3821+
3822+ def __exit__(self, type, value, traceback):
3823+ return issubclass(type, self.ExceptionClass), (
3824+ "%s not raised" % self.ExceptionClass.__name__
3825+ )
3826+
3827
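The intended contract of `ExpectedException` (fail the test when the expected exception is not raised, swallow it when it is) can be sketched as a Python 3 context manager; `unittest`'s own `assertRaises` context manager provides the same behaviour:

```python
class ExpectedException(object):
    """Assert that the with-block raises ExceptionClass (or a subclass)."""

    def __init__(self, ExceptionClass):
        self.ExceptionClass = ExceptionClass

    def __enter__(self):
        pass

    def __exit__(self, exc_type, exc_value, traceback):
        # fail when nothing (or the wrong thing) was raised
        assert exc_type is not None and issubclass(
            exc_type, self.ExceptionClass
        ), "%s not raised" % self.ExceptionClass.__name__
        return True  # suppress the expected exception


with ExpectedException(ValueError):
    int("not a number")  # raises ValueError, which is suppressed

try:
    with ExpectedException(ValueError):
        pass  # nothing raised
except AssertionError as error:
    message = str(error)
assert message == "ValueError not raised"
```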
3828=== added file 'modules/test_utils/InsertedRecord.py'
3829--- modules/test_utils/InsertedRecord.py 1970-01-01 00:00:00 +0000
3830+++ modules/test_utils/InsertedRecord.py 2011-09-05 13:55:13 +0000
3831@@ -0,0 +1,19 @@
3832+
3833+from clear_table import clear_table
3834+
3835+class InsertedRecord(object):
3836+ """Inserts and commits a record and removes it at the end of
3837+ a test no matter what happens.
3838+
3839+ """
3840+ def __init__(self, db, table, data):
3841+ self.db = db
3842+ self.table = table
3843+ self.data = data
3844+
3845+ def __enter__(self):
3846+ self.table.insert(**self.data)
3847+ self.db.commit()
3848+
3849+ def __exit__(self, type, value, traceback):
3850+ clear_table(self.db, self.table)
3851
3852=== added file 'modules/test_utils/Web2pyNosePlugin.py'
3853--- modules/test_utils/Web2pyNosePlugin.py 1970-01-01 00:00:00 +0000
3854+++ modules/test_utils/Web2pyNosePlugin.py 2011-09-05 13:55:13 +0000
3855@@ -0,0 +1,106 @@
3856+
3857+import nose
3858+import re
3859+from itertools import imap
3860+import unittest
3861+
3862+class Web2pyNosePlugin(nose.plugins.base.Plugin):
3863+ # see: http://somethingaboutorange.com/mrl/projects/nose/0.11.1/plugins/writing.html
3864+
3865+ """This plugin is designed to give the web2py environment to the tests.
3866+ """
3867+ score = 0
3868+ # always enable as this plugin can only
3869+ # be selected by running this script
3870+ enabled = True
3871+
3872+ def __init__(
3873+ self,
3874+ application_name,
3875+ environment,
3876+ directory_pattern,
3877+ test_folders
3878+ ):
3879+ super(Web2pyNosePlugin, self).__init__()
3880+ self.application_name = application_name
3881+ self.environment = environment
3882+ self.directory_pattern = directory_pattern
3883+ self.test_folders = test_folders
3884+
3885+ def options(self, parser, env):
3886+ """Register command line options"""
3887+ pass
3888+
3889+ def wantDirectory(self, dirname):
3890+ return bool(re.search(self.directory_pattern, dirname))
3891+
3892+ def wantFile(self, file_name):
3894+ return file_name.endswith(".py") and any(
3895+ imap(file_name.__contains__, self.test_folders)
3896+ )
3897+
3898+ def wantModule(self, module):
3899+ return False
3900+
3901+ def loadTestsFromName(self, file_name, discovered):
3902+ """Sets up the unit-testing environment.
3903+
3904+ This involves loading modules as if by web2py.
3905+ Also we must have a test database.
3906+
3907+ If testing controllers, tests need to set up the request themselves.
3908+
3909+ """
3910+ if file_name.endswith(".py"):
3911+
3912+ # Is it possible that the module could load
3913+ # other code that is using the original db?
3914+
3915+ test_globals = self.environment
3916+
3917+ module_globals = dict(self.environment)
3918+ # execfile is used because it doesn't create a module
3919+ # or load the module from sys.modules if it exists.
3920+
3921+ execfile(file_name, module_globals)
3922+
3923+ import inspect
3924+ # we have to return something, otherwise nose
3925+ # will let others have a go, and they won't pass
3926+ # in the web2py environment, so we'll get errors
3927+ tests = []
3928+
3929+ for name, thing in module_globals.iteritems():
3930+ if (
3931+ # don't bother with globally imported things
3932+ name not in test_globals \
3933+ # unless they have been overridden
3934+ or test_globals[name] is not thing
3935+ ):
3936+ if (
3937+ isinstance(thing, type)
3938+ and issubclass(thing, unittest.TestCase)
3939+ ):
3940+ # look for test methods
3941+ for member_name in dir(thing):
3942+ if member_name.startswith("test"):
3943+ if callable(getattr(thing, member_name)):
3944+ tests.append(thing(member_name))
3945+ elif (
3946+ name.startswith("test")
3947+ or name.startswith("Test")
3948+ ):
3949+ if inspect.isfunction(thing):
3950+ function = thing
3951+ function_name = name
3952+ # things coming from execfile have no module
3953+ #print file_name, function_name, function.__module__
3954+ if function.__module__ in ("__main__", None):
3955+ tests.append(
3956+ nose.case.FunctionTestCase(function)
3957+ )
3958+ return tests
3959+ else:
3960+ return []
3961+
3962
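The core of `loadTestsFromName` above is: execute the test file's source in a copy of a shared environment, then collect only the `unittest.TestCase` subclasses and `test*` functions that the file itself defined. A hedged Python 3 sketch of that collection step, using `exec` on an inline source string instead of `execfile` on a real file, and `unittest.FunctionTestCase` in place of nose's:

```python
import inspect
import unittest

SOURCE = '''
import unittest

class TestMath(unittest.TestCase):
    def test_add(self):
        self.assertEqual(1 + 1, 2)

def test_standalone():
    assert True

helper = 42  # not collected
'''

def collect_tests(source, environment):
    """Exec `source` in a copy of `environment`, then collect TestCase
    methods and test_* functions, mirroring the plugin's loop."""
    module_globals = dict(environment)
    exec(compile(source, "<test-module>", "exec"), module_globals)
    tests = []
    for name, thing in module_globals.items():
        if name in environment and environment[name] is thing:
            continue  # skip names inherited unchanged from the environment
        if isinstance(thing, type) and issubclass(thing, unittest.TestCase):
            for member_name in dir(thing):
                if member_name.startswith("test") and callable(
                    getattr(thing, member_name)
                ):
                    tests.append(thing(member_name))
        elif name.startswith(("test", "Test")) and inspect.isfunction(thing):
            tests.append(unittest.FunctionTestCase(thing))
    return tests

tests = collect_tests(SOURCE, {})
result = unittest.TestResult()
for test in tests:
    test.run(result)
assert result.testsRun == 2
assert not result.failures and not result.errors
```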
3963=== modified file 'modules/test_utils/__init__.py'
3964--- modules/test_utils/__init__.py 2011-08-11 19:25:52 +0000
3965+++ modules/test_utils/__init__.py 2011-09-05 13:55:13 +0000
3966@@ -1,2 +1,12 @@
3967
3968-from compare_lines import compare_lines
3969\ No newline at end of file
3970+from compare_lines import compare_lines
3971+from clear_table import clear_table
3972+from find_JSON_format_data_structure import *
3973+from Web2pyNosePlugin import Web2pyNosePlugin
3974+from assert_equal import *
3975+
3976+from InsertedRecord import *
3977+from AddedRole import *
3978+from ExpectedException import *
3979+from Change import *
3980+from ExpectSessionWarning import ExpectSessionWarning
3981
3982=== added file 'modules/test_utils/assert_equal.py'
3983--- modules/test_utils/assert_equal.py 1970-01-01 00:00:00 +0000
3984+++ modules/test_utils/assert_equal.py 2011-09-05 13:55:13 +0000
3985@@ -0,0 +1,60 @@
3986+
3987+
3988+def assert_same_type(expected, actual):
3989+ assert isinstance(actual, type(expected)), "%s vs. %s" % (type(expected), type(actual))
3990+
3991+def assert_equal_sequence(expected, actual):
3992+ assert len(expected) == len(actual), "length should be %i, not %i:\n%s" % (
3993+ len(expected), len(actual), actual
3994+ )
3995+ for i in range(len(expected)):
3996+ try:
3997+ assert_equal(expected[i], actual[i])
3998+ except AssertionError, assertion_error:
3999+            raise AssertionError(
4000+                "[%i] %s" % (i, str(assertion_error))
4001+            )
4002+
4003+def assert_equal_set(expected, actual):
4004+ missing = expected.difference(actual)
4005+ assert not missing, "Missing: %s" % ", ".join(missing)
4006+
4007+ extra = actual.difference(expected)
4008+ assert not extra, "Extra: %s" % ", ".join(extra)
4009+
4010+def assert_equal_dict(expected, actual):
4011+ assert_equal_set(
4012+ expected = set(expected.keys()),
4013+ actual = set(actual.keys())
4014+ )
4015+ for key in expected.iterkeys():
4016+ try:
4017+ assert_equal(expected[key], actual[key])
4018+ except AssertionError, assertion_error:
4019+ raise AssertionError(
4020+ "[%s] %s" % (
4021+ key,
4022+ str(assertion_error),
4023+ )
4024+ )
4025+
4026+def assert_equal_value(expected, actual):
4027+ assert expected == actual, "%s != %s" % (expected, actual)
4028+
4029+_compare_procs = {
4030+ list: assert_equal_sequence,
4031+ int: assert_equal_value,
4032+ float: assert_equal_value,
4033+ str: assert_equal_value,
4034+    unicode: assert_equal_value,
4035+ dict: assert_equal_dict,
4036+ set: assert_equal_set,
4037+}
4038+
4039+def assert_equal(expected, actual):
4040+ assert_same_type(expected, actual)
4041+ compare_proc = _compare_procs.get(type(expected), assert_equal_value)
4042+ compare_proc(
4043+ expected,
4044+ actual
4045+ )
4046
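The dispatch-by-type comparison in `assert_equal` can be condensed into one recursive function that also threads the mismatch location through the error message, the way `assert_equal_dict` prefixes `[key]`. A minimal Python 3 sketch, illustrative rather than a drop-in replacement for the module above:

```python
def assert_equal(expected, actual, path=""):
    """Recursive, type-strict equality check that reports where in a
    nested structure a mismatch occurred."""
    assert isinstance(actual, type(expected)), "%s%s vs. %s" % (
        path, type(expected).__name__, type(actual).__name__
    )
    if isinstance(expected, dict):
        assert set(expected) == set(actual), "%skeys differ" % path
        for key in expected:
            assert_equal(expected[key], actual[key], "%s[%r]" % (path, key))
    elif isinstance(expected, (list, tuple)):
        assert len(expected) == len(actual), "%slength differs" % path
        for i, (e, a) in enumerate(zip(expected, actual)):
            assert_equal(e, a, "%s[%i]" % (path, i))
    else:
        assert expected == actual, "%s: %r != %r" % (path, expected, actual)


assert_equal({"a": [1, 2]}, {"a": [1, 2]})  # passes silently
try:
    assert_equal({"a": [1, 2]}, {"a": [1, 3]})
except AssertionError as error:
    message = str(error)
assert message == "['a'][1]: 2 != 3"
```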
4047=== added file 'modules/test_utils/clear_table.py'
4048--- modules/test_utils/clear_table.py 1970-01-01 00:00:00 +0000
4049+++ modules/test_utils/clear_table.py 2011-09-05 13:55:13 +0000
4050@@ -0,0 +1,4 @@
4051+
4052+def clear_table(db, db_table):
4053+ db(db_table.id).delete()
4054+ db.commit()
4055
4056=== added file 'modules/test_utils/find_JSON_format_data_structure.py'
4057--- modules/test_utils/find_JSON_format_data_structure.py 1970-01-01 00:00:00 +0000
4058+++ modules/test_utils/find_JSON_format_data_structure.py 2011-09-05 13:55:13 +0000
4059@@ -0,0 +1,54 @@
4060+
4061+import re
4062+from json.decoder import JSONDecoder
4063+
4064+__all__ = (
4065+ "not_found",
4066+ "cannot_parse_JSON",
4067+ "find_JSON_format_data_structure"
4068+)
4069+
4070+def not_found(name, string):
4071+ raise Exception(
4072+ u"Cannot find %s in %s" % (name, string)
4073+ )
4074+
4075+def cannot_parse_JSON(string):
4076+ raise Exception(
4077+ u"Cannot parse JSON: '%s'" % string
4078+ )
4079+
4080+def find_JSON_format_data_structure(
4081+ string,
4082+ name,
4083+ found,
4084+ not_found,
4085+ cannot_parse_JSON
4086+):
4087+ """Finds a named JSON-format data structure in the string.
4088+
4089+ The name can be any string.
4090+ The pattern "name = " will be looked for in the string,
4091+ and the data structure following it parsed and returned as a python
4092+ data structure.
4093+ """
4094+ try:
4095+ name_start = string.index(name)
4096+ except ValueError:
4097+ not_found(name, string)
4098+ else:
4099+ name_length = len(name)
4100+ name_end = name_start + name_length
4101+
4102+ _, remaining = re.Scanner([
4103+ (r"\s*=\s*", lambda scanner, token: None)
4104+ ]).scan(
4105+ string[name_end:]
4106+ )
4107+
4108+ try:
4109+ data, end_position = JSONDecoder().raw_decode(remaining)
4110+ except ValueError, value_error:
4111+ cannot_parse_JSON(remaining)
4112+ else:
4113+ found(data)
4114
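The load-bearing trick in `find_JSON_format_data_structure` is `JSONDecoder.raw_decode`, which parses one complete JSON value from the start of a string and reports where it ended, ignoring whatever code follows. That lets a data structure be pulled out of the middle of generated JavaScript. A condensed Python 3 sketch (the function name and sample script are illustrative):

```python
import json
import re

def find_json_assignment(source, name):
    """Return the JSON value assigned to `name` inside script text.

    Looks for "name = " and decodes what follows; raw_decode stops at
    the end of the first complete JSON value, ignoring trailing code.
    """
    match = re.search(re.escape(name) + r"\s*=\s*", source)
    if match is None:
        raise ValueError("Cannot find %s" % name)
    # raw_decode does not skip leading whitespace, but the regex
    # above has already consumed it along with the "=" sign
    value, _end = json.JSONDecoder().raw_decode(source[match.end():])
    return value


script = 'var x = 1; S3.gis.layers_wms = [{"name": "Test", "layers": 2}]; init();'
data = find_json_assignment(script, "S3.gis.layers_wms")
assert data == [{"name": "Test", "layers": 2}]
```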
4115=== modified file 'modules/test_utils/run.py'
4116--- modules/test_utils/run.py 2011-08-11 19:25:52 +0000
4117+++ modules/test_utils/run.py 2011-09-05 13:55:13 +0000
4118@@ -1,5 +1,8 @@
4119 #!python
4120
4121+# capture web2py environment before doing anything else
4122+
4123+web2py_environment = dict(globals())
4124
4125 __doc__ = """This script is run from the nose command in the
4126 application being tested:
4127@@ -8,249 +11,81 @@
4128
4129 python2.6 ./applications/eden/tests/nose.py <nose arguments>
4130
4131+web2py runs a file which:
4132+1. Sets up a plugin. This plugin registers itself so nose can use it.
4133+2. Runs nose programmatically, giving it the plugin.
4134+nose loads the tests via the plugin.
4135+When the plugin loads the tests, it injects the web2py environment.
4136+
4137 """
4138
4139-########################################
4140-#
4141-# web2py runs a file which:
4142-# 1. Sets up a plugin. This plugin registers itself so nose can use it.
4143-# 2. Runs nose programmatically giving it the plugin
4144-#
4145-# nose loads the tests via the plugin.
4146-# when the plugin loads the tests, it injects the web2py environment.
4147-#
4148-########################################
4149-
4150-
4151-# @ToDo:
4152-# in particular, haven't checked that db is being replaced with the test_db
4153-# (work in progress)
4154-
4155-# using --import_models is OK for a single test, but for running a suite,
4156-# we probably need the test_env function to set up an environment
4157-
4158-# @ToDo: Provide an ignore_warns mode so that we can tackle ERRORs 1st
4159-# but FAILs often give us clues that help us fix ERRORs
4160-# fixing an error might itself cause a failure, but not be spotted until later.
4161-
4162-def use_test_db(db):
4163- print "Creating test database..."
4164- try:
4165- test_db = use_test_db.db
4166- except AttributeError:
4167- # create test database by copying db
4168- test_db_name = "sqlite://testing.sqlite"
4169- print "Copying db tables into test database..."
4170- test_db = DAL(test_db_name) # Name and location of the test DB file
4171- # Copy tables!
4172- for tablename in db.tables:
4173- table_copy = []
4174- for data in db[tablename]:
4175- table_copy.append(
4176- copy.copy(data)
4177- )
4178- test_db.define_table(tablename, *table_copy)
4179- use_test_db.db = test_db
4180- return test_db
4181-
4182+import sys
4183+
4184+from types import ModuleType
4185 import nose
4186-
4187+import glob
4188+import os.path
4189+import os
4190 import copy
4191-def test_env(
4192- imports,
4193- application,
4194- controller,
4195- function,
4196- folder,
4197- globals,
4198- create_test_db = use_test_db,
4199- _module_root = "applications",
4200-):
4201- """Sets up the unit-testing environment.
4202-
4203- This involves loading modules as if by web2py.
4204- Also we must make a test database.
4205-
4206- """
4207- from gluon.globals import Request
4208- globals["Request"] = Request
4209- request = Request()
4210- request.application = application
4211- request.controller = controller
4212- request.function = function
4213- request.folder = folder
4214-
4215- globals["request"] = request
4216-
4217- test_db = db#create_test_db(db)
4218-
4219+from gluon.globals import Request
4220+import unittest
4221+
4222+
4223+def load_module(application_relative_module_path):
4224 import os
4225- for import_path, names in imports:
4226- #print import_path
4227- #module = {"db": test_db}
4228- #module.update(globals)
4229- module = dict(globals)
4230- path_components = [_module_root] + import_path.split(".")
4231- file_path = os.path.join(*path_components)+".py"
4232- # execfile is used because it doesn't create a module
4233- # and doesn't load the module if it exists.
4234- execfile(file_path, module)
4235- if names is "*":
4236- globals.update(module)
4237- else:
4238- for name in names:
4239- globals[name] = module[name]
4240-
4241-# -----------------------
4242-
4243-import os.path
4244+ web2py_relative_module_path = ".".join((
4245+ "applications", request.application, application_relative_module_path
4246+ ))
4247+ imported_module = __import__(web2py_relative_module_path)
4248+ for step in web2py_relative_module_path.split(".")[1:]:
4249+ imported_module = getattr(imported_module, step)
4250+ return imported_module
4251+
4252+web2py_environment["load_module"] = load_module
4253+web2py_env_module = ModuleType("web2py_env")
4254+web2py_env_module.__dict__.update(web2py_environment)
4255+sys.modules["web2py_env"] = web2py_env_module
4256+
4257
4258 application_name = request.application
4259-
4260-model_files_pattern = os.path.join("applications",application_name,"models","*.py")
4261-import glob
4262-
4263-test_env(
4264- globals = globals(),
4265- application = application_name,
4266- controller = "controller",
4267- function = "function",
4268- folder = "folder",
4269- imports = [
4270- (application_name+".models.%s" % (module_name[len(model_files_pattern)-4:-3]), "*")
4271- for module_name in glob.glob(model_files_pattern)
4272- ]
4273-)
4274-
4275-import sys
4276-log = sys.stderr.write
4277-
4278-import unittest
4279-from itertools import imap
4280+application_folder_path = os.path.join("applications",application_name)
4281+
4282+application = application_name
4283+controller = "controller"
4284+function = "function"
4285+folder = os.path.join(os.getcwd(), "applications", application_name)
4286+
4287+web2py_environment["Request"] = Request
4288+request = Request()
4289+request.application = application
4290+request.controller = controller
4291+request.function = function
4292+request.folder = folder
4293+
4294+web2py_environment["request"] = request
4295+current.request = request
4296+
4297+controller_configuration = OrderedDict()
4298+for controller_name in ["default"]+glob.glob(
4299+ os.path.join(application_folder_path, "controllers", "*.py")
4300+):
4301+ controller_configuration[controller_name] = Storage(
4302+ name_nice = controller_name,
4303+ description = controller_name,
4304+ restricted = False,
4305+ module_type = 0
4306+ )
4307+
4308+current.deployment_settings.modules = controller_configuration
4309
4310 test_folders = set()
4311-
4312-class Web2pyNosePlugin(nose.plugins.base.Plugin):
4313- # see: http://somethingaboutorange.com/mrl/projects/nose/0.11.1/plugins/writing.html
4314-
4315- """This plugin is designed to give the web2py environment to the tests.
4316- """
4317- score = 0
4318- # always enable as this plugin can only
4319- # be selected by running this script
4320- enabled = True
4321-
4322- def __init__(
4323- self,
4324- application_name,
4325- environment,
4326- create_test_db,
4327- directory_pattern
4328- ):
4329- super(Web2pyNosePlugin, self).__init__()
4330- self.application_name = application_name
4331- self.environment = dict(
4332- db = db#create_test_db(db)
4333- )
4334- self.environment.update(environment)
4335- self.directory_pattern = directory_pattern
4336-
4337- def options(self, parser, env):
4338- """Register command line options"""
4339- return
4340- parser.add_option(
4341- "--web2py",
4342- dest="web2py",
4343- action="append",
4344- metavar="ATTR",
4345- help="Use web2py environment when loading tests"
4346- )
4347-
4348- def wantDirectory(self, dirname):
4349- return bool(re.search(self.directory_pattern, dirname))
4350-
4351- def wantFile(self, file_name):
4352- return file_name.endswith(".py") and any(
4353- imap(file_name.__contains__, test_folders)
4354- )
4355-
4356- def wantModule(self, module):
4357- return False
4358-
4359- def loadTestsFromName(self, file_name, discovered):
4360- """Sets up the unit-testing environment.
4361-
4362- This involves loading modules as if by web2py.
4363- Also we must have a test database.
4364-
4365- If testing controllers, tests need to set up the request themselves.
4366-
4367- """
4368- if file_name.endswith(".py"):
4369-# log(file_name)
4370-
4371- # assert 0, file_name
4372- # stop
4373-
4374- # Is it possible that the module could load
4375- # other code that is using the original db?
4376-
4377- test_globals = self.environment
4378-
4379- # execfile is used because it doesn't create a module
4380- # and doesn't load the module into sys.modules if it exists.
4381- module_globals = dict(self.environment)
4382- execfile(file_name, module_globals)
4383-
4384- import inspect
4385- # we have to return something, otherwise nose
4386- # will let others have a go, and they won't pass
4387- # in the web2py environment, so we'll get errors
4388- tests = []
4389-
4390- for name, thing in module_globals.iteritems():
4391- if (
4392- # don't bother with globally imported things
4393- name not in test_globals \
4394- # unless they have been overridden
4395- or test_globals[name] is not thing
4396- ):
4397- if (
4398- isinstance(thing, type)
4399- and issubclass(thing, unittest.TestCase)
4400- ):
4401- # look for test methods
4402- for member_name in dir(thing):
4403- if member_name.startswith("test"):
4404- if callable(getattr(thing, member_name)):
4405- tests.append(thing(member_name))
4406- elif (
4407- name.startswith("test")
4408- or name.startswith("Test")
4409- ):
4410- if inspect.isfunction(thing):
4411- function = thing
4412- function_name = name
4413- # things coming from execfile have no module
4414- #print file_name, function_name, function.__module__
4415- if function.__module__ is None:
4416- tests.append(
4417- nose.case.FunctionTestCase(function)
4418- )
4419- return tests
4420- else:
4421- return
4422-
4423-import re
4424-
4425-argv = [
4426- #"--verbosity=2",
4427- #"--debug=nose"
4428-]
4429+argv = []
4430
4431 # folder in which tests are kept
4432 # non-option arguments (test paths) are made relative to this
4433-test_root = os.path.join("applications", application_name, "tests", "unit_tests")
4434+test_root = os.path.join(application_folder_path, "tests", "unit_tests")
4435+
4436+current_working_directory = os.getcwd()
4437
4438 disallowed_options = {}
4439 disallowed_options["-w"] = disallowed_options["--where"] = (
4440@@ -275,10 +110,10 @@
4441 argv.append(arg)
4442 else:
4443 test_path = arg
4444- test_fuller_path = os.path.join(test_root, test_path)
4445- test_folders.add(test_fuller_path)
4446- if not os.path.exists(test_fuller_path):
4447- print "\n", test_fuller_path, "not found"
4448+ test_folder_fuller_path = os.path.join(test_root, test_path)
4449+ test_folders.add(test_folder_fuller_path)
4450+ if not os.path.exists(test_folder_fuller_path):
4451+ print "\n", test_folder_fuller_path, "not found"
4452 #sys.exit(1)
4453
4454 # test paths in command line aren't passed, just added to test_folders
4455@@ -293,14 +128,15 @@
4456
4457 sys.argv[1:] = argv
4458
4459+test_utils = local_import("test_utils")
4460+
4461 nose.main(
4462 # seems at least this version of nose ignores passed in argv
4463 # argv = argv,
4464 addplugins = nose.plugins.PluginManager([
4465- Web2pyNosePlugin(
4466+ test_utils.Web2pyNosePlugin(
4467 application_name,
4468- globals(),
4469- use_test_db,
4470+ web2py_environment,
4471 re.compile(
4472 re.escape(os.path.sep).join(
4473 (
4474@@ -311,7 +147,8 @@
4475 "[^","]*)*)?)?)?$"
4476 )
4477 )
4478- )
4479+ ),
4480+ test_folders
4481 )
4482 ])
4483 )
4484
4485=== added file 'private/prepopulate/default/tasks.cfg'
4486--- private/prepopulate/default/tasks.cfg 1970-01-01 00:00:00 +0000
4487+++ private/prepopulate/default/tasks.cfg 2011-09-05 13:55:13 +0000
4488@@ -0,0 +1,18 @@
4489+##########################################################################
4490+# Add a list of csv files to import into the system
4491+# The list of import files is a comma-separated list as follows:
4492+# "prefix","tablename","csv file name","stylesheet"
4493+#
4494+# The csv file is assumed to be in the same directory as this file
4495+# The style sheet is assumed to be in either of the following directories:
4496+# static/format/s3csv/"prefix"/
4497+# static/format/s3csv/
4498+#
4499+# For details on how to import data into the system see the following:
4500+# zzz_1st_run
4501+# s3Tools::S3BulkImporter
4502+##########################################################################
4503+"supply","catalog_item","DefaultItems.csv","supply_items.xsl"
4504+"supply","catalog_item","StandardItems.csv","supply_items.xsl"
4505+"hrm","skill","DefaultSkillList.csv","skill.xsl"
4506+"hrm","competency_rating",DefaultSkillCompetency.csv,competency_rating.xsl
4507\ No newline at end of file
4508
4509=== removed file 'private/prepopulate/default/tasks.cfg'
4510--- private/prepopulate/default/tasks.cfg 2011-08-17 15:13:43 +0000
4511+++ private/prepopulate/default/tasks.cfg 1970-01-01 00:00:00 +0000
4512@@ -1,18 +0,0 @@
4513-##########################################################################
4514-# Add a list of csv file to import into the system
4515-# the list of import file sis a comma separated list as follows:
4516-# "prefix","tablename","csv file name","stylesheet"
4517-#
4518-# The csv file is assumed to be in the same directory as this file
4519-# The style sheet is assumed to be in either of the following directories:
4520-# static/format/s3csv/"prefix"/
4521-# static/format/s3csv/
4522-#
4523-# For details on how to import data into the system see the following:
4524-# zzz_1st_run
4525-# s3Tools::S3BulkImporter
4526-##########################################################################
4527-"supply","catalog_item","DefaultItems.csv","supply_items.xsl"
4528-"supply","catalog_item","StandardItems.csv","supply_items.xsl"
4529-"hrm","skill","DefaultSkillList.csv","skill.xsl"
4530-"hrm","competency_rating",DefaultSkillCompetency.csv,competency_rating.xsl
4531
4532=== modified file 'static/scripts/S3/s3.gis.climate.js'
4533--- static/scripts/S3/s3.gis.climate.js 2011-06-15 09:47:54 +0000
4534+++ static/scripts/S3/s3.gis.climate.js 2011-09-05 13:55:13 +0000
4535@@ -9,154 +9,372 @@
4536 }
4537 }
4538
4539-
4540 ClimateDataMapPlugin = function (config) {
4541- var self = this // so no this-clobbering
4542- self.data_type_option_names = config.data_type_option_names
4543- self.parameter_names = config.parameter_names
4544- self.projected_option_type_names = config.projected_option_type_names
4545- self.year_min = config.year_min
4546- self.year_max = config.year_max
4547-
4548- self.data_type_label = config.data_type_label
4549- self.projected_option_type_label = config.projected_option_type_label
4550-
4551- self.setup = function () {
4552- var graphic = new OpenLayers.Layer.Image(
4553- 'Test Data',
4554- '/eden/climate/climate_image_overlay',
4555- new OpenLayers.Bounds(8900000, 3020000, 9850000, 3580000),
4556-// new OpenLayers.Bounds(-180, -88.759, 180, 88.759),
4557- new OpenLayers.Size(249, 139),
4558- {
4559- // numZoomLevels: 3,
4560- isBaseLayer:false,
4561- opacity: 0.5,
4562- transparent:true
4563- }
4564- );
4565- graphic.events.on({
4566- loadstart: function() {
4567- OpenLayers.Console.log("loadstart");
4568- },
4569- loadend: function() {
4570- OpenLayers.Console.log("loadend");
4571- }
4572- });
4573- map.addLayer(graphic);
4574+ var plugin = this // let's be explicit!
4575+ plugin.data_type_option_names = config.data_type_option_names
4576+ plugin.parameter_names = config.parameter_names
4577+ plugin.year_min = config.year_min
4578+ plugin.year_max = config.year_max
4579+
4580+ plugin.data_type_label = config.data_type_label
4581+ plugin.overlay_data_URL = config.overlay_data_URL
4582+ plugin.chart_URL = config.chart_URL
4583+    config = null // drop the reference; function parameters cannot be deleted
4584+
4585+ plugin.setup = function () {
4586+ var overlay_layer = plugin.overlay_layer = new OpenLayers.Layer.Vector(
4587+ 'Climate data map overlay',
4588+ {
4589+ isBaseLayer:false,
4590+                isBaseLayer: false
4591+ );
4592+ map.addLayer(overlay_layer);
4593+
4594+ // selection
4595+ OpenLayers.Feature.Vector.style['default']['strokeWidth'] = '2'
4596+ var selectCtrl = new OpenLayers.Control.SelectFeature(
4597+ overlay_layer,
4598+ {
4599+ clickout: true,
4600+ toggle: false,
4601+ multiple: false,
4602+ hover: false,
4603+ toggleKey: 'altKey',
4604+ multipleKey: 'shiftKey',
4605+ box: true,
4606+ onSelect: function (feature) {
4607+ feature.style.strokeColor = 'black'
4608+ feature.style.strokeDashstyle = 'dash'
4609+ overlay_layer.drawFeature(feature)
4610+ },
4611+ onUnselect: function (feature) {
4612+ feature.style.strokeColor = 'none'
4613+ overlay_layer.drawFeature(feature)
4614+ },
4615+ }
4616+ );
4617+
4618+ map.addControl(selectCtrl);
4619+
4620+ selectCtrl.activate();
4621 }
4622- self.addToMapWindow = function (items) {
4623- function toggle_projected_options() {
4624- $('#projected-options').toggle(
4625- $('#id_Projected').attr('checked') == 'checked'
4626+ plugin.addToMapWindow = function (items) {
4627+ var combo_box_size = {
4628+ width: 120,
4629+            height: 25
4630+ }
4631+
4632+ function make_combo_box(
4633+ data,
4634+ fieldLabel,
4635+ hiddenName
4636+ ) {
4637+ var options = []
4638+ each(
4639+ data,
4640+ function (option) {
4641+ options.push([option, option])
4642+ }
4643 )
4644- }
4645- var climate_data_type_options = [];
4646- each(
4647- self.data_type_option_names,
4648- function (option_name) {
4649- var radio_button = new Ext.form.Radio({
4650- name: "data-type",
4651- id: "id_%s" % option_name,
4652- boxLabel: option_name,
4653- checked: option_name == self.data_type_option_names[0],
4654- })
4655- radio_button.on({
4656- change: toggle_projected_options
4657- })
4658- climate_data_type_options.push(radio_button)
4659- }
4660- )
4661- var projected_options = [];
4662- each(
4663- self.projected_option_type_names,
4664- function (projected_option_type_name) {
4665- projected_options.push(
4666- new Ext.form.Radio({
4667- name: "projected-option-type",
4668- id: "id_%s" % projected_option_type_name,
4669- boxLabel: projected_option_type_name,
4670- })
4671- )
4672- }
4673- )
4674- var projected_options_widget = new Ext.form.FieldSet({
4675- title: self.projected_option_type_label,
4676- items: [
4677- new Ext.form.CheckboxGroup({
4678- items: projected_options,
4679- xtype: 'checkboxgroup',
4680- columns: 1
4681- })
4682- ]
4683- })
4684-
4685- var climate_data_type_options = new Ext.form.FieldSet({
4686- title: self.data_type_label,
4687- items: [
4688- new Ext.form.RadioGroup({
4689- items: climate_data_type_options,
4690- columns: 1,
4691- })
4692- ]
4693- })
4694-
4695- var parameter_options = [];
4696- each(
4697- self.parameter_names,
4698- function (parameter_name) {
4699- var checkbox = new Ext.form.Checkbox({
4700- name: parameter_name,
4701- id: "id_%s" % parameter_name,
4702- boxLabel: parameter_name,
4703- })
4704- parameter_options.push(checkbox)
4705- }
4706- )
4707-
4708- var parameters_widget = new Ext.form.FieldSet({
4709- title: "Parameters",
4710- items: [
4711- new Ext.form.CheckboxGroup({
4712- items: parameter_options,
4713- xtype: 'checkboxgroup',
4714- columns: 1
4715- })
4716- ]
4717- })
4718-
4719- var period_widget = new Ext.form.FieldSet({
4720- title: "Period",
4721- items: [
4722- new Ext.form.NumberField({
4723- fieldLabel: "From",
4724- minValue: self.year_min,
4725- maxValue: self.year_max,
4726- value: self.year_min
4727+ var combo_box = new Ext.form.ComboBox({
4728+ fieldLabel: fieldLabel,
4729+ hiddenName: hiddenName,
4730+ store: new Ext.data.SimpleStore({
4731+ fields: ['name', 'option'],
4732+ data: options
4733 }),
4734- new Ext.form.NumberField({
4735- fieldLabel: "To",
4736- minValue: self.year_min,
4737- maxValue: self.year_max,
4738- value: self.year_max
4739- })
4740- ]
4741- })
4742+ displayField: 'name',
4743+ typeAhead: true,
4744+ mode: 'local',
4745+ triggerAction: 'all',
4746+ emptyText:'Choose...',
4747+ selectOnFocus:true
4748+ })
4749+ combo_box.setSize(combo_box_size)
4750+ return combo_box
4751+ }
4752+ var data_type_combo_box = make_combo_box(
4753+ plugin.data_type_option_names,
4754+ 'Data type',
4755+ 'data_type'
4756+ )
4757+
4758+ var variable_combo_box = make_combo_box(
4759+ plugin.parameter_names,
4760+ 'Variable',
4761+ 'parameter'
4762+ )
4763+
4764+ var statistic_combo_box = make_combo_box(
4765+ ['Minimum','Maximum','Average'],
4766+ 'Aggregate values',
4767+ 'statistic'
4768+ )
4769+
4770 var climate_data_panel = new Ext.FormPanel({
4771 id: 'climate_data_panel',
4772- title: 'Climate data',
4773+ title: 'Climate data map overlay',
4774 collapsible: true,
4775 collapseMode: 'mini',
4776 items: [{
4777 region: 'center',
4778 items: [
4779- climate_data_type_options,
4780- projected_options_widget,
4781- parameters_widget,
4782- period_widget
4783+ new Ext.form.FieldSet({
4784+ title: 'Data set',
4785+ items: [
4786+ data_type_combo_box,
4787+ variable_combo_box
4788+ ]
4789+ }),
4790+ new Ext.form.FieldSet({
4791+ title: 'Period',
4792+ items: [
4793+ new Ext.form.NumberField({
4794+ fieldLabel: 'From',
4795+ name: 'from_date',
4796+ minValue: plugin.year_min,
4797+ maxValue: plugin.year_max,
4798+ value: plugin.year_min
4799+ }),
4800+ new Ext.form.NumberField({
4801+ fieldLabel: 'To',
4802+ name: 'to_date',
4803+ minValue: plugin.year_min,
4804+ maxValue: plugin.year_max,
4805+ value: plugin.year_max,
4806+ size: combo_box_size
4807+ })
4808+ ]
4809+ }),
4810+ new Ext.form.FieldSet({
4811+ title: 'Map overlay colours',
4812+ items: [
4813+ statistic_combo_box,
4814+                        statistic_combo_box
4815+ })
4816 ]
4817 }]
4818 });
4819+
4820+ var update_map_layer_button = new Ext.Button({
4821+ text: 'Update map layer',
4822+ disabled: true,
4823+ handler: function() {
4824+ plugin.overlay_layer.destroyFeatures()
4825+
4826+ // request new features
4827+ var form_values = climate_data_panel.getForm().getValues()
4828+
4829+ // add new features
4830+ $.ajax({
4831+ url: plugin.overlay_data_URL,
4832+ data: {
4833+ data_type: form_values.data_type,
4834+ statistic: form_values.statistic,
4835+ parameter: form_values.parameter,
4836+ from_date: form_values.from_date,
4837+ to_date: form_values.to_date
4838+ },
4839+ success: function(feature_data, status_code) {
4840+ function Vector(geometry, attributes, style) {
4841+ style.strokeColor= 'none'
4842+ style.fillOpacity= 0.8
4843+ style.strokeWidth = 1
4844+
4845+ return new OpenLayers.Feature.Vector(
4846+ geometry, attributes, style
4847+ )
4848+ }
4849+ function Polygon(components) {
4850+ return new OpenLayers.Geometry.Polygon(components)
4851+ }
4852+ function Point(lon, lat) {
4853+ var point = new OpenLayers.Geometry.Point(lat, lon)
4854+ return point.transform(
4855+ S3.gis.proj4326,
4856+ S3.gis.projection_current
4857+ )
4858+ }
4859+ function LinearRing(point_list) {
4860+ point_list.push(point_list[0])
4861+ return new OpenLayers.Geometry.LinearRing(point_list)
4862+ }
4863+ eval('var data = '+feature_data)
4864+ $('#id_key_min_value').html(data.min)
4865+ $('#id_key_max_value').html(data.max)
4866+ plugin.overlay_layer.addFeatures(data.features)
4867+ }
4868+ });
4869+ }
4870+ });
4871+
4872+ function enable_update_layer_button_if_form_complete(
4873+ box, record, index
4874+ ) {
4875+ if (
4876+ !!data_type_combo_box.getValue() &&
4877+ !!variable_combo_box.getValue() &&
4878+ !!statistic_combo_box.getValue()
4879+ ) {
4880+ update_map_layer_button.enable()
4881+ }
4882+ }
4883+ data_type_combo_box.on(
4884+ 'change',
4885+ enable_update_layer_button_if_form_complete
4886+ );
4887+ variable_combo_box.on(
4888+ 'change',
4889+ enable_update_layer_button_if_form_complete
4890+ );
4891+ statistic_combo_box.on(
4892+ 'change',
4893+ enable_update_layer_button_if_form_complete
4894+ );
4895+ climate_data_panel.addButton(update_map_layer_button)
4896+
4897+ var show_chart_button = new Ext.Button({
4898+ text: 'Show chart',
4899+ disabled: true,
4900+ handler: function() {
4901+ // create URL
4902+ var place_ids = []
4903+ each(
4904+ plugin.overlay_layer.selectedFeatures,
4905+ function (feature) {
4906+ place_ids.push(feature.data.id)
4907+ }
4908+ )
4909+ var form_values = climate_data_panel.getForm().getValues(),
4910+ data_type = form_values.data_type,
4911+ parameter = form_values.parameter,
4912+ from_date = form_values.from_date,
4913+ to_date = form_values.to_date,
4914+ place_ids = place_ids;
4915+
4916+ var spec = JSON.stringify({
4917+ data_type: data_type,
4918+ parameter: parameter,
4919+ from_date: from_date,
4920+ to_date: to_date,
4921+ place_ids: place_ids
4922+ })
4923+
4924+ var chart_name = [
4925+ data_type, parameter,
4926+ 'from', from_date,
4927+ 'to', to_date,
4928+ 'for', (
4929+ place_ids.length < 3?
4930+ 'places: '+ place_ids:
4931+ place_ids.length+' places'
4932+ )
4933+ ].join(' ')
4934+
4935+ // get hold of a chart manager instance
4936+ if (!plugin.chart_window) {
4937+ var chart_window = plugin.chart_window = window.open(
4938+ 'climate/chart_popup.html',
4939+ 'chart',
4940+ 'width=660,height=600,toolbar=0,resizable=0'
4941+ )
4942+ chart_window.onload = function () {
4943+ chart_window.chart_manager = new chart_window.ChartManager(plugin.chart_URL)
4944+ chart_window.chart_manager.addChartSpec(spec, chart_name)
4945+ }
4946+ chart_window.onbeforeunload = function () {
4947+ delete plugin.chart_window
4948+ }
4949+ } else {
4950+ // some duplication here:
4951+                plugin.chart_window.chart_manager.addChartSpec(spec, chart_name)
4952+ }
4953+
4954+ }
4955+ });
4956+
4957+
4958+ function enable_show_chart_button_if_data_and_variable_selected(
4959+ box, record, index
4960+ ) {
4961+ if (
4962+ !!data_type_combo_box.getValue() &&
4963+ !!variable_combo_box.getValue()
4964+ ) {
4965+ show_chart_button.enable()
4966+ }
4967+ }
4968+
4969+ data_type_combo_box.on(
4970+ 'change',
4971+ enable_show_chart_button_if_data_and_variable_selected
4972+ );
4973+
4974+ variable_combo_box.on(
4975+ 'change',
4976+ enable_show_chart_button_if_data_and_variable_selected
4977+ );
4978+
4979+
4980+
4981+ climate_data_panel.addButton(show_chart_button)
4982+
4983 items.push(climate_data_panel)
4984+
4985+ var key_panel = new Ext.Panel({
4986+ id: 'key_panel',
4987+ title: 'Key',
4988+ collapsible: true,
4989+ collapseMode: 'mini',
4990+ items: [
4991+ {
4992+ layout: {
4993+ type: 'table',
4994+                    columns: 3
4995+ },
4996+ defaults: {
4997+ width: '100%',
4998+ height: 20,
4999+ style: 'margin: 10px'
5000+ },
The diff has been truncated for viewing.