Merge lp:~mike-amy/sahana-eden/climate into lp:sahana-eden

Proposed by Mike Amy
Status: Merged
Merged at revision: 2720
Proposed branch: lp:~mike-amy/sahana-eden/climate
Merge into: lp:sahana-eden
Diff against target: 7720 lines (+4013/-2676) (has conflicts)
55 files modified
controllers/climate.py (+159/-96)
deployment-templates/models/000_config.py (+1/-1)
models/03_gis.py (+2/-0)
models/climate.py (+5/-0)
modules/ClimateDataPortal/MapPlugin.py (+430/-0)
modules/ClimateDataPortal/__init__.py (+201/-0)
modules/ClimateDataPortal/import_NetCDF_readings.py (+131/-0)
modules/ClimateDataPortal/import_stations.py (+53/-0)
modules/ClimateDataPortal/import_tabbed_readings.py (+154/-0)
modules/s3/s3gis.py (+785/-975)
modules/test_utils/AddedRole.py (+25/-0)
modules/test_utils/Change.py (+25/-0)
modules/test_utils/ExpectSessionWarning.py (+14/-0)
modules/test_utils/ExpectedException.py (+13/-0)
modules/test_utils/InsertedRecord.py (+19/-0)
modules/test_utils/Web2pyNosePlugin.py (+106/-0)
modules/test_utils/__init__.py (+11/-1)
modules/test_utils/assert_equal.py (+60/-0)
modules/test_utils/clear_table.py (+4/-0)
modules/test_utils/find_JSON_format_data_structure.py (+54/-0)
modules/test_utils/run.py (+76/-239)
private/prepopulate/default/tasks.cfg (+0/-18)
static/scripts/S3/s3.gis.climate.js (+352/-134)
tests/climate/__init__.py (+101/-0)
tests/nose.py (+2/-2)
tests/unit_tests/gis/basic_map.html (+0/-91)
tests/unit_tests/gis/bing.html (+0/-62)
tests/unit_tests/gis/feature_queries.html (+0/-52)
tests/unit_tests/gis/google.html (+0/-67)
tests/unit_tests/gis/map_with_layers.html (+0/-150)
tests/unit_tests/gis/s3gis.py (+0/-417)
tests/unit_tests/gis/testgis.cmd (+0/-8)
tests/unit_tests/gis/true_code_paths.html (+0/-302)
tests/unit_tests/gis/yahoo.html (+0/-61)
tests/unit_tests/modules/s3/s3gis/BingLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/CommonScripts.py (+47/-0)
tests/unit_tests/modules/s3/s3gis/FeatureLayer.py (+25/-0)
tests/unit_tests/modules/s3/s3gis/FeatureQueries.py (+44/-0)
tests/unit_tests/modules/s3/s3gis/GPXLayer.py (+29/-0)
tests/unit_tests/modules/s3/s3gis/GeoJSONLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/GeoRSSLayer.py (+74/-0)
tests/unit_tests/modules/s3/s3gis/GoogleLayer.py (+84/-0)
tests/unit_tests/modules/s3/s3gis/KMLLayer.py (+70/-0)
tests/unit_tests/modules/s3/s3gis/LayerFailures.py (+117/-0)
tests/unit_tests/modules/s3/s3gis/OpenStreetMap.py (+31/-0)
tests/unit_tests/modules/s3/s3gis/TMSLayer.py (+30/-0)
tests/unit_tests/modules/s3/s3gis/TrueCodePaths.py (+307/-0)
tests/unit_tests/modules/s3/s3gis/UserInterface.py (+7/-0)
tests/unit_tests/modules/s3/s3gis/WFSLayer.py (+31/-0)
tests/unit_tests/modules/s3/s3gis/WMSLayer.py (+28/-0)
tests/unit_tests/modules/s3/s3gis/YahooLayer.py (+48/-0)
tests/unit_tests/modules/s3/s3gis/__init__.py (+68/-0)
tests/unit_tests/modules/s3/s3rest.py (+6/-0)
tests/unit_tests/modules/test_utils/find_JSON_format_data_structure.py (+52/-0)
views/climate/chart_popup.html (+76/-0)
Text conflict in modules/s3/s3gis.py
To merge this branch: bzr merge lp:~mike-amy/sahana-eden/climate
Reviewer: Fran Boon
Review status: Pending
Review via email: mp+74202@code.launchpad.net

This proposal supersedes a proposal from 2011-09-05.

Description of the change

Reverted the error handling in gis.import_csv(), as requested in the review below.
Kind of scary that there are invisible rules about whether or not a session exists.

==========

Fixed the merge problems; thanks, flavour.

views/climate/chart_popup.html now uses {{include jquery.html}}.
Reverted the removal of 'private/prepopulate/default/tasks.cfg'.
I think that was some strange merge behaviour, as I'd added that file back when there was a bug around it being named '.txt'.

Removed the r=request from: # self.url = "%s/%s" % (URL(r=request, c="default", f="download"),
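For reference, a minimal before/after sketch of that call (assuming the usual web2py environment where URL and request are framework globals; the rest of the commented-out assignment is unchanged):

    # Before: the request object was passed explicitly.
    #   URL(r=request, c="default", f="download")
    # After: web2py resolves the current request itself.
    #   URL(c="default", f="download")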

Removed single quotes from the Python code and double quotes from the JavaScript code. Some double quotes have been left in the JavaScript, as the generated JavaScript data structures are tested by parsing them as JSON, which does not accept single-quoted strings.
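A quick illustration of why those double quotes stay (plain Python, nothing Eden-specific assumed):

    import json

    json.loads('{"name": "Rainfall mm"}')      # fine: JSON strings use double quotes
    try:
        json.loads("{'name': 'Rainfall mm'}")  # single-quoted strings are not valid JSON
    except ValueError as error:
        print(error)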

==========

Added a NetCDF importer for climate data and improved the performance of the overlay layer generation.
Fixed the tests and updated them to use the "current" global variable.
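A minimal sketch of the pattern the tests now follow (assuming web2py's gluon.current thread-local, which the framework populates with the per-request objects such as db, session and T):

    from gluon import current

    def debug_enabled():
        # Modules and tests read request-level state via "current"
        # instead of relying on globals passed in from the models.
        return bool(current.session.s3.debug)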

Revision history for this message
Fran Boon (flavour) wrote: Posted in a previous version of this proposal

views/climate/chart_popup.html
Please {{include jquery.html}} instead of hardcoding the jQuery version.
You are already a point version behind & we'll shortly move up to the newly released 1.6.3...

This should be reverted:
=== removed file 'private/prepopulate/default/tasks.cfg'

Can remove the r=request from:
+# self.url = "%s/%s" % (URL(r=request, c="default", f="download"),

I also see a very large number of single quotes in the Python code & some double quotes in the JavaScript... I'm happy to clean up the odd one that slips through, but this number seems excessive to me.

review: Needs Fixing
Revision history for this message
Fran Boon (flavour) wrote: Posted in a previous version of this proposal

Thanks for cleaning up.

self.debug isn't appropriate for use in gis.import_csv().
This function currently only gets used via the CLI.
session doesn't exist there, so session.s3.debug will raise an error.
I definitely want the old behaviour, i.e. print a warning message to the console but don't stop what you're doing (we can choose not to db.commit() if we consider the errors fatal). There is no point in adding messages to the session which will never be seen.
I'm not keen on this elsewhere either, but I'll accept it for now at least; in that location, though, it's clearly wrong.
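For reference, a console-only warning along the lines being asked for, mirroring the module-level s3_debug helper already in the diff (Python 2, so unicode is the builtin):

    import sys

    def s3_debug(message, value=None):
        # Write the warning to stderr only; safe from the CLI where no session exists,
        # and the caller can still decide not to db.commit() if the errors are fatal.
        output = u"S3 Debug: %s" % unicode(message)
        if value:
            output += u": %s" % unicode(value)
        sys.stderr.write(output + "\n")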

review: Needs Fixing

Preview Diff

1=== modified file 'controllers/climate.py'
2--- controllers/climate.py 2011-08-06 18:24:53 +0000
3+++ controllers/climate.py 2011-09-06 11:51:25 +0000
4@@ -2,92 +2,14 @@
5
6 module = "climate"
7
8-class ClimateDataMapPlugin(object):
9- def __init__(self,
10- data_type_option_names,
11- parameter_names,
12- projected_option_type_names,
13- year_min,
14- year_max
15- ):
16- self.data_type_option_names = data_type_option_names
17- self.parameter_names = parameter_names
18- self.projected_option_type_names = projected_option_type_names
19- self.year_min = year_min
20- self.year_max = year_max
21-
22- def extend_gis_map(self, add_javascript, add_configuration):
23- add_javascript("scripts/S3/s3.gis.climate.js")
24- add_configuration(
25- SCRIPT(
26- "\n".join((
27- "registerPlugin(",
28- " new ClimateDataMapPlugin("+
29- json.dumps(
30- dict(
31- self.__dict__,
32- data_type_label = str(T("Data Type")),
33- projected_option_type_label = str(T("Projection Type"))
34- ),
35- indent = 4
36- )+
37- ")",
38- ")",
39- ))
40- )
41- )
42-
43- def add_html(self, html):
44- statistics_widget = FIELDSET(
45- LEGEND("Statistics"),
46- UL(
47- _style="list-style:none",
48- *(
49- LI(
50- INPUT(
51- _type="radio",
52- _name="statistics",
53- _id="id_%s" % statistic,
54- ),
55- LABEL(
56- statistic,
57- _for="id_%s" % statistic,
58- )
59- )
60- for statistic in ["Mean", "Max", "Min"]
61- )
62- )
63- )
64-
65- html.append(
66- DIV(
67- FORM(
68- _id="controller",
69- *(
70- SCRIPT(
71- _type="text/javascript",
72- *["""
73- """]
74- ),
75- climate_data_type_widget,
76- parameters_widget,
77- statistics_widget,
78- period_widget
79- )
80- )
81- )
82- )
83-
84- def get_image_overlay(self, ):
85- from gluon.contenttype import contenttype
86- response.headers["Content-Type"] = contenttype(".png")
87- # @ToDo: Should be a file in static
88- return response.stream(open("/Users/mike/Desktop/red_wave.png"))
89-
90-climate_data_map_plugin = ClimateDataMapPlugin(
91- data_type_option_names = ["Observed", "Gridded", "Projected"],
92- parameter_names = ["Rainfall", "Temperature", "Wind", "Humidity", "Sunshine"],
93- projected_option_type_names = ["RC Model", "GC Model", "Scenario"],
94+ClimateDataPortal = local_import("ClimateDataPortal")
95+
96+sample_type_names = ClimateDataPortal.sample_codes.keys()
97+variable_names = ClimateDataPortal.tables.keys()
98+
99+map_plugin = ClimateDataPortal.MapPlugin(
100+ data_type_option_names = sample_type_names,
101+ parameter_names = variable_names,
102 year_max = datetime.date.today().year,
103 year_min = 1960,
104 )
105@@ -120,16 +42,16 @@
106 print_tool = {"url": print_service}
107 else:
108 print_tool = {}
109-
110+
111 map = gis.show_map(
112+ lat = 28.5,
113+ lon = 84.1,
114+ zoom = 7,
115 toolbar = False,
116- catalogue_toolbar=catalogue_toolbar, # T/F, top tabs toolbar
117+# catalogue_toolbar=catalogue_toolbar, # T/F, top tabs toolbar
118 wms_browser = wms_browser, # dict
119- catalogue_layers=catalogue_layers, # T/F
120- mouse_position = deployment_settings.get_gis_mouse_position(),
121- print_tool = print_tool,
122 plugins = [
123- climate_data_map_plugin
124+ map_plugin
125 ]
126 )
127
128@@ -138,7 +60,148 @@
129 module_name=module_name,
130 map=map
131 )
132-
133-def climate_image_overlay():
134- return climate_data_map_plugin.get_image_overlay()
135-
136+
137+month_names = dict(
138+ January=1,
139+ February=2,
140+ March=3,
141+ April=4,
142+ May=5,
143+ June=6,
144+ July=7,
145+ August=8,
146+ September=9,
147+ October=10,
148+ November=11,
149+ December=12
150+)
151+
152+for name, number in month_names.items():
153+ month_names[name[:3]] = number
154+for name, number in month_names.items():
155+ month_names[name.upper()] = number
156+for name, number in month_names.items():
157+ month_names[name.lower()] = number
158+
159+def convert_date(default_month):
160+ def converter(year_month):
161+ components = year_month.split("-")
162+ year = int(components[0])
163+ assert 1960 <= year, "year must be >= 1960"
164+
165+ try:
166+ month_value = components[1]
167+ except IndexError:
168+ month = default_month
169+ else:
170+ try:
171+ month = int(month_value)
172+ except TypeError:
173+ month = month_names[month_value]
174+
175+ assert 1 <= month <= 12, "month must be in range 1:12"
176+ return datetime.date(year, month, 1)
177+ return converter
178+
179+def one_of(options):
180+ def validator(choice):
181+ assert choice in options, "should be one of %s, not '%s'" % (
182+ options,
183+ choice
184+ )
185+ return choice
186+ return validator
187+
188+def climate_overlay_data():
189+ kwargs = dict(request.vars)
190+ kwargs["parameter"] = kwargs["parameter"].replace("+", " ")
191+
192+ arguments = {}
193+ errors = []
194+ for kwarg_name, converter in dict(
195+ data_type = one_of(sample_type_names),
196+ statistic = one_of(("Maximum", "Minimum", "Average")),
197+ parameter = one_of(variable_names),
198+ from_date = convert_date(default_month = 1),
199+ to_date = convert_date(default_month = 12),
200+ ).iteritems():
201+ try:
202+ value = kwargs.pop(kwarg_name)
203+ except KeyError:
204+ errors.append("%s missing" % kwarg_name)
205+ else:
206+ try:
207+ arguments[kwarg_name] = converter(value)
208+ except TypeError:
209+ errors.append("%s is wrong type" % kwarg_name)
210+ except AssertionError, assertion_error:
211+ errors.append("%s: %s" % (kwarg_name, assertion_error))
212+ if kwargs:
213+ errors.append("Unexpected arguments: %s" % kwargs.keys())
214+
215+ if errors:
216+ raise HTTP(500, "<br />".join(errors))
217+ else:
218+ import gluon.contenttype
219+ data_path = map_plugin.get_overlay_data(
220+ env = Storage(globals()),
221+ **arguments
222+ )
223+ return response.stream(
224+ open(data_path,"rb"),
225+ chunk_size=4096
226+ )
227+
228+def list_of(converter):
229+ def convert_list(choices):
230+ return map(converter, choices)
231+ return convert_list
232+
233+def climate_chart():
234+ kwargs = dict(request.vars)
235+ import simplejson as JSON
236+ specs = JSON.loads(kwargs.pop("spec"))
237+
238+ checked_specs = []
239+ for spec in specs:
240+ arguments = {}
241+ errors = []
242+ for name, converter in dict(
243+ data_type = one_of(sample_type_names),
244+ parameter = one_of(variable_names),
245+ from_date = convert_date(default_month = 1),
246+ to_date = convert_date(default_month = 12),
247+ place_ids = list_of(int)
248+ ).iteritems():
249+ try:
250+ value = spec.pop(name)
251+ except KeyError:
252+ errors.append("%s missing" % name)
253+ else:
254+ try:
255+ arguments[name] = converter(value)
256+ except TypeError:
257+ errors.append("%s is wrong type" % name)
258+ except AssertionError, assertion_error:
259+ errors.append("%s: %s" % (name, assertion_error))
260+ if spec:
261+ errors.append("Unexpected arguments: %s" % spec.keys())
262+ checked_specs.append(arguments)
263+
264+ if errors:
265+ raise HTTP(500, "<br />".join(errors))
266+ else:
267+ import gluon.contenttype
268+ response.headers["Content-Type"] = gluon.contenttype.contenttype(".png")
269+ data_image_file_path = map_plugin.render_plots(
270+ env = Storage(globals()),
271+ specs = checked_specs
272+ )
273+ return response.stream(
274+ open(data_image_file_path,"rb"),
275+ chunk_size=4096
276+ )
277+
278+def chart_popup():
279+ return {}
280+
281
282=== modified file 'deployment-templates/models/000_config.py'
283--- deployment-templates/models/000_config.py 2011-08-25 09:17:02 +0000
284+++ deployment-templates/models/000_config.py 2011-09-06 11:51:25 +0000
285@@ -249,7 +249,7 @@
286 strict_hierarchy = False,
287 # Should all specific locations (e.g. addresses, waypoints) be required to
288 # link to where they are in the location hierarchy?
289- location_parent_required = False,
290+ location_parent_required = False
291 )
292 # Set this if there will be multiple areas in which work is being done,
293 # and a menu to select among them is wanted. With this on, any map
294
295=== modified file 'models/03_gis.py'
296--- models/03_gis.py 2011-09-05 22:10:34 +0000
297+++ models/03_gis.py 2011-09-06 11:51:25 +0000
298@@ -1369,6 +1369,8 @@
299 # =============================================================================
300 def gis_map_tables():
301 """ Load the GIS Map Tables when needed """
302+ if "gis_layer_bing" in db.tables:
303+ return
304
305 # -------------------------------------------------------------------------
306 # GPS Waypoints
307
308=== added file 'models/climate.py'
309--- models/climate.py 1970-01-01 00:00:00 +0000
310+++ models/climate.py 2011-09-06 11:51:25 +0000
311@@ -0,0 +1,5 @@
312+# -*- coding: utf-8 -*-
313+
314+module = "climate"
315+if deployment_settings.has_module(module):
316+ local_import("ClimateDataPortal").define_models(env = Storage(globals()))
317
318=== added directory 'modules/ClimateDataPortal'
319=== added file 'modules/ClimateDataPortal/MapPlugin.py'
320--- modules/ClimateDataPortal/MapPlugin.py 1970-01-01 00:00:00 +0000
321+++ modules/ClimateDataPortal/MapPlugin.py 2011-09-06 11:51:25 +0000
322@@ -0,0 +1,430 @@
323+
324+# notes:
325+
326+# dependencies:
327+# R
328+
329+# create folder for cache:
330+# mkdir -p /tmp/climate_data_portal/images/recent/
331+# mkdir -p /tmp/climate_data_portal/images/older/
332+
333+MAX_CACHE_FOLDER_SIZE = 2**24 # 16 MiB
334+
335+class TwoStageCache(object):
336+ def __init__(self, folder, max_size):
337+ self.folder = folder
338+ self.max_size
339+
340+ def purge(self):
341+ pass
342+
343+ def retrieve(self, file_name, generate_if_not_found):
344+ pass
345+
346+import os, errno
347+
348+def mkdir_p(path):
349+ try:
350+ os.makedirs(path)
351+ except OSError as exc: # Python >2.5
352+ if exc.errno == errno.EEXIST:
353+ pass
354+ else: raise
355+
356+def define(env, place, tables, date_to_month_number, sample_codes, exports):
357+ # This starts an R interpreter.
358+ # As we are sharing it (restarting it every time is inefficient),
359+ # we have to be somewhat careful to make sure objects are garbage collected
360+ # better to just not stick anything in R's globals
361+ try:
362+ import rpy2.robjects as robjects
363+ except ImportError:
364+ import logging
365+ logging.getLogger().error(
366+"""R is required by the climate data portal to generate charts
367+
368+To install R: refer to:
369+http://cran.r-project.org/doc/manuals/R-admin.html
370+
371+
372+rpy2 is required to interact with python.
373+
374+To install rpy2, refer to:
375+http://rpy.sourceforge.net/rpy2/doc-dev/html/overview.html
376+""")
377+ raise
378+
379+ R = robjects.r
380+
381+ from rpy2.robjects.packages import importr
382+
383+ base = importr("base")
384+
385+ from math import fsum
386+ def average(values):
387+ "Safe float average"
388+ l = len(values)
389+ if l is 0:
390+ return None
391+ else:
392+ return fsum(values)/l
393+
394+ class Maximum(object):
395+ def __init__(self, column, add_query_term):
396+ self.value_max = value_max = column.max()
397+ add_query_term(value_max)
398+
399+ def __call__(self, row):
400+ return row._extra[self.value_max]
401+
402+ class Minimum(object):
403+ def __init__(self, column, add_query_term):
404+ self.value_min = value_min = column.min()
405+ add_query_term(value_min)
406+
407+ def __call__(self, row):
408+ return row._extra[self.value_min]
409+
410+ class Average(object):
411+ def __init__(self, column, add_query_term):
412+ self.value_sum = value_sum = column.sum()
413+ self.value_count = value_count = column.count()
414+ add_query_term((
415+ value_sum,
416+ value_count
417+ ))
418+
419+ def __call__(self, row):
420+ return row._extra[self.value_sum] / row._extra[self.value_count]
421+
422+ aggregators = {
423+ "Maximum": Maximum,
424+ "Minimum": Minimum,
425+ "Average": Average
426+ }
427+
428+ def get_cached_or_generated_file(cache_file_name, generate):
429+ from os.path import join, exists
430+ from os import stat, makedirs
431+ # this needs to become a setting
432+ climate_data_image_cache_path = join(
433+ "/tmp","climate_data_portal","images"
434+ )
435+ recent_cache = join(climate_data_image_cache_path, "recent")
436+ mkdir_p(recent_cache)
437+ older_cache = join(climate_data_image_cache_path, "older")
438+ mkdir_p(older_cache)
439+ recent_cache_path = join(recent_cache, cache_file_name)
440+ if not exists(recent_cache_path):
441+ older_cache_path = join(older_cache, cache_file_name)
442+ if exists(older_cache_path):
443+ # move the older cache to the recent folder
444+ rename(older_cache_path, recent_cache_path)
445+ else:
446+ generate(recent_cache_path)
447+ file_path = recent_cache_path
448+
449+ # update the folder size file (race condition?)
450+ folder_size_file_path = join(climate_data_image_cache_path, "size")
451+ folder_size_file = open(folder_size_file_path, "w+")
452+ folder_size_file_contents = folder_size_file.read()
453+ try:
454+ folder_size = int(folder_size_file_contents)
455+ except ValueError:
456+ folder_size = 0
457+ folder_size_file.seek(0)
458+ folder_size_file.truncate()
459+ folder_size += stat(file_path).st_size
460+ if folder_size > MAX_CACHE_FOLDER_SIZE:
461+ rmdir(older_cache)
462+
463+ folder_size_file.write(str(folder_size))
464+ folder_size_file.close()
465+ else:
466+ # use the existing cached image
467+ file_path = recent_cache_path
468+ return file_path
469+
470+ class MapPlugin(object):
471+ def __init__(
472+ self,
473+ data_type_option_names,
474+ parameter_names,
475+ year_min,
476+ year_max
477+ ):
478+ self.data_type_option_names = data_type_option_names
479+ self.parameter_names = parameter_names
480+ self.year_min = year_min
481+ self.year_max = year_max
482+
483+ def extend_gis_map(self, add_javascript, add_configuration):
484+ add_javascript("scripts/S3/s3.gis.climate.js")
485+ SCRIPT = env.SCRIPT
486+ T = env.T
487+ import json
488+
489+ add_configuration(
490+ SCRIPT(
491+ "\n".join((
492+ "registerPlugin(",
493+ " new ClimateDataMapPlugin("+
494+ json.dumps(
495+ dict(
496+ data_type_option_names = self.data_type_option_names,
497+ parameter_names = self.parameter_names,
498+ year_min = self.year_min,
499+ year_max = self.year_max,
500+ overlay_data_URL = "/%s/climate/climate_overlay_data" % (
501+ env.request.application
502+ ),
503+ chart_URL = "/%s/climate/climate_chart" % (
504+ env.request.application
505+ ),
506+ data_type_label = str(T("Data Type")),
507+ projected_option_type_label = str(
508+ T("Projection Type")
509+ )
510+ ),
511+ indent = 4
512+ )+
513+ ")",
514+ ")",
515+ ))
516+ )
517+ )
518+
519+
520+ def get_overlay_data(
521+ self,
522+ env,
523+ data_type,
524+ parameter,
525+ from_date,
526+ to_date,
527+ statistic
528+ ):
529+ from_month = date_to_month_number(from_date)
530+ to_month = date_to_month_number(to_date)
531+ def generate_map_overlay_data(file_path):
532+ # generate the new file in the recent folder
533+
534+ db = env.db
535+ sample_table_name, sample_table = tables[parameter]
536+ place = db.place
537+ #sample_table = db[sample_table_name]
538+
539+ query = [
540+ place.id,
541+ place.longitude,
542+ place.latitude,
543+ ]
544+ aggregator = aggregators[statistic](
545+ sample_table.value,
546+ query.append
547+ )
548+
549+ sample_rows = db(
550+ (sample_table.time_period >= from_month) &
551+ (sample_table.time_period <= to_month) &
552+ (sample_table.sample_type == sample_codes[data_type]) &
553+ (place.id == sample_table.place_id)
554+ ).select(
555+ *query,
556+ groupby=sample_table.place_id
557+ )
558+
559+ # map positions to data
560+ # find max and min value
561+ positions = {}
562+ aggregated_values = []
563+ for row in sample_rows:
564+ place = row.place
565+ aggregated_value = aggregator(row)
566+ aggregated_values.append(aggregated_value)
567+ positions[place.id] = (
568+ place.latitude,
569+ place.longitude,
570+ aggregated_value
571+ )
572+ max_aggregated_value = max(aggregated_values)
573+ min_aggregated_value = min(aggregated_values)
574+ aggregated_range = max_aggregated_value - min_aggregated_value
575+
576+ data_lines = []
577+ write = data_lines.append
578+ from colorsys import hsv_to_rgb
579+ for id, (lat, lon, aggregated_value) in positions.iteritems():
580+ north = lat + 0.05
581+ south = lat - 0.05
582+ east = lon + 0.05
583+ west = lon - 0.05
584+ # only hue changes
585+ # hue range is from 2/3 (blue, low) to 0 (red, high)
586+ normalised_value = 1.0-((aggregated_value - min_aggregated_value) / aggregated_range)
587+ r,g,b = hsv_to_rgb(normalised_value *(2.0/3.0), 1.0, 1.0)
588+ hex_colour = "%02x%02x%02x" % (r*255, g*255, b*255)
589+ write(
590+ "Vector("
591+ "Polygon(["
592+ "LinearRing(["
593+ "Point(%(north)f,%(west)f),"
594+ "Point(%(north)f,%(east)f),"
595+ "Point(%(south)f,%(east)f),"
596+ "Point(%(south)f,%(west)f)"
597+ "])"
598+ "]),"
599+ "{"
600+ "value:%(aggregated_value)f,"
601+ "id:%(id)i"
602+ "},"
603+ "{"
604+ "fillColor:'#%(hex_colour)s'"
605+ "}"
606+ ")," % locals()
607+ )
608+ overlay_data_file = open(file_path, "w")
609+ write = overlay_data_file.write
610+ write("{")
611+ if max_aggregated_value < 10:
612+ float_format = "%0.2f"
613+ if max_aggregated_value < 100:
614+ float_format = "%0.1f"
615+ elif max_aggregated_value < 10000:
616+ float_format = "%0.0f"
617+ else:
618+ float_format = "%0.2e"
619+ write("max:%s," % float_format % max_aggregated_value)
620+ write("min:%s," % float_format % min_aggregated_value)
621+ write("features:[")
622+ write("".join(data_lines))
623+ overlay_data_file.seek(-1, 1) # delete last ",'
624+ write("]}")
625+ overlay_data_file.close()
626+
627+ return get_cached_or_generated_file(
628+ "_".join((
629+ statistic,
630+ data_type,
631+ parameter,
632+ str(from_month),
633+ str(to_month),
634+ ".js"
635+ )),
636+ generate_map_overlay_data
637+ )
638+
639+ def render_plots(
640+ self,
641+ env,
642+ specs
643+ ):
644+ def generate_chart(file_path):
645+ def render_plot(
646+ data_type,
647+ parameter,
648+ from_date,
649+ to_date,
650+ place_ids
651+ ):
652+ from_month = date_to_month_number(from_date)
653+ to_month = date_to_month_number(to_date)
654+
655+ db = env.db
656+ sample_table_name, sample_table = tables[parameter]
657+ place = db.place
658+ #sample_table = db[sample_table_name]
659+ sample_rows = db(
660+ (sample_table.time_period >= from_month) &
661+ (sample_table.time_period <= to_month) &
662+ (sample_table.sample_type == sample_codes[data_type]) &
663+ (sample_table.place_id.belongs(place_ids))
664+ ).select(
665+ sample_table.value,
666+ sample_table.time_period,
667+ )
668+
669+ # coalesce values by time_period:
670+ aggregated_values = {}
671+ for sample_row in sample_rows:
672+ time_period = sample_row.time_period
673+ value = sample_row.value
674+ try:
675+ aggregated_values[time_period]
676+ except KeyError:
677+ aggregated_values[time_period] = value
678+ else:
679+ aggregated_values[time_period] += value
680+
681+ values = []
682+ time_periods = aggregated_values.keys()
683+ time_periods.sort()
684+ for time_period in time_periods:
685+ values.append(aggregated_values[time_period])
686+ return from_date, to_date, data_type, parameter, values
687+
688+ time_serieses = []
689+ c = R("c")
690+ for spec in specs:
691+ from_date, to_date, data_type, parameter, values = render_plot(**spec)
692+ time_serieses.append(
693+ R("ts")(
694+ robjects.FloatVector(values),
695+ start = c(from_date.year, from_date.month),
696+ end = c(to_date.year, to_date.month),
697+ frequency = 12
698+ )
699+ )
700+
701+ R("png(filename = '%s', width=640, height=480)" % file_path)
702+ plot_chart = R(
703+ "function (xlab, ylab, n, ...) {"
704+ "ts.plot(...,"
705+ "gpars=list(xlab=xlab, ylab=ylab, col=c(1:n))"
706+ ")"
707+ "}"
708+ )
709+
710+ plot_chart(
711+ "Date",
712+ "Combined %s %s" % (data_type, parameter),
713+ len(time_serieses),
714+ *time_serieses
715+ )
716+ R("dev.off()")
717+
718+ import md5
719+ import gluon.contrib.simplejson as JSON
720+
721+ import datetime
722+ def serialiseDate(obj):
723+ if isinstance(
724+ obj,
725+ (
726+ datetime.date,
727+ datetime.datetime,
728+ datetime.time
729+ )
730+ ):
731+ return obj.isoformat()[:19].replace("T"," ")
732+ raise TypeError("%r is not JSON serializable" % (obj,))
733+
734+ return get_cached_or_generated_file(
735+ "_".join((
736+ md5.md5(
737+ JSON.dumps(
738+ specs,
739+ sort_keys=True,
740+ default=serialiseDate
741+ )
742+ ).hexdigest(),
743+ ".png"
744+ )),
745+ generate_chart
746+ )
747+
748+ exports.update(
749+ MapPlugin = MapPlugin
750+ )
751+
752+ del globals()["define"]
753
754=== added file 'modules/ClimateDataPortal/__init__.py'
755--- modules/ClimateDataPortal/__init__.py 1970-01-01 00:00:00 +0000
756+++ modules/ClimateDataPortal/__init__.py 2011-09-06 11:51:25 +0000
757@@ -0,0 +1,201 @@
758+
759+"""
760+ Climate Data Module
761+
762+ @author: Mike Amy
763+"""
764+
765+# datasets are stored in actual tables
766+# - e.g. rainfall_mm
767+
768+# data collection points in dataset
769+# values at a point within a time range
770+
771+# e.g. observed temperature in Kathmandu between Feb 2006 - April 2007
772+
773+
774+sample_types = dict(
775+ O = "Observed",
776+ G = "Gridded",
777+
778+ r = "Projected (RC)",
779+ g = "Projected (GC)",
780+ s = "Scenario",
781+)
782+
783+sample_codes = {}
784+
785+import re
786+for code, name in sample_types.iteritems():
787+ globals()[re.sub("\W", "", name)] = code
788+ sample_codes[name] = code
789+
790+
791+# Until I figure out how to sanely import things from web2py,
792+# apply a prophylactic import method...
793+def define_models(env):
794+ """
795+ Define Climate Data models.
796+ """
797+ db = env.db
798+ Field = env.Field
799+
800+ def create_index(table_name, field_name):
801+ db.executesql(
802+ """
803+ CREATE INDEX IF NOT EXISTS
804+ "index_%(table_name)s__%(field_name)s"
805+ ON "%(table_name)s" ("%(field_name)s");
806+ """ % locals()
807+ )
808+
809+ place = db.define_table(
810+ "place",
811+ Field(
812+ "longitude",
813+ "double",
814+ notnull=True,
815+ required=True,
816+ ),
817+ Field(
818+ "latitude",
819+ "double",
820+ notnull=True,
821+ required=True,
822+ )
823+ )
824+
825+ # not all places are stations with elevations
826+ # as in the case of "gridded" data
827+ # a station can only be in one place
828+ observation_station = db.define_table(
829+ "observation_station",
830+ Field(
831+ "id",
832+ "id", # must be a place,
833+ notnull=True,
834+ required=True,
835+ ),
836+ Field(
837+ "name",
838+ "string",
839+ notnull=True,
840+ unique=False,
841+ required=True,
842+ ),
843+ Field(
844+ "elevation_metres",
845+ "integer"
846+ )
847+ )
848+
849+ def sample_table(name, value_type):
850+ table = db.define_table(
851+ name,
852+ Field(
853+ "sample_type",
854+ "string",
855+ length = 1,
856+ notnull=True,
857+ # necessary as web2py requires a default value even for
858+ # not null fields
859+ default="-1",
860+ required=True
861+ ),
862+ Field(
863+ "time_period",
864+ "integer",
865+ notnull=True,
866+ default=-1000,
867+ required=True
868+ ),
869+ Field(
870+ # this should become a GIS field
871+ "place_id",
872+ place,
873+ notnull=True,
874+ required=True
875+ ),
876+ Field(
877+ "value",
878+ value_type,
879+ notnull = True,
880+ required=True,
881+ ),
882+ )
883+
884+ create_index(name, "id")
885+ create_index(name, "sample_type")
886+ create_index(name, "time_period")
887+ create_index(name, "place_id")
888+
889+ return table
890+
891+ rainfall_mm = sample_table("climate_rainfall_mm", "double")
892+ min_temperature_celsius = sample_table("climate_min_temperature_celsius", "double")
893+ max_temperature_celsius = sample_table("climate_max_temperature_celsius", "double")
894+
895+ tables = {
896+ "Rainfall mm": ("climate_rainfall_mm", rainfall_mm),
897+ "Max Temperature C": ("climate_max_temperature_celsius", max_temperature_celsius),
898+ "Min Temperature C": ("climate_min_temperature_celsius", min_temperature_celsius),
899+ }
900+
901+ def year_month_to_month_number(year, month):
902+ """Time periods are integers representing months in years,
903+ from 1960 onwards.
904+
905+ e.g. 0 = Jan 1960, 1 = Feb 1960, 12 = Jan 1961
906+
907+ This function converts a year and month to a month number.
908+ """
909+ return ((year-1960) * 12) + (month-1)
910+
911+ def date_to_month_number(date):
912+ """This function converts a date to a month number.
913+
914+ See also year_month_to_month_number(year, month)
915+ """
916+ return year_month_to_month_number(date.year, date.month)
917+
918+# def month_number_to_date(month_number):
919+# ret
920+
921+ from .MapPlugin import define
922+ define(
923+ env,
924+ place,
925+ tables,
926+ date_to_month_number,
927+ sample_codes,
928+ globals()
929+ )
930+
931+ # exports:
932+ globals().update(
933+ sample_types = sample_types,
934+
935+ place = place,
936+ observation_station = observation_station,
937+
938+ tables = tables,
939+
940+ rainfall_mm = rainfall_mm,
941+ max_temperature_celsius = max_temperature_celsius,
942+ min_temperature_celsius = min_temperature_celsius,
943+
944+ date_to_month_number = date_to_month_number,
945+ year_month_to_month_number = year_month_to_month_number,
946+ )
947+
948+ def redefine_models(env):
949+ # avoid risking insidious aliasing bugs
950+ # by not defining things more than once
951+ env.db.update(
952+ climate_rainfall_mm = rainfall_mm,
953+ climate_max_temperature_celsius = max_temperature_celsius,
954+ climate_min_temperature_celsius = min_temperature_celsius,
955+ place = place,
956+ observation_station = observation_station,
957+ )
958+ globals()["define_models"] = redefine_models
959
960=== added file 'modules/ClimateDataPortal/import_NetCDF_readings.py'
961--- modules/ClimateDataPortal/import_NetCDF_readings.py 1970-01-01 00:00:00 +0000
962+++ modules/ClimateDataPortal/import_NetCDF_readings.py 2011-09-06 11:51:25 +0000
963@@ -0,0 +1,131 @@
964+
965+ClimateDataPortal = local_import("ClimateDataPortal")
966+
967+
968+def get_or_create(dict, key, creator):
969+ try:
970+ value = dict[key]
971+ except KeyError:
972+ value = dict[key] = creator()
973+ return value
974+
975+def get_or_create_record(table, query):
976+ query_terms = []
977+ for key, value in query.iteritems():
978+ query_terms.append(getattr(table, key) == value)
979+ reduced_query = reduce(
980+ (lambda left, right: left & right),
981+ query_terms
982+ )
983+ records = db(reduced_query).select()
984+ count = len(records)
985+ assert count <= 1, "Multiple records for %s" % query
986+ if count == 0:
987+ record = table.insert(**query)
988+ db.commit()
989+ else:
990+ record = records.first()
991+ return record.id
992+
993+def nearly(expected_float, actual_float):
994+ return (expected_float * 0.999) < actual_float < (expected_float * 1.001)
995+
996+def add_reading_if_none(
997+ database_table,
998+ sample_type,
999+ time_period,
1000+ place_id,
1001+ value
1002+):
1003+ records = db(
1004+ (database_table.sample_type == sample_type) &
1005+ (database_table.time_period == time_period) &
1006+ (database_table.place_id == place_id)
1007+ ).select(database_table.value, database_table.id)
1008+ count = len(records)
1009+ assert count <= 1
1010+ if count == 0:
1011+ database_table.insert(
1012+ sample_type = sample_type,
1013+ time_period = time_period,
1014+ place_id = place_id,
1015+ value = value
1016+ )
1017+ else:
1018+ existing = records.first()
1019+ assert nearly(existing.value, value), (existing.value, value, place_id)
1020+
1021+
1022+
1023+import datetime
1024+
1025+def import_climate_readings(
1026+ netcdf_file,
1027+ database_table,
1028+ add_reading,
1029+ start_time = datetime.date(1971,1,1),
1030+ is_undefined = lambda x: -99.900003 < x < -99.9
1031+):
1032+ """
1033+ Assumptions:
1034+ * there are no places
1035+ * the data is in order of places
1036+ """
1037+ variables = netcdf_file.variables
1038+
1039+ # create grid of places
1040+ place_ids = {}
1041+
1042+ def to_list(variable):
1043+ result = []
1044+ for i in range(len(variable)):
1045+ result.append(variable[i])
1046+ return result
1047+
1048+ def iter_pairs(list):
1049+ for index in range(len(list)):
1050+ yield index, list[index]
1051+
1052+ times = to_list(variables["time"])
1053+ lat = to_list(variables["lat"])
1054+ lon = to_list(variables["lon"])
1055+ for latitude in lat:
1056+ for longitude in lon:
1057+ record = get_or_create_record(
1058+ ClimateDataPortal.place,
1059+ dict(
1060+ longitude = longitude,
1061+ latitude = latitude
1062+ )
1063+ )
1064+ place_ids[(latitude, longitude)] = record
1065+ #print longitude, latitude, record
1066+
1067+ tt = variables["tt"]
1068+ print "up to:", len(times)
1069+ for time_index, time in iter_pairs(times):
1070+ print time_index
1071+ time_period = start_time+datetime.timedelta(hours=time)
1072+ for latitude_index, latitude in iter_pairs(lat):
1073+ for longitude_index, longitude in iter_pairs(lon):
1074+ value = tt[time_index][latitude_index][longitude_index]
1075+ if not is_undefined(value):
1076+ add_reading(
1077+ database_table = database_table,
1078+ sample_type = ClimateDataPortal.Gridded,
1079+ time_period = ClimateDataPortal.date_to_month_number(time_period),
1080+ place_id = place_ids[(latitude, longitude)],
1081+ value = value
1082+ )
1083+ db.commit()
1084+
1085+import sys
1086+
1087+from Scientific.IO import NetCDF
1088+
1089+file_name = sys.argv[1]
1090+import_climate_readings(
1091+ NetCDF.NetCDFFile(file_name),
1092+ ClimateDataPortal.min_temperature_celsius,
1093+ add_reading_if_none
1094+)
1095
1096=== added file 'modules/ClimateDataPortal/import_stations.py'
1097--- modules/ClimateDataPortal/import_stations.py 1970-01-01 00:00:00 +0000
1098+++ modules/ClimateDataPortal/import_stations.py 2011-09-06 11:51:25 +0000
1099@@ -0,0 +1,53 @@
1100+
1101+ClimateDataPortal = local_import("ClimateDataPortal")
1102+
1103+from decimal import Decimal
1104+
1105+def import_stations(file_name):
1106+ """
1107+ Expects a file containing lines of the form e.g.:
1108+226 JALESORE 1122 172 26.65 85.78
1109+275 PHIDIM (PANCHTH 1419 1205 27.15 87.75
1110+unused Station name <-id <-elev <-lat <-lon
1111+0123456789012345678901234567890123456789012345678901234567890123456789
1112+0 1 2 3 4 5 6
1113+ """
1114+ place = ClimateDataPortal.place
1115+ observation_station = ClimateDataPortal.observation_station
1116+ observation_station.truncate()
1117+ place.truncate()
1118+ db.commit()
1119+
1120+ for line in open(file_name, "r").readlines():
1121+ try:
1122+ place_id_text = line[27:33]
1123+ except IndexError:
1124+ continue
1125+ else:
1126+ try:
1127+ place_id = int(place_id_text)
1128+ except ValueError:
1129+ continue
1130+ else:
1131+ station_name = line[8:25].strip() # don't restrict if they add more
1132+ elevation_metres = int(line[37:43])
1133+
1134+ latitude = Decimal(line[47:53])
1135+ longitude = Decimal(line[57:623])
1136+
1137+ assert place.insert(
1138+ id = place_id,
1139+ longitude = longitude,
1140+ latitude = latitude
1141+ ) == place_id
1142+
1143+ station_id = observation_station.insert(
1144+ id = place_id,
1145+ name = station_name,
1146+ elevation_metres = elevation_metres
1147+ )
1148+ print place_id, station_name, latitude, longitude, elevation_metres
1149+ db.commit()
1150+
1151+import sys
1152+import_stations(sys.argv[1])
1153
1154=== added file 'modules/ClimateDataPortal/import_tabbed_readings.py'
1155--- modules/ClimateDataPortal/import_tabbed_readings.py 1970-01-01 00:00:00 +0000
1156+++ modules/ClimateDataPortal/import_tabbed_readings.py 2011-09-06 11:51:25 +0000
1157@@ -0,0 +1,154 @@
1158+
1159+ClimateDataPortal = local_import("ClimateDataPortal")
1160+
1161+from decimal import Decimal
1162+
1163+
1164+def get_or_create(dict, key, creator):
1165+ try:
1166+ value = dict[key]
1167+ except KeyError:
1168+ value = dict[key] = creator()
1169+ return value
1170+
1171+import os
1172+
1173+class Readings(object):
1174+ def __init__(
1175+ self,
1176+ database_table,
1177+ null_value,
1178+ maximum = None,
1179+ minimum = None
1180+ ):
1181+ self.database_table = database_table
1182+ db(database_table.sample_type == ClimateDataPortal.Observed).delete()
1183+ self.null_value = null_value
1184+ self.maximum = maximum
1185+ self.minimum = minimum
1186+
1187+ self.aggregated_values = {}
1188+
1189+ def add_reading(self, time_period, reading, out_of_range):
1190+ if reading != self.null_value:
1191+ if (
1192+ (self.minimum is not None and reading < self.minimum) or
1193+ (self.maximum is not None and reading > self.maximum)
1194+ ):
1195+ out_of_range(reading)
1196+ else:
1197+ readings = get_or_create(
1198+ self.aggregated_values,
1199+ time_period,
1200+ list
1201+ )
1202+ readings.append(reading)
1203+
1204+ def done(self, place_id):
1205+ for month_number, values in self.aggregated_values.iteritems():
1206+ self.database_table.insert(
1207+ sample_type = ClimateDataPortal.Observed,
1208+ time_period = month_number,
1209+ place_id = place_id,
1210+ value = sum(values) / len(values)
1211+ )
1212+
1213+import datetime
1214+
1215+
1216+def import_tabbed_readings(
1217+ folder_name,
1218+ variables = [],
1219+ place_ids = None
1220+):
1221+ """
1222+ Expects a folder containing files with name rtXXXX.txt
1223+
1224+ each file contains lines of the form e.g.:
1225+1978\t1\t1\t0\t-99.9\t-99.9
1226+
1227+representing year, month, day, rainfall(mm), minimum and maximum temperature
1228+ """
1229+ observation_station = ClimateDataPortal.observation_station
1230+
1231+ null_value = Decimal("-99.9") # seems to be
1232+
1233+ for row in db(observation_station).select(observation_station.id):
1234+ place_id = row.id
1235+ if place_ids is not None:
1236+ # avoid certain place ids (to allow importing particular places)
1237+ start_place, end_place = map(int, place_ids.split(":"))
1238+ assert start_place <= end_place
1239+ if place_id < start_place or place_id > end_place:
1240+ continue
1241+ print place_id
1242+
1243+ data_file_path = os.path.join(folder_name, "rt%04i.txt" % place_id)
1244+ if not os.path.exists(data_file_path):
1245+ print "%s not found" % data_file_path
1246+ else:
1247+ try:
1248+ for line in open(data_file_path, "r").readlines():
1249+ if line:
1250+ data = line.split()
1251+ if data:
1252+ try:
1253+ year = int(data[0])
1254+ month = int(data[1])
1255+ day = int(data[2])
1256+
1257+ time_period = ClimateDataPortal.year_month_to_month_number(year, month)
1258+
1259+ for variable, reading_data in zip(
1260+ variables,
1261+ data[3:6]
1262+ ):
1263+ def out_of_range(reading):
1264+ print "%s/%s/%s: %s out of range" % (
1265+ day, month, year, reading
1266+ )
1267+ reading = Decimal(reading_data)
1268+ variable.add_reading(
1269+ time_period,
1270+ reading,
1271+ out_of_range = out_of_range
1272+ )
1273+
1274+ except Exception, exception:
1275+ print exception
1276+ for variable in variables:
1277+ variable.done(place_id)
1278+ except:
1279+ print line
1280+ raise
1281+
1282+ db.commit()
1283+ else:
1284+ print "No stations!"
1285+
1286+import sys
1287+
1288+null_value = Decimal("-99.9")
1289+import_tabbed_readings(
1290+ folder_name = sys.argv[1],
1291+ variables = [
1292+ Readings(
1293+ ClimateDataPortal.rainfall_mm,
1294+ null_value = null_value,
1295+ minimum = 0,
1296+ ),
1297+ Readings(
1298+ database_table = ClimateDataPortal.min_temperature_celsius,
1299+ null_value = null_value,
1300+ minimum = -120,
1301+ maximum = 55
1302+ ),
1303+ Readings(
1304+ database_table = ClimateDataPortal.max_temperature_celsius,
1305+ null_value = null_value,
1306+ minimum = -120,
1307+ maximum = 55
1308+ ),
1309+ ],
1310+ place_ids = sys.argv[2:] or None
1311+)
1312
1313=== modified file 'modules/s3/s3gis.py'
1314--- modules/s3/s3gis.py 2011-09-05 22:18:45 +0000
1315+++ modules/s3/s3gis.py 2011-09-06 11:51:25 +0000
1316@@ -74,16 +74,11 @@
1317 Provide an easy, safe, systematic way of handling Debug output
1318 (print to stdout doesn't work with WSGI deployments)
1319 """
1320- try:
1321- output = "S3 Debug: %s" % str(message)
1322- if value:
1323- output += ": %s" % str(value)
1324- except:
1325- output = "S3 Debug: %s" % unicode(message)
1326- if value:
1327- output += ": %s" % unicode(value)
1328-
1329- print >> sys.stderr, output
1330+ # should be using python's built-in logging module
1331+ output = u"S3 Debug: %s" % unicode(message)
1332+ if value:
1333+ output += u": %s" % unicode(value)
1334+ sys.stderr.write(output+"\n")
1335
1336 SHAPELY = False
1337 try:
1338@@ -240,7 +235,6 @@
1339 """
1340
1341 def __init__(self):
1342-
1343 self.deployment_settings = current.deployment_settings
1344 self.public_url = current.deployment_settings.get_base_public_url()
1345 if not current.db is not None:
1346@@ -308,6 +302,18 @@
1347 else:
1348 return wkt
1349
1350+ def debug(self, message, value=None):
1351+ # should be using python's built-in logging module
1352+ session = current.session
1353+ if session.s3.debug:
1354+ raise Exception(message)
1355+ else:
1356+ output = u"S3 Debug: %s" % unicode(message)
1357+ if value:
1358+ output += u": %s" % unicode(value)
1359+ sys.stderr.write(output+"\n")
1360+ session.error = current.T(message)
1361+
1362 # -------------------------------------------------------------------------
1363 def download_kml(self, record_id, filename):
1364 """
1365@@ -326,11 +332,11 @@
1366 db = current.db
1367
1368 layer = KMLLayer(self)
1369+
1370 query = (layer.table.id == record_id)
1371 record = db(query).select(limitby=(0, 1)).first()
1372 url = record.url
1373
1374- layer.add_record(record)
1375 cachepath = layer.cachepath
1376 filepath = os.path.join(cachepath, filename)
1377
1378@@ -427,7 +433,7 @@
1379 myfile = zipfile.ZipFile(fp)
1380 try:
1381 file = myfile.read("doc.kml")
1382- except:
1383+ except: # Naked except!!
1384 file = myfile.read(myfile.infolist()[0].filename)
1385 myfile.close()
1386
1387@@ -531,7 +537,7 @@
1388 try:
1389 lon = features[0].lon
1390 simple = True
1391- except:
1392+ except AttributeError:
1393 simple = False
1394
1395 for feature in features:
1396@@ -544,7 +550,7 @@
1397 # A Join
1398 lon = feature.gis_location.lon
1399 lat = feature.gis_location.lat
1400- except:
1401+ except AttributeError:
1402 # Skip any rows without the necessary lat/lon fields
1403 continue
1404
1405@@ -985,7 +991,8 @@
1406 _marker = db.gis_marker
1407 _projection = db.gis_projection
1408 have_tables = _config and _projection
1409- except:
1410+ except Exception, exception:
1411+ self.debug(exception)
1412 have_tables = False
1413
1414 row = None
1415@@ -1002,16 +1009,13 @@
1416 if not row:
1417 if auth.is_logged_in():
1418 # Read personalised config, if available.
1419- try:
1420- query = (db.pr_person.uuid == auth.user.person_uuid) & \
1421- (_config.pe_id == db.pr_person.pe_id) & \
1422- (_marker.id == _config.marker_id) & \
1423- (_projection.id == _config.projection_id)
1424- row = db(query).select(limitby=(0, 1)).first()
1425- if row:
1426- config_id = row["gis_config"].id
1427- except:
1428- pass
1429+ query = (db.pr_person.uuid == auth.user.person_uuid) & \
1430+ (_config.pe_id == db.pr_person.pe_id) & \
1431+ (_marker.id == _config.marker_id) & \
1432+ (_projection.id == _config.projection_id)
1433+ row = db(query).select(limitby=(0, 1)).first()
1434+ if row:
1435+ config_id = row["gis_config"].id
1436 if not row:
1437 # No personal config or not logged in. Use site default.
1438 config_id = 1
1439@@ -1148,7 +1152,7 @@
1440 if level:
1441 try:
1442 return location_hierarchy[level]
1443- except:
1444+ except KeyError:
1445 return level
1446 else:
1447 return location_hierarchy
1448@@ -1197,7 +1201,8 @@
1449 if level:
1450 try:
1451 return all_levels[level]
1452- except:
1453+ except Exception, exception:
1454+
1455 return level
1456 else:
1457 return all_levels
1458@@ -1472,7 +1477,7 @@
1459 represent = db(table.id == value).select(table.name,
1460 cache=cache,
1461 limitby=(0, 1)).first().name
1462- except:
1463+ except: # @ToDo: provide specific exception
1464 # Keep the default from earlier
1465 pass
1466
1467@@ -1512,24 +1517,21 @@
1468 lat_max = location.lat_max
1469
1470 else:
1471- s3_debug("Location searched within isn't a Polygon!")
1472- session.error = T("Location searched within isn't a Polygon!")
1473+ self.debug("Location searched within isn't a Polygon!")
1474 return None
1475- except:
1476+ except: # @ToDo: need specific exception
1477 wkt = location
1478 if (wkt.startswith("POLYGON") or wkt.startswith("MULTIPOLYGON")):
1479 # ok
1480 lon_min = None
1481 else:
1482- s3_debug("This isn't a Polygon!")
1483- session.error = T("This isn't a Polygon!")
1484+ self.debug("This isn't a Polygon!")
1485 return None
1486
1487 try:
1488 polygon = wkt_loads(wkt)
1489- except:
1490- s3_debug("Invalid Polygon!")
1491- session.error = T("Invalid Polygon!")
1492+ except: # @ToDo: need specific exception
1493+ self.debug("Invalid Polygon!")
1494 return None
1495
1496 table = db[tablename]
1497@@ -1537,8 +1539,7 @@
1498
1499 if "location_id" not in table.fields():
1500 # @ToDo: Add any special cases to be able to find the linked location
1501- s3_debug("This table doesn't have a location_id!")
1502- session.error = T("This table doesn't have a location_id!")
1503+ self.debug("This table doesn't have a location_id!")
1504 return None
1505
1506 query = (table.location_id == locations.id)
1507@@ -1570,7 +1571,10 @@
1508 # Save Record
1509 output.records.append(row)
1510 except shapely.geos.ReadingError:
1511- s3_debug("Error reading wkt of location with id", row.id)
1512+ self.debug(
1513+ "Error reading wkt of location with id",
1514+ value=row.id
1515+ )
1516 else:
1517 # 1st check for Features included within the bbox (faster)
1518 def in_bbox(row):
1519@@ -1596,7 +1600,10 @@
1520 # Save Record
1521 output.records.append(row)
1522 except shapely.geos.ReadingError:
1523- s3_debug("Error reading wkt of location with id", row.id)
1524+ self.debug(
1525+ "Error reading wkt of location with id",
1526+ value = row.id,
1527+ )
1528
1529 return output
1530
1531@@ -2072,38 +2079,41 @@
1532 current_row += 1
1533 try:
1534 name0 = row.pop("ADM0_NAME")
1535- except:
1536+ except KeyError:
1537 name0 = ""
1538 try:
1539 name1 = row.pop("ADM1_NAME")
1540- except:
1541+ except KeyError:
1542 name1 = ""
1543 try:
1544 name2 = row.pop("ADM2_NAME")
1545- except:
1546+ except KeyError:
1547 name2 = ""
1548 try:
1549 name3 = row.pop("ADM3_NAME")
1550- except:
1551+ except KeyError:
1552 name3 = ""
1553 try:
1554 name4 = row.pop("ADM4_NAME")
1555- except:
1556+ except KeyError:
1557 name4 = ""
1558 try:
1559 name5 = row.pop("ADM5_NAME")
1560- except:
1561+ except KeyError:
1562 name5 = ""
1563
1564 if not name5 and not name4 and not name3 and \
1565 not name2 and not name1:
1566 # We need a name! (L0's are already in DB)
1567- s3_debug("No name provided", current_row)
1568+ s3_debug(
1569+ "No name provided",
1570+ current_row,
1571+ )
1572 continue
1573
1574 try:
1575 wkt = row.pop("WKT")
1576- except:
1577+ except KeyError:
1578 wkt = None
1579 try:
1580 lat = row.pop("LAT")
1581@@ -2112,21 +2122,17 @@
1582 lat = None
1583 lon = None
1584
1585- if domain:
1586- try:
1587- uuid = "%s/%s" % (domain,
1588- row.pop("UUID"))
1589- except:
1590- uuid = ""
1591+ try:
1592+ uuid = row.pop("UUID")
1593+ except KeyError:
1594+ uuid = ""
1595 else:
1596- try:
1597- uuid = row.pop("UUID")
1598- except:
1599- uuid = ""
1600+ if domain:
1601+ uuid = "%s/%s" % (domain, uuid)
1602
1603 try:
1604 code = row.pop("CODE")
1605- except:
1606+ except KeyError:
1607 code = ""
1608
1609 population = ""
1610@@ -2172,7 +2178,7 @@
1611 # Calculate Centroid & Bounds
1612 if wkt:
1613 try:
1614- # Valid WKT
1615+ # Valid WKT
1616 shape = wkt_loads(wkt)
1617 centroid_point = shape.centroid
1618 lon = centroid_point.x
1619@@ -2186,7 +2192,7 @@
1620 feature_type = 1 # Point
1621 else:
1622 feature_type = 3 # Polygon
1623- except:
1624+ except: # @ToDo: provide specific exception
1625 s3_debug("Invalid WKT", name)
1626 continue
1627 else:
1628@@ -2326,7 +2332,7 @@
1629 else:
1630 cached = False
1631 if not os.access(cachepath, os.W_OK):
1632- s3_debug("Folder not writable", cachepath)
1633+ self.debug("Folder not writable", cachepath)
1634 return
1635
1636 if not cached:
1637@@ -2335,11 +2341,11 @@
1638 f = fetch(url)
1639 except (urllib2.URLError,):
1640 e = sys.exc_info()[1]
1641- s3_debug("URL Error", e)
1642+ self.debug("URL Error", e)
1643 return
1644 except (urllib2.HTTPError,):
1645 e = sys.exc_info()[1]
1646- s3_debug("HTTP Error", e)
1647+ self.debug("HTTP Error", e)
1648 return
1649
1650 # Unzip File
1651@@ -2352,8 +2358,8 @@
1652 # For now, 2.5 users need to download/unzip manually to cache folder
1653 myfile.extract(filename, cachepath)
1654 myfile.close()
1655- except:
1656- s3_debug("Zipfile contents don't seem correct!")
1657+ except IOError:
1658+ self.debug("Zipfile contents don't seem correct!")
1659 myfile.close()
1660 return
1661
1662@@ -2469,7 +2475,7 @@
1663 # Should be just a single parent
1664 break
1665 except shapely.geos.ReadingError:
1666- s3_debug("Error reading wkt of location with id", row.id)
1667+ self.debug("Error reading wkt of location with id", row.id)
1668
1669 # Add entry to database
1670 table.insert(uuid=uuid,
1671@@ -2489,7 +2495,7 @@
1672 else:
1673 continue
1674
1675- s3_debug("All done!")
1676+ self.debug("All done!")
1677 return
1678
1679 # -------------------------------------------------------------------------
1680@@ -2732,7 +2738,7 @@
1681
1682 db = current.db
1683 in_bbox = self.query_features_by_bbox(*shape.bounds)
1684- has_wkt = (db.gis_location.wkt != None) & (db.gis_location.wkt != '')
1685+ has_wkt = (db.gis_location.wkt != None) & (db.gis_location.wkt != "")
1686
1687 for loc in db(in_bbox & has_wkt).select():
1688 try:
1689@@ -2740,7 +2746,7 @@
1690 if location_shape.intersects(shape):
1691 yield loc
1692 except shapely.geos.ReadingError:
1693- s3_debug("Error reading wkt of location with id", loc.id)
1694+ self.debug("Error reading wkt of location with id", loc.id)
1695
1696 # -------------------------------------------------------------------------
1697 def _get_features_by_latlon(self, lat, lon):
1698@@ -2796,7 +2802,7 @@
1699 try :
1700 shape = wkt_loads(location.wkt)
1701 except:
1702- s3_debug("Error reading WKT", location.wkt)
1703+ self.debug("Error reading WKT", location.wkt)
1704 continue
1705 bounds = shape.bounds
1706 table[location.id] = dict(
1707@@ -2947,7 +2953,12 @@
1708 map_width = width
1709 else:
1710 map_width = config.map_width
1711- if bbox and (-90 < bbox["max_lat"] < 90) and (-90 < bbox["min_lat"] < 90) and (-180 < bbox["max_lon"] < 180) and (-180 < bbox["min_lon"] < 180):
1712+ if (bbox
1713+ and (-90 < bbox["max_lat"] < 90)
1714+ and (-90 < bbox["min_lat"] < 90)
1715+ and (-180 < bbox["max_lon"] < 180)
1716+ and (-180 < bbox["min_lon"] < 180)
1717+ ):
1718 # We have sane Bounds provided, so we should use them
1719 pass
1720 else:
1721@@ -2971,15 +2982,21 @@
1722 projection = config.epsg
1723
1724
1725- if projection != 900913 and projection != 4326:
1726+ if projection not in (900913, 4326):
1727 # Test for Valid Projection file in Proj4JS library
1728- projpath = os.path.join(request.folder, "static", "scripts", "gis", "proj4js", "lib", "defs", "EPSG%s.js" % projection)
1729+ projpath = os.path.join(
1730+ request.folder, "static", "scripts", "gis", "proj4js", \
1731+ "lib", "defs", "EPSG%s.js" % projection
1732+ )
1733 try:
1734 f = open(projpath, "r")
1735 f.close()
1736 except:
1737- session.error = "%s /static/scripts/gis/proj4js/lib/defs" % T("Projection not supported - please add definition to")
1738- redirect(URL(c="gis", f="projection"))
1739+ session.error = "'%s' %s /static/scripts/gis/proj4js/lib/defs" % (
1740+ projection,
1741+ T("Projection not supported - please add definition to")
1742+ )
1743+ redirect(URL(r=request, c="gis", f="projection"))
1744
1745 units = config.units
1746 maxResolution = config.maxResolution
1747@@ -3052,11 +3069,12 @@
1748 #########
1749 # Scripts
1750 #########
1751+
1752 def add_javascript(script):
1753 if type(script) == SCRIPT:
1754 html.append(script)
1755 elif script.startswith("http"):
1756- html.append(
1757+ html.append(
1758 SCRIPT(_type="text/javascript",
1759 _src=script))
1760 else:
1761@@ -3066,7 +3084,7 @@
1762
1763 debug = session.s3.debug
1764 if debug:
1765- if projection != 900913 and projection != 4326:
1766+ if projection not in (900913, 4326):
1767 add_javascript("scripts/gis/proj4js/lib/proj4js-combined.js")
1768 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)
1769
1770@@ -3079,7 +3097,7 @@
1771 add_javascript("scripts/gis/usng2.js")
1772 add_javascript("scripts/gis/MP.js")
1773 else:
1774- if projection != 900913 and projection != 4326:
1775+ if projection not in (900913, 4326):
1776 add_javascript("scripts/gis/proj4js/lib/proj4js-compressed.js")
1777 add_javascript("scripts/gis/proj4js/lib/defs/EPSG%s.js" % projection)
1778 add_javascript("scripts/gis/OpenLayers.js")
1779@@ -3181,20 +3199,20 @@
1780 # If we do come back to it, then it should be moved to static
1781 if print_tool:
1782 url = print_tool["url"]
1783- url+'' # check url can be concatenated with strings
1784+ url+"" # check url can be concatenated with strings
1785 if "title" in print_tool:
1786- mapTitle = str(print_tool["mapTitle"])
1787+ mapTitle = unicode(print_tool["mapTitle"])
1788 else:
1789- mapTitle = str(T("Map from Sahana Eden"))
1790+ mapTitle = unicode(T("Map from Sahana Eden"))
1791 if "subtitle" in print_tool:
1792- subTitle = str(print_tool["subTitle"])
1793+ subTitle = unicode(print_tool["subTitle"])
1794 else:
1795- subTitle = str(T("Printed from Sahana Eden"))
1796+ subTitle = unicode(T("Printed from Sahana Eden"))
1797 if session.auth:
1798- creator = session.auth.user.email
1799+ creator = unicode(session.auth.user.email)
1800 else:
1801 creator = ""
1802- print_tool1 = "".join(("""
1803+ print_tool1 = u"".join(("""
1804 if (typeof(printCapabilities) != 'undefined') {
1805 // info.json from script headers OK
1806 printProvider = new GeoExt.data.PrintProvider({
1807@@ -3218,7 +3236,7 @@
1808 // printProvider: printProvider
1809 //});
1810 // A layer to display the print page extent
1811- //var pageLayer = new OpenLayers.Layer.Vector('""", str(T("Print Extent")), """');
1812+ //var pageLayer = new OpenLayers.Layer.Vector('""", unicode(T("Print Extent")), """');
1813 //pageLayer.addFeatures(printPage.feature);
1814 //pageLayer.setVisibility(false);
1815 //map.addLayer(pageLayer);
1816@@ -3234,7 +3252,7 @@
1817 //});
1818 // The form with fields controlling the print output
1819 S3.gis.printFormPanel = new Ext.form.FormPanel({
1820- title: '""", str(T("Print Map")), """',
1821+ title: '""", unicode(T("Print Map")), """',
1822 rootVisible: false,
1823 split: true,
1824 autoScroll: true,
1825@@ -3247,7 +3265,7 @@
1826 defaults: {anchor: '100%%'},
1827 listeners: {
1828 'expand': function() {
1829- //if (null == mapPanel.map.getLayersByName('""", str(T("Print Extent")), """')[0]) {
1830+ //if (null == mapPanel.map.getLayersByName('""", unicode(T("Print Extent")), """')[0]) {
1831 // mapPanel.map.addLayer(pageLayer);
1832 //}
1833 if (null == mapPanel.plugins[0]) {
1834@@ -3275,7 +3293,7 @@
1835 xtype: 'textarea',
1836 name: 'comment',
1837 value: '',
1838- fieldLabel: '""", str(T("Comment")), """',
1839+ fieldLabel: '""", unicode(T("Comment")), """',
1840 plugins: new GeoExt.plugins.PrintPageField({
1841 printPage: printPage
1842 })
1843@@ -3283,7 +3301,7 @@
1844 xtype: 'combo',
1845 store: printProvider.layouts,
1846 displayField: 'name',
1847- fieldLabel: '""", str(T("Layout")), """',
1848+ fieldLabel: '""", T("Layout").decode("utf-8"), """',
1849 typeAhead: true,
1850 mode: 'local',
1851 triggerAction: 'all',
1852@@ -3294,7 +3312,7 @@
1853 xtype: 'combo',
1854 store: printProvider.dpis,
1855 displayField: 'name',
1856- fieldLabel: '""", str(T("Resolution")), """',
1857+ fieldLabel: '""", unicode(T("Resolution")), """',
1858 tpl: '<tpl for="."><div class="x-combo-list-item">{name} dpi</div></tpl>',
1859 typeAhead: true,
1860 mode: 'local',
1861@@ -3311,7 +3329,7 @@
1862 // xtype: 'combo',
1863 // store: printProvider.scales,
1864 // displayField: 'name',
1865- // fieldLabel: '""", str(T("Scale")), """',
1866+ // fieldLabel: '""", unicode(T("Scale")), """',
1867 // typeAhead: true,
1868 // mode: 'local',
1869 // triggerAction: 'all',
1870@@ -3321,13 +3339,13 @@
1871 //}, {
1872 // xtype: 'textfield',
1873 // name: 'rotation',
1874- // fieldLabel: '""", str(T("Rotation")), """',
1875+ // fieldLabel: '""", unicode(T("Rotation")), """',
1876 // plugins: new GeoExt.plugins.PrintPageField({
1877 // printPage: printPage
1878 // })
1879 }],
1880 buttons: [{
1881- text: '""", str(T("Create PDF")), """',
1882+ text: '""", unicode(T("Create PDF")), """',
1883 handler: function() {
1884 // the PrintExtent plugin is the mapPanel's 1st plugin
1885 //mapPanel.plugins[0].print();
1886@@ -3345,7 +3363,7 @@
1887 } else {
1888 // Display error diagnostic
1889 S3.gis.printFormPanel = new Ext.Panel ({
1890- title: '""", str(T("Print Map")), """',
1891+ title: '""", unicode(T("Print Map")), """',
1892 rootVisible: false,
1893 split: true,
1894 autoScroll: true,
1895@@ -3356,7 +3374,7 @@
1896 bodyStyle: 'padding:5px',
1897 labelAlign: 'top',
1898 defaults: {anchor: '100%'},
1899- html: '""", str(T("Printing disabled since server not accessible")), """: <BR />""", url, """'
1900+ html: '""", unicode(T("Printing disabled since server not accessible")), """: <BR />""", unicode(url), """'
1901 });
1902 }
1903 """))
1904@@ -3442,40 +3460,40 @@
1905 name_safe = re.sub("'", "", layer.name)
1906 if layer.url2:
1907 url2 = """,
1908- url2: '%s'""" % layer.url2
1909+ "url2": "%s\"""" % layer.url2
1910 else:
1911 url2 = ""
1912 if layer.url3:
1913 url3 = """,
1914- url3: '%s'""" % layer.url3
1915+ "url3": "%s\"""" % layer.url3
1916 else:
1917 url3 = ""
1918 if layer.base:
1919 base = ""
1920 else:
1921 base = """,
1922- isBaseLayer: false"""
1923+ "isBaseLayer": false"""
1924 if layer.visible:
1925 visibility = ""
1926 else:
1927 visibility = """,
1928- visibility: false"""
1929+ "visibility": false"""
1930 if layer.attribution:
1931 attribution = """,
1932- attribution: '%s'""" % layer.attribution
1933+ "attribution": %s""" % repr(layer.attribution)
1934 else:
1935 attribution = ""
1936 if layer.zoom_levels is not None and layer.zoom_levels != 19:
1937 zoomLevels = """,
1938- zoomLevels: %i""" % layer.zoom_levels
1939+ "zoomLevels": %i""" % layer.zoom_levels
1940 else:
1941 zoomLevels = ""
1942
1943 # Generate JS snippet to pass to static
1944 layers_osm += """
1945 S3.gis.layers_osm[%i] = {
1946- name: '%s',
1947- url1: '%s'%s%s%s%s%s%s
1948+ "name": "%s",
1949+ "url1": "%s"%s%s%s%s%s%s
1950 }
1951 """ % (counter,
1952 name_safe,
1953@@ -3487,6 +3505,7 @@
1954 attribution,
1955 zoomLevels)
1956
1957+
1958 # ---------------------------------------------------------------------
1959 # XYZ
1960 # @ToDo: Migrate to Class/Static
1961@@ -3678,7 +3697,7 @@
1962
1963 if "active" in layer and not layer["active"]:
1964 visibility = """,
1965- visibility: false"""
1966+ "visibility": false"""
1967 else:
1968 visibility = ""
1969
1970@@ -3709,32 +3728,33 @@
1971 marker_url = ""
1972 if marker_url:
1973 markerLayer = """,
1974- marker_url: '%s',
1975- marker_height: %i,
1976- marker_width: %i""" % (marker_url, marker_height, marker_width)
1977+ "marker_url": "%s",
1978+ "marker_height": %i,
1979+ "marker_width": %i""" % (marker_url, marker_height, marker_width)
1980
1981 if "opacity" in layer and layer["opacity"] != 1:
1982 opacity = """,
1983- opacity: %.1f""" % layer["opacity"]
1984+ "opacity": %.1f""" % layer["opacity"]
1985 else:
1986 opacity = ""
1987 if "cluster_distance" in layer and layer["cluster_distance"] != self.cluster_distance:
1988 cluster_distance = """,
1989- cluster_distance: %i""" % layer["cluster_distance"]
1990+ "cluster_distance": %i""" % layer["cluster_distance"]
1991 else:
1992 cluster_distance = ""
1993 if "cluster_threshold" in layer and layer["cluster_threshold"] != self.cluster_threshold:
1994 cluster_threshold = """,
1995- cluster_threshold: %i""" % layer["cluster_threshold"]
1996+ "cluster_threshold": %i""" % layer["cluster_threshold"]
1997 else:
1998 cluster_threshold = ""
1999
2000 # Generate JS snippet to pass to static
2001 layers_feature_queries += """
2002 S3.gis.layers_feature_queries[%i] = {
2003- name: '%s',
2004- url: '%s'%s%s%s%s%s
2005-}""" % (counter,
2006+ "name": "%s",
2007+ "url": "%s"%s%s%s%s%s
2008+}
2009+""" % (counter,
2010 name,
2011 url,
2012 visibility,
2013@@ -3742,36 +3762,44 @@
2014 opacity,
2015 cluster_distance,
2016 cluster_threshold)
2017-
2018+
2019 # ---------------------------------------------------------------------
2020 # Add Layers from the Catalogue
2021 # ---------------------------------------------------------------------
2022 layers_config = ""
2023 if catalogue_layers:
2024 for LayerType in [
2025- #OSMLayer,
2026- Bing,
2027- Google,
2028- Yahoo,
2029- TMSLayer,
2030- WMSLayer,
2031- FeatureLayer,
2032- GeoJSONLayer,
2033- GeoRSSLayer,
2034- GPXLayer,
2035- KMLLayer,
2036- WFSLayer
2037- ]:
2038- # Instantiate the Class
2039- layer = LayerType(self)
2040- layer_type_js = layer.as_javascript()
2041- if layer_type_js:
2042- # Add to the output JS
2043- layers_config = "".join((layers_config,
2044- layer_type_js))
2045- if layer.scripts:
2046- for script in layer.scripts:
2047- add_javascript(script)
2048+ #OSMLayer,
2049+ BingLayer,
2050+ GoogleLayer,
2051+ YahooLayer,
2052+ TMSLayer,
2053+ WMSLayer,
2054+ FeatureLayer,
2055+ GeoJSONLayer,
2056+ GeoRSSLayer,
2057+ GPXLayer,
2058+ KMLLayer,
2059+ WFSLayer
2060+ ]:
2061+ try:
2062+ # Instantiate the Class
2063+ layer = LayerType(self)
2064+ layer_type_js = layer.as_javascript()
2065+ if layer_type_js:
2066+ # Add to the output JS
2067+ layers_config = "".join((layers_config,
2068+ layer_type_js))
2069+ if layer.scripts:
2070+ for script in layer.scripts:
2071+ add_javascript(script)
2072+ except Exception, exception:
2073+ if debug:
2074+ raise
2075+ else:
2076+ session.warning.append(
2077+ LayerType.__name__ + " not shown due to error"
2078+ )
2079
2080 # -----------------------------------------------------------------
2081 # Coordinate Grid - only one possible
2082@@ -3845,6 +3873,7 @@
2083 "S3.gis.marker_default_width = %i;\n" % marker_default.width,
2084 osm_auth,
2085 layers_osm,
2086+ layers_feature_queries,
2087 _features,
2088 layers_config,
2089 # i18n Labels
2090@@ -3877,6 +3906,7 @@
2091 ))))
2092
2093 # Static Script
2094+
2095 if debug:
2096 add_javascript("scripts/S3/s3.gis.js")
2097 add_javascript("scripts/S3/s3.gis.layers.js")
2098@@ -3886,7 +3916,6 @@
2099
2100 # Dynamic Script (stuff which should, as far as possible, be moved to static)
2101 html.append(SCRIPT(layers_js + \
2102- #layers_xyz + \
2103 print_tool1))
2104
2105 # Set up map plugins
2106@@ -3901,8 +3930,7 @@
2107
2108 return html
2109
2110-
2111-# =============================================================================
2112+# -----------------------------------------------------------------------------
2113 class Marker(object):
2114 """ Represents a Map Marker """
2115 def __init__(self, gis, id=None):
2116@@ -3935,6 +3963,13 @@
2117 #self.url = URL(c="static", f="img",
2118 # args=["markers", marker.image])
2119
2120+ def add_attributes_to_output(self, output):
2121+ output.update(
2122+ marker_image = self.image,
2123+ marker_height = self.height,
2124+ marker_width = self.width,
2125+ )
2126+
2127 # -----------------------------------------------------------------------------
2128 class Projection(object):
2129 """ Represents a Map Projection """
2130@@ -3960,41 +3995,34 @@
2131 self.epsg = projection.epsg
2132
2133 # -----------------------------------------------------------------------------
2134+
2135+def config_dict(mandatory, defaulted):
2136+ d = dict(mandatory)
2137+ for key, (value, defaults) in defaulted.iteritems():
2138+ if value not in defaults:
2139+ d[key] = value
2140+ return d
2141+
2142+
2143+# The layer code only needs to do:
2144+#  - any database lookups to fetch extra data
2145+#  - security checks.
2146+
2147+# Then it generates the appropriate JSON strings.
2148+
2149 class Layer(object):
2150 """
2151- Base Class for Layers
2152- Not meant to be instantiated direct
2153+ Abstract Base Class for Layers
2154 """
2155- def __init__(self, gis, record=None):
2156+ def __init__(self, gis):
2157+ self.gis = gis
2158 db = current.db
2159-
2160- self.gis = gis
2161- # This usually arrives later
2162- self.record = record
2163- # Ensure all attributes available (even if Null)
2164- self._refresh()
2165- self.scripts = []
2166-
2167- def add_record(self, record):
2168- """
2169- Update the record & refresh the attributes
2170- """
2171- if record:
2172- self.record = record
2173- self._refresh()
2174- else:
2175- return
2176-
2177- def as_dict(self):
2178- """
2179- Output the Layer as a Python dictionary
2180- - this is used to build a JSON of the overall dict of layers
2181- """
2182- record = self.record
2183- if record:
2184- return record
2185- else:
2186- return
2187+ try:
2188+ self.table = db[self.table_name]
2189+ except:
2190+ current.manager.load(self.table_name)
2191+            self.table = db[self.table_name]
2192+
2193
2194 def as_json(self):
2195 """
2196@@ -4006,674 +4034,502 @@
2197 else:
2198 return
2199
2200- def as_javascript(self):
2201- """
2202- Output the Layer as Javascript
2203- - suitable for inclusion in the HTML page
2204- """
2205- gis = self.gis
2206- auth = gis.auth
2207- db = current.db
2208- table = self.table
2209-
2210- layer_type_list = []
2211- # Read the enabled Layers
2212- records = db(table.enabled == True).select()
2213- for record in records:
2214- # Check user is allowed to access the layer
2215- role_required = record.role_required
2216- if (not role_required) or auth.s3_has_role(role_required):
2217- # Pass the record to the Class
2218- self.add_record(record)
2219- # Read the output dict for this layer
2220- layer_dict = self.as_dict()
2221- if layer_dict:
2222- # Add this layer to the list of layers for this layer type
2223- layer_type_list.append(layer_dict)
2224-
2225- if layer_type_list:
2226- # Output the Layer Type as JSON
2227- layer_type_json = json.dumps(layer_type_list,
2228- sort_keys=True,
2229- indent=4)
2230- layer_type_js = "".join(("%s = " % self.js_array,
2231- layer_type_json,
2232- "\n"))
2233- return layer_type_js
2234-
2235- def _name_safe(self, name):
2236- """
2237- Make the name safe for use in JSON
2238- i.e. any Unicode character allowed except for " & \
2239- """
2240- return re.sub('[\\"]', "", name)
2241-
2242- def _refresh(self):
2243- " Refresh the attributes of the Layer "
2244- table = self.table
2245- if "marker_id" in table:
2246- self.set_marker()
2247- if "projection_id" in table:
2248- self.set_projection()
2249-
2250- def set_marker(self):
2251- " Set the Marker for the Layer "
2252- gis = self.gis
2253- record = self.record
2254- if record:
2255- marker = Marker(gis, record.marker_id)
2256- self.marker = marker
2257- else:
2258- self.marker = None
2259-
2260- def set_projection(self):
2261- " Set the Projection for the Layer "
2262- gis = self.gis
2263- record = self.record
2264- if record:
2265- projection = Projection(gis, record.projection_id)
2266- self.projection = projection
2267- else:
2268- self.projection = None
2269-
2270-# -----------------------------------------------------------------------------
2271-class OneLayer(Layer):
2272- """
2273- Base Class for Layers with just a single record
2274- Not meant to be instantiated direct
2275- """
2276-
2277- def __init__(self, gis, record=None):
2278- db = current.db
2279- tablename = ""
2280- try:
2281- table = db[tablename]
2282- except:
2283- current.manager.load(tablename)
2284- table = db[tablename]
2285- if not record:
2286- # There is only ever 1 layer
2287- record = db(table.id > 0).select().first()
2288-
2289- self.gis = gis
2290- self.table = table
2291- self.js_array = "S3.gis.OneLayer"
2292- self.record = record
2293- self._refresh()
2294- self.scripts = []
2295-
2296- def as_javascript(self):
2297- """
2298- Output the Layer as Javascript
2299- - suitable for inclusion in the HTML page
2300- """
2301- auth = self.gis.auth
2302- record = self.record
2303- # Check Layer exists in the DB
2304- if not record:
2305- return None
2306- # Check Layer is enabled
2307- if not record.enabled:
2308- return None
2309- # Check user is allowed to access the Layer
2310- role_required = record.role_required
2311- if role_required and not auth.s3_has_role(role_required):
2312- return None
2313- # Read the output JSON for this layer
2314- layer_type_json = self.as_json()
2315- layer_type_js = "".join(("%s = " % self.js_array,
2316- layer_type_json,
2317- "\n"))
2318- return layer_type_js
2319-
2320- def _set_api_key(self):
2321- " Set the API Key for the Layer "
2322- record = self.record
2323- if record:
2324- self.apikey = record.apikey
2325- else:
2326- self.apikey = None
2327-
2328- def _refresh(self):
2329- " Refresh the attributes of the Layer "
2330- table = self.table
2331- if "apikey" in table:
2332- self._set_api_key()
2333-
2334-# -----------------------------------------------------------------------------
2335-class Bing(OneLayer):
2336- """ Bing Layers from Catalogue """
2337- def __init__(self, gis, record=None):
2338- db = current.db
2339- tablename = "gis_layer_bing"
2340- try:
2341- table = db[tablename]
2342- except:
2343- current.manager.load(tablename)
2344- table = db[tablename]
2345- if not record:
2346- # There is only ever 1 layer
2347- record = db(table.id > 0).select().first()
2348-
2349- self.gis = gis
2350- self.table = table
2351- self.js_array = "S3.gis.Bing"
2352- self.record = record
2353- self._refresh()
2354- self.scripts = []
2355-
2356+
2357+
2358+# -----------------------------------------------------------------------------
2359+class SingleRecordLayer(Layer):
2360+ """
2361+ Abstract Base Class for Layers with just a single record
2362+ """
2363+
2364+ def __init__(self, gis):
2365+ super(SingleRecordLayer, self).__init__(gis)
2366+ table = self.table
2367+ records = current.db(table.id > 0).select()
2368+ assert len(records) <= 1, (
2369+ "There should only ever be 0 or 1 %s" % self.__class__.__name__
2370+ )
2371+ self.record = None
2372+ record = records.first()
2373+ if record is not None:
2374+ if record.enabled:
2375+ role_required = record.role_required
2376+ if not role_required or self.gis.auth.s3_has_role(role_required):
2377+ self.record = record
2378+ # Refresh the attributes of the Layer
2379+ if "apikey" in table:
2380+ if record:
2381+ self.apikey = record.apikey
2382+ else:
2383+ self.apikey = None
2384+ self.gis = gis
2385+ self.scripts = []
2386+
2387+ def as_javascript(self):
2388+ """
2389+ Output the Layer as Javascript
2390+ - suitable for inclusion in the HTML page
2391+ """
2392+ if self.record:
2393+ if "apikey" in self.table and not self.apikey:
2394+ raise Exception("Cannot display a %s if we have no valid API Key" % self.__class__.__name__)
2395+ json = self.as_json()
2396+ if json:
2397+ return "%s = %s\n" % (
2398+ self.js_array,
2399+ json
2400+ )
2401+ else:
2402+ return None
2403+ else:
2404+ return None
2405+
2406+# -----------------------------------------------------------------------------
2407+class BingLayer(SingleRecordLayer):
2408+ """ Bing Layer from Catalogue """
2409+ table_name = "gis_layer_bing"
2410+ js_array = "S3.gis.Bing"
2411+
2412 def as_dict(self):
2413 gis = self.gis
2414 record = self.record
2415- apikey = self.apikey
2416-
2417- if not apikey:
2418- # Cannot display Bing layers if we have no valid API Key
2419- return None
2420-
2421- config = gis.get_config()
2422- if Projection(gis, id=config.projection_id).epsg != 900913:
2423- # Cannot display Bing layers unless we're using the
2424- # Spherical Mercator Projection
2425- return None
2426-
2427- # Mandatory attributes
2428- output = {
2429- "ApiKey": self.apikey
2430- }
2431-
2432- # Attributes which are defaulted client-side if not set
2433- if record.aerial_enabled:
2434- output["Aerial"] = record.aerial or "Bing Satellite"
2435- if record.road_enabled:
2436- output["Road"] = record.road or "Bing Roads"
2437- if record.hybrid_enabled:
2438- output["Hybrid"] = record.hybrid or "Bing Hybrid"
2439-
2440- return output
2441+ if record is not None:
2442+ config = self.gis.get_config()
2443+ if Projection(gis, id=config.projection_id).epsg != 900913:
2444+ raise Exception("Cannot display Bing layers unless we're using the Spherical Mercator Projection")
2445+ else:
2446+ # Mandatory attributes
2447+ output = {
2448+ "ApiKey": self.apikey
2449+ }
2450+
2451+ # Attributes which are defaulted client-side if not set
2452+ if record.aerial_enabled:
2453+ output["Aerial"] = record.aerial or "Bing Satellite"
2454+ if record.road_enabled:
2455+ output["Road"] = record.road or "Bing Roads"
2456+ if record.hybrid_enabled:
2457+ output["Hybrid"] = record.hybrid or "Bing Hybrid"
2458+ return output
2459+ else:
2460+ return None
2461
2462 # -----------------------------------------------------------------------------
2463-class Google(OneLayer):
2464+class GoogleLayer(SingleRecordLayer):
2465 """
2466 Google Layers/Tools from Catalogue
2467 """
2468- def __init__(self, gis, record=None):
2469- db = current.db
2470- tablename = "gis_layer_google"
2471- try:
2472- table = db[tablename]
2473- except:
2474- current.manager.load(tablename)
2475- table = db[tablename]
2476- debug = current.session.s3.debug
2477- if not record:
2478- # There is only ever 1 layer
2479- record = db(table.id > 0).select().first()
2480-
2481- self.gis = gis
2482- self.table = table
2483- self.js_array = "S3.gis.Google"
2484- self.record = record
2485- self._refresh()
2486-
2487- if record:
2488+ table_name = "gis_layer_google"
2489+ js_array = "S3.gis.Google"
2490+
2491+ def __init__(self, gis):
2492+ super(GoogleLayer, self).__init__(gis)
2493+ record = self.record
2494+ if record is not None:
2495+ debug = current.session.s3.debug
2496+ add_script = self.scripts.append
2497 if record.mapmaker_enabled or record.mapmakerhybrid_enabled:
2498 # Need to use v2 API
2499 # http://code.google.com/p/gmaps-api-issues/issues/detail?id=2349
2500- self.scripts = ["http://maps.google.com/maps?file=api&v=2&key=%s" % self.apikey]
2501+ add_script("http://maps.google.com/maps?file=api&v=2&key=%s" % self.apikey)
2502 else:
2503 # v3 API
2504- self.scripts = ["http://maps.google.com/maps/api/js?v=3.2&sensor=false"]
2505+ add_script("http://maps.google.com/maps/api/js?v=3.2&sensor=false")
2506 if debug and record.streetview_enabled:
2507- self.scripts.append("scripts/gis/gxp/widgets/GoogleStreetViewPanel.js")
2508+ add_script("scripts/gis/gxp/widgets/GoogleStreetViewPanel.js")
2509 if record.earth_enabled:
2510- self.scripts.append("http://www.google.com/jsapi?key=%s" % self.apikey)
2511- self.scripts.append(SCRIPT("google && google.load('earth', '1');", _type="text/javascript"))
2512+ add_script("http://www.google.com/jsapi?key=%s" % self.apikey)
2513+ add_script(SCRIPT("google && google.load('earth', '1');", _type="text/javascript"))
2514 if debug:
2515- self.scripts.append("scripts/gis/gxp/widgets/GoogleEarthPanel.js")
2516- else:
2517- self.scripts = []
2518+ add_script("scripts/gis/gxp/widgets/GoogleEarthPanel.js")
2519
2520 def as_dict(self):
2521 gis = self.gis
2522 T = current.T
2523 record = self.record
2524- apikey = self.apikey
2525-
2526- if not apikey and (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
2527- # Cannot display Google layers if we have no valid API Key
2528+ if record is not None:
2529+ config = gis.get_config()
2530+ if Projection(gis, id=config.projection_id).epsg != 900913:
2531+ if record.earth_enabled:
2532+ # But the Google Earth panel can still be enabled
2533+ return {
2534+ "Earth": str(T("Switch to 3D"))
2535+ }
2536+ else:
2537+ raise Exception("Cannot display Google layers unless we're using the Spherical Mercator Projection")
2538+
2539+
2540+ # Mandatory attributes
2541+ #"ApiKey": self.apikey
2542+ output = {
2543+ }
2544+
2545+ # Attributes which are defaulted client-side if not set
2546+ if record.satellite_enabled:
2547+ output["Satellite"] = record.satellite or "Google Satellite"
2548+ if record.maps_enabled:
2549+ output["Maps"] = record.maps or "Google Maps"
2550+ if record.hybrid_enabled:
2551+ output["Hybrid"] = record.hybrid or "Google Hybrid"
2552+ if record.mapmaker_enabled:
2553+ output["MapMaker"] = record.mapmaker or "Google MapMaker"
2554+ if record.mapmakerhybrid_enabled:
2555+ output["MapMakerHybrid"] = record.mapmakerhybrid or "Google MapMaker Hybrid"
2556+ if record.earth_enabled:
2557+ output["Earth"] = str(T("Switch to 3D"))
2558+ if record.streetview_enabled and not (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
2559+ # Streetview doesn't work with v2 API
2560+ output["StreetviewButton"] = str(T("Click where you want to open Streetview"))
2561+ output["StreetviewTitle"] = str(T("Street View"))
2562+
2563+ return output
2564+ else:
2565 return None
2566
2567- config = gis.get_config()
2568- if Projection(gis, id=config.projection_id).epsg != 900913:
2569- # Cannot display Google layers unless we're using the
2570- # Spherical Mercator Projection
2571- if record.earth_enabled:
2572- # But the Google Earth panel can still be enabled
2573- output = {
2574- "Earth": str(T("Switch to 3D"))
2575- }
2576- return output
2577- else:
2578- return None
2579-
2580- # Mandatory attributes
2581- #"ApiKey": self.apikey
2582- output = {
2583- }
2584-
2585- # Attributes which are defaulted client-side if not set
2586- if record.satellite_enabled:
2587- output["Satellite"] = record.satellite or "Google Satellite"
2588- if record.maps_enabled:
2589- output["Maps"] = record.maps or "Google Maps"
2590- if record.hybrid_enabled:
2591- output["Hybrid"] = record.hybrid or "Google Hybrid"
2592- if record.mapmaker_enabled:
2593- output["MapMaker"] = record.mapmaker or "Google MapMaker"
2594- if record.mapmakerhybrid_enabled:
2595- output["MapMakerHybrid"] = record.mapmakerhybrid or "Google MapMaker Hybrid"
2596- if record.earth_enabled:
2597- output["Earth"] = str(T("Switch to 3D"))
2598- if record.streetview_enabled and not (record.mapmaker_enabled or record.mapmakerhybrid_enabled):
2599- # Streetview doesn't work with v2 API
2600- output["StreetviewButton"] = str(T("Click where you want to open Streetview"))
2601- output["StreetviewTitle"] = str(T("Street View"))
2602-
2603- return output
2604-
2605 # -----------------------------------------------------------------------------
2606-class Yahoo(OneLayer):
2607+class YahooLayer(SingleRecordLayer):
2608 """
2609 Yahoo Layers from Catalogue
2610
2611 NB This will stop working on 13 September 2011
2612 http://developer.yahoo.com/blogs/ydn/posts/2011/06/yahoo-maps-apis-service-closure-announcement-new-maps-offerings-coming-soon/
2613 """
2614- def __init__(self, gis, record=None):
2615- db = current.db
2616- tablename = "gis_layer_yahoo"
2617- try:
2618- table = db[tablename]
2619- except:
2620- current.manager.load(tablename)
2621- table = db[tablename]
2622- if not record:
2623- # There is only ever 1 layer
2624- record = db(table.id > 0).select().first()
2625-
2626- self.gis = gis
2627- self.table = table
2628- self.js_array = "S3.gis.Yahoo"
2629- self.record = record
2630- self._refresh()
2631- if record:
2632- self.scripts = ["http://api.maps.yahoo.com/ajaxymap?v=3.8&appid=%s" % self.apikey]
2633- else:
2634- self.scripts = []
2635+ js_array = "S3.gis.Yahoo"
2636+ table_name = "gis_layer_yahoo"
2637+
2638+ def __init__(self, gis):
2639+ super(YahooLayer, self).__init__(gis)
2640+ if self.record:
2641+ self.scripts.append("http://api.maps.yahoo.com/ajaxymap?v=3.8&appid=%s" % self.apikey)
2642+ config = gis.get_config()
2643+ if Projection(gis, id=config.projection_id).epsg != 900913:
2644+ raise Exception("Cannot display Yahoo layers unless we're using the Spherical Mercator Projection")
2645
2646 def as_dict(self):
2647- gis = self.gis
2648 record = self.record
2649- apikey = self.apikey
2650-
2651- if not apikey:
2652- # Cannot display Yahoo layers if we have no valid API Key
2653- return None
2654-
2655- config = gis.get_config()
2656- if Projection(gis, id=config.projection_id).epsg != 900913:
2657- # Cannot display Yahoo layers unless we're using the
2658- # Spherical Mercator Projection
2659- return None
2660-
2661- # Mandatory attributes
2662- #"ApiKey": self.apikey
2663- output = {
2664- }
2665-
2666- # Attributes which are defaulted client-side if not set
2667- if record.satellite_enabled:
2668- output["Satellite"] = record.satellite or "Yahoo Satellite"
2669- if record.maps_enabled:
2670- output["Maps"] = record.maps or "Yahoo Maps"
2671- if record.hybrid_enabled:
2672- output["Hybrid"] = record.hybrid or "Yahoo Hybrid"
2673-
2674- return output
2675+ if record is not None:
2676+ # Mandatory attributes
2677+ #"ApiKey": self.apikey
2678+ output = {
2679+ }
2680+
2681+ # Attributes which are defaulted client-side if not set
2682+ if record.satellite_enabled:
2683+ output["Satellite"] = record.satellite or "Yahoo Satellite"
2684+ if record.maps_enabled:
2685+ output["Maps"] = record.maps or "Yahoo Maps"
2686+ if record.hybrid_enabled:
2687+ output["Hybrid"] = record.hybrid or "Yahoo Hybrid"
2688+
2689+ return output
2690+ else:
2691+ return None
2692+
2693+class MultiRecordLayer(Layer):
2694+ def __init__(self, gis):
2695+ super(MultiRecordLayer, self).__init__(gis)
2696+ self.sublayers = []
2697+ self.scripts = []
2698+
2699+ auth = gis.auth
2700+
2701+ layer_type_list = []
2702+ # Read the enabled Layers
2703+ for record in current.db(self.table.enabled == True).select():
2704+ # Check user is allowed to access the layer
2705+ role_required = record.role_required
2706+ if (not role_required) or auth.s3_has_role(role_required):
2707+ self.sublayers.append(self.SubLayer(gis, record))
2708+
2709+ def as_javascript(self):
2710+ """
2711+ Output the Layer as Javascript
2712+ - suitable for inclusion in the HTML page
2713+ """
2714+ sublayer_dicts = []
2715+ for sublayer in self.sublayers:
2716+ # Read the output dict for this sublayer
2717+ sublayer_dict = sublayer.as_dict()
2718+ if sublayer_dict:
2719+ # Add this layer to the list of layers for this layer type
2720+ sublayer_dicts.append(sublayer_dict)
2721+
2722+ if sublayer_dicts:
2723+ # Output the Layer Type as JSON
2724+ layer_type_json = json.dumps(sublayer_dicts,
2725+ sort_keys=True,
2726+ indent=4)
2727+ return "%s = %s\n" % (self.js_array, layer_type_json)
2728+ else:
2729+ return None
2730+
2731+ class SubLayer(object):
2732+ def __init__(self, gis, record):
2733+ # Ensure all attributes available (even if Null)
2734+ self.gis = gis
2735+ self.__dict__.update(record)
2736+ del record
2737+ self.safe_name = re.sub('[\\"]', "", self.name)
2738+
2739+ if hasattr(self, "marker_id"):
2740+ self.marker = Marker(gis, self.marker_id)
2741+ if hasattr(self, "projection_id"):
2742+ self.projection = Projection(gis, self.projection_id)
2743+
2744+ def setup_clustering(self, output):
2745+ gis = self.gis
2746+ cluster_distance = gis.cluster_distance
2747+ cluster_threshold = gis.cluster_threshold
2748+ if self.cluster_distance != cluster_distance:
2749+ output["cluster_distance"] = self.cluster_distance
2750+ if self.cluster_threshold != cluster_threshold:
2751+ output["cluster_threshold"] = self.cluster_threshold
2752+
2753+ def setup_visibility_and_opacity(self, output):
2754+ if not self.visible:
2755+ output["visibility"] = False
2756+ if self.opacity != 1:
2757+ output["opacity"] = "%.1f" % self.opacity
2758+
2759+ def add_attributes_if_not_default(self, output, **values_and_defaults):
2760+ # could also write values in debug mode, to check if defaults ignored.
2761+ # could also check values are not being overwritten.
2762+ for key, (value, defaults) in values_and_defaults.iteritems():
2763+ if value not in defaults:
2764+ output[key] = value
2765+
2766+ #def set_marker(self):
2767+ # " Set the Marker for the Layer "
2768+ # gis = self.gis
2769+ # self.marker = Marker(gis, self.marker_id)
2770+
2771+ #def set_projection(self):
2772+ # " Set the Projection for the Layer "
2773+ # gis = self.gis
2774+ # self.projection = Projection(gis, self.projection_id)
2775
2776 # -----------------------------------------------------------------------------
2777-class FeatureLayer(Layer):
2778+class FeatureLayer(MultiRecordLayer):
2779 """ Feature Layer from Catalogue """
2780- def __init__(self, gis, record=None):
2781- db = current.db
2782- tablename = "gis_layer_feature"
2783- try:
2784- table = db[tablename]
2785- except:
2786- current.manager.load(tablename)
2787- table = db[tablename]
2788-
2789- self.gis = gis
2790- self.table = table
2791- self.js_array = "S3.gis.layers_features"
2792- self.record = record
2793- self._refresh()
2794- self.scripts = []
2795-
2796- def as_dict(self):
2797- gis = self.gis
2798- cluster_distance = gis.cluster_distance
2799- cluster_threshold = gis.cluster_threshold
2800- record = self.record
2801- marker = self.marker
2802-
2803- auth = gis.auth
2804- request = current.request
2805- deployment_settings = gis.deployment_settings
2806-
2807- if record.module not in deployment_settings.modules:
2808- # Module is disabled
2809- return
2810- if not auth.permission(c=record.module, f=record.resource):
2811- # User has no permission to this resource (in ACL)
2812- return
2813-
2814- name_safe = self._name_safe(record.name)
2815-
2816- url = "%s.geojson?layer=%i" % (URL(record.module,record.resource),
2817- record.id)
2818- if record.filter:
2819- url = "%s&%s" % (url, record.filter)
2820-
2821- # Mandatory attributes
2822- output = {
2823- "name": name_safe,
2824+ table_name = "gis_layer_feature"
2825+ js_array = "S3.gis.layers_features"
2826+
2827+ class SubLayer(MultiRecordLayer.SubLayer):
2828+ def __init__(self, gis, record):
2829+ record_module = record.module
2830+ if record_module is not None:
2831+ if record_module not in gis.deployment_settings.modules:
2832+ raise Exception("%s module is disabled" % record_module)
2833+ if not gis.auth.permission(c=record.module, f=record.resource):
2834+ raise Exception("User has no permission to this resource (in ACL)")
2835+ else:
2836+ raise Exception("FeatureLayer Record '%s' has no module" % record.name)
2837+ super(FeatureLayer.SubLayer, self).__init__(gis, record)
2838+
2839+ def as_dict(self):
2840+ gis = self.gis
2841+
2842+ request = current.request
2843+ deployment_settings = gis.deployment_settings
2844+
2845+ url = "%s.geojson?layer=%i" % (URL(self.module, self.resource),
2846+ self.id)
2847+ if self.filter:
2848+ url = "%s&%s" % (url, self.filter)
2849+
2850+ # Mandatory attributes
2851+ output = {
2852+ "name": self.safe_name,
2853 "url": url,
2854- "marker_image": marker.image,
2855- "marker_height": marker.height,
2856- "marker_width": marker.width,
2857 }
2858-
2859- # Attributes which are defaulted client-side if not set
2860- if not record.visible:
2861- output["visibility"] = False
2862- if record.opacity != 1:
2863- output["opacity"] = "%.1f" % record.opacity
2864- if record.cluster_distance != cluster_distance:
2865- output["cluster_distance"] = record.cluster_distance
2866- if record.cluster_threshold != cluster_threshold:
2867- output["cluster_threshold"] = record.cluster_threshold
2868-
2869- return output
2870+ self.marker.add_attributes_to_output(output)
2871+ self.setup_visibility_and_opacity(output)
2872+ self.setup_clustering(output)
2873+
2874+ return output
2875
2876 # -----------------------------------------------------------------------------
2877-class GeoJSONLayer(Layer):
2878+class GeoJSONLayer(MultiRecordLayer):
2879 """ GeoJSON Layer from Catalogue """
2880- def __init__(self, gis, record=None):
2881- db = current.db
2882- tablename = "gis_layer_geojson"
2883- try:
2884- table = db[tablename]
2885- except:
2886- current.manager.load(tablename)
2887- table = db[tablename]
2888-
2889- self.gis = gis
2890- self.table = table
2891- self.js_array = "S3.gis.layers_geojson"
2892- self.record = record
2893- self._refresh()
2894- self.scripts = []
2895-
2896- def as_dict(self):
2897- gis = self.gis
2898- cluster_distance = gis.cluster_distance
2899- cluster_threshold = gis.cluster_threshold
2900- record = self.record
2901- marker = self.marker
2902- projection = self.projection
2903-
2904- name_safe = self._name_safe(record.name)
2905- # Mandatory attributes
2906- output = {
2907- "name": name_safe,
2908- "url": record.url,
2909- "marker_image": marker.image,
2910- "marker_height": marker.height,
2911- "marker_width": marker.width,
2912+ table_name = "gis_layer_geojson"
2913+ js_array = "S3.gis.layers_geojson"
2914+
2915+ class SubLayer(MultiRecordLayer.SubLayer):
2916+ def as_dict(self):
2917+ # Mandatory attributes
2918+ output = {
2919+ "name": self.safe_name,
2920+ "url": self.url,
2921 }
2922-
2923- # Attributes which are defaulted client-side if not set
2924- if projection.epsg != 4326:
2925- output["projection"] = projection.epsg
2926- if not record.visible:
2927- output["visibility"] = False
2928- if record.opacity != 1:
2929- output["opacity"] = "%.1f" % record.opacity
2930- if record.cluster_distance != cluster_distance:
2931- output["cluster_distance"] = record.cluster_distance
2932- if record.cluster_threshold != cluster_threshold:
2933- output["cluster_threshold"] = record.cluster_threshold
2934-
2935- return output
2936+ self.marker.add_attributes_to_output(output)
2937+
2938+ # Attributes which are defaulted client-side if not set
2939+ projection = self.projection
2940+ if projection.epsg != 4326:
2941+ output["projection"] = projection.epsg
2942+ self.setup_visibility_and_opacity(output)
2943+ self.setup_clustering(output)
2944+
2945+ return output
2946
2947 # -----------------------------------------------------------------------------
2948-class GeoRSSLayer(Layer):
2949+class GeoRSSLayer(MultiRecordLayer):
2950 """ GeoRSS Layer from Catalogue """
2951- def __init__(self, gis, record=None):
2952- db = current.db
2953- tablename = "gis_layer_georss"
2954- try:
2955- table = db[tablename]
2956- except:
2957- current.manager.load(tablename)
2958- table = db[tablename]
2959-
2960- self.gis = gis
2961- self.table = table
2962- self.cachetable = db.gis_cache
2963- self.js_array = "S3.gis.layers_georss"
2964- self.record = record
2965- self._refresh()
2966- self.scripts = []
2967-
2968- def as_dict(self):
2969- gis = self.gis
2970- cluster_distance = gis.cluster_distance
2971- cluster_threshold = gis.cluster_threshold
2972- record = self.record
2973- marker = self.marker
2974-
2975- db = current.db
2976- request = current.request
2977- public_url = gis.public_url
2978- cachetable = self.cachetable
2979-
2980- url = record.url
2981- # Check to see if we should Download layer to the cache
2982- download = True
2983- query = (cachetable.source == url)
2984- cached = db(query).select(cachetable.modified_on,
2985- limitby=(0, 1)).first()
2986- refresh = record.refresh or 900 # 15 minutes set if we have no data (legacy DB)
2987- if cached:
2988- modified_on = cached.modified_on
2989- cutoff = modified_on + timedelta(seconds=refresh)
2990- if request.utcnow < cutoff:
2991- download = False
2992- if download:
2993- # Download layer to the Cache
2994- # @ToDo: Call directly without going via HTTP
2995- # s3mgr = current.manager
2996- # @ToDo: Make this async by using Celery (also use this for the refresh time)
2997- fields = ""
2998- if record.data:
2999- fields = "&data_field=%s" % record.data
3000- if record.image:
3001- fields = "%s&image_field=%s" % (fields, record.image)
3002- _url = "%s%s/update.georss?fetchurl=%s%s" % (public_url,
3003- URL(c="gis", f="cache_feed"),
3004- url,
3005- fields)
3006- try:
3007- # @ToDo: Need to commit to not have DB locked with SQLite?
3008- fetch(_url)
3009- if cached:
3010- # Clear old records which are no longer active
3011- query = (cachetable.source == url) & \
3012- (cachetable.modified_on < cutoff)
3013- db(query).delete()
3014- except:
3015- # Feed down
3016- if cached:
3017- # Use cached copy
3018- # Should we Update timestamp to prevent every
3019- # subsequent request attempting the download?
3020- #query = (cachetable.source == url)
3021- #db(query).update(modified_on=request.utcnow)
3022- pass
3023- else:
3024- # No cached copy available - skip layer
3025- return
3026-
3027- name_safe = self._name_safe(record.name)
3028-
3029- # Pass the GeoJSON URL to the client
3030- # Filter to the source of this feed
3031- url = "%s.geojson?cache.source=%s" % (URL(c="gis", f="cache_feed"),
3032- url)
3033-
3034- # Mandatory attributes
3035- output = {
3036- "name": name_safe,
3037- "url": url,
3038- "marker_image": marker.image,
3039- "marker_height": marker.height,
3040- "marker_width": marker.width,
3041- }
3042-
3043- # Attributes which are defaulted client-side if not set
3044- if record.refresh != 900:
3045- output["refresh"] = record.refresh
3046- if not record.visible:
3047- output["visibility"] = False
3048- if record.opacity != 1:
3049- output["opacity"] = "%.1f" % record.opacity
3050- if record.cluster_distance != cluster_distance:
3051- output["cluster_distance"] = record.cluster_distance
3052- if record.cluster_threshold != cluster_threshold:
3053- output["cluster_threshold"] = record.cluster_threshold
3054-
3055- return output
3056+ table_name = "gis_layer_georss"
3057+ js_array = "S3.gis.layers_georss"
3058+
3059+ def __init__(self, gis):
3060+ super(GeoRSSLayer, self).__init__(gis)
3061+ GeoRSSLayer.SubLayer.cachetable = current.db.gis_cache
3062+
3063+ class SubLayer(MultiRecordLayer.SubLayer):
3064+ def as_dict(self):
3065+ gis = self.gis
3066+
3067+ db = current.db
3068+ request = current.request
3069+ public_url = gis.public_url
3070+ cachetable = self.cachetable
3071+
3072+ url = self.url
3073+ # Check to see if we should Download layer to the cache
3074+ download = True
3075+ query = (cachetable.source == url)
3076+ existing_cached_copy = db(query).select(cachetable.modified_on,
3077+ limitby=(0, 1)).first()
3078+ refresh = self.refresh or 900 # 15 minutes set if we have no data (legacy DB)
3079+ if existing_cached_copy:
3080+ modified_on = existing_cached_copy.modified_on
3081+ cutoff = modified_on + timedelta(seconds=refresh)
3082+ if request.utcnow < cutoff:
3083+ download = False
3084+ if download:
3085+ # Download layer to the Cache
3086+ # @ToDo: Call directly without going via HTTP
3087+ # s3mgr = current.manager
3088+ # @ToDo: Make this async by using Celery (also use this for the refresh time)
3089+ fields = ""
3090+ if self.data:
3091+ fields = "&data_field=%s" % self.data
3092+ if self.image:
3093+ fields = "%s&image_field=%s" % (fields, self.image)
3094+ _url = "%s%s/update.georss?fetchurl=%s%s" % (public_url,
3095+ URL(c="gis", f="cache_feed"),
3096+ url,
3097+ fields)
3098+ try:
3099+ # @ToDo: Need to commit to not have DB locked with SQLite?
3100+ fetch(_url)
3101+ if existing_cached_copy:
3102+                        # Clear old records which are no longer active
3103+ query = (cachetable.source == url) & \
3104+ (cachetable.modified_on < cutoff)
3105+ db(query).delete()
3106+ except:
3107+ # Feed down
3108+ if existing_cached_copy:
3109+ # Use cached copy
3110+ # Should we Update timestamp to prevent every
3111+ # subsequent request attempting the download?
3112+ #query = (cachetable.source == url)
3113+ #db(query).update(modified_on=request.utcnow)
3114+ pass
3115+ else:
3116+ raise Exception("No cached copy available - skip layer")
3117+
3118+ name_safe = self.safe_name
3119+
3120+ # Pass the GeoJSON URL to the client
3121+ # Filter to the source of this feed
3122+ url = "%s.geojson?cache.source=%s" % (URL(c="gis", f="cache_feed"),
3123+ url)
3124+
3125+ # Mandatory attributes
3126+ output = {
3127+ "name": name_safe,
3128+ "url": url,
3129+ }
3130+ self.marker.add_attributes_to_output(output)
3131+
3132+ # Attributes which are defaulted client-side if not set
3133+ if self.refresh != 900:
3134+ output["refresh"] = self.refresh
3135+ self.setup_visibility_and_opacity(output)
3136+ self.setup_clustering(output)
3137+
3138+ return output
3139
3140 # -----------------------------------------------------------------------------
3141-class GPXLayer(Layer):
3142+class GPXLayer(MultiRecordLayer):
3143 """ GPX Layer from Catalogue """
3144- def __init__(self, gis, record=None):
3145- db = current.db
3146- tablename = "gis_layer_gpx"
3147- try:
3148- table = db[tablename]
3149- except:
3150- current.manager.load(tablename)
3151- table = db[tablename]
3152-
3153- self.gis = gis
3154- self.table = table
3155- self.js_array = "S3.gis.layers_gpx"
3156- self.record = record
3157- self._refresh()
3158- self.scripts = []
3159-
3160- def as_dict(self):
3161- gis = self.gis
3162- cluster_distance = gis.cluster_distance
3163- cluster_threshold = gis.cluster_threshold
3164- record = self.record
3165- marker = self.marker
3166-
3167- name_safe = self._name_safe(record.name)
3168-
3169- request = current.request
3170- url = URL(c="default", f="download",
3171- args=record.track)
3172-
3173- # Mandatory attributes
3174- output = {
3175- "name": name_safe,
3176- "url": url,
3177- "marker_image": marker.image,
3178- "marker_height": marker.height,
3179- "marker_width": marker.width,
3180- }
3181-
3182- # Attributes which are defaulted client-side if not set
3183- if not record.waypoints:
3184- output["waypoints"] = False
3185- if not record.tracks:
3186- output["tracks"] = False
3187- if not record.routes:
3188- output["routes"] = False
3189- if not record.visible:
3190- output["visibility"] = False
3191- if record.opacity != 1:
3192- output["opacity"] = "%.1f" % record.opacity
3193- if record.cluster_distance != cluster_distance:
3194- output["cluster_distance"] = record.cluster_distance
3195- if record.cluster_threshold != cluster_threshold:
3196- output["cluster_threshold"] = record.cluster_threshold
3197-
3198- return output
3199-
3200- def refresh(self):
3201- " Refresh the attributes of the Layer "
3202- gis = self.gis
3203- request = current.request
3204- record = self.record
3205- table = self.table
3206- if "marker_id" in table:
3207- self.set_marker()
3208- if "projection_id" in table:
3209- self.set_projection()
3210- if record:
3211- self.url = "%s/%s" % (URL(c="default", f="download"),
3212- record.track)
3213- else:
3214- self.url = None
3215+ table_name = "gis_layer_gpx"
3216+ js_array = "S3.gis.layers_gpx"
3217+
3218+ def __init__(self, gis):
3219+ super(GPXLayer, self).__init__(gis)
3220+
3221+# if record:
3222+# self.url = "%s/%s" % (URL(c="default", f="download"),
3223+# record.track)
3224+# else:
3225+# self.url = None
3226+
3227+ class SubLayer(MultiRecordLayer.SubLayer):
3228+ def as_dict(self):
3229+ gis = self.gis
3230+ request = current.request
3231+
3232+ url = URL(c="default", f="download",
3233+ args=self.track)
3234+
3235+ # Mandatory attributes
3236+ output = {
3237+ "name": self.safe_name,
3238+ "url": url,
3239+ }
3240+ self.marker.add_attributes_to_output(output)
3241+ self.add_attributes_if_not_default(
3242+ output,
3243+ waypoints = (self.waypoints, (True,)),
3244+ tracks = (self.tracks, (True,)),
3245+ routes = (self.routes, (True,)),
3246+ )
3247+ self.setup_visibility_and_opacity(output)
3248+ self.setup_clustering(output)
3249+ return output
3250
3251 # -----------------------------------------------------------------------------
3252-class KMLLayer(Layer):
3253+class KMLLayer(MultiRecordLayer):
3254 """ KML Layer from Catalogue """
3255- def __init__(self, gis, record=None):
3256- db = current.db
3257- tablename = "gis_layer_kml"
3258- try:
3259- table = db[tablename]
3260- except:
3261- current.manager.load(tablename)
3262- table = db[tablename]
3263-
3264- self.gis = gis
3265- self.table = table
3266- # @ToDo: Migrate to gis_cache
3267- self.cachetable = db.gis_cache2
3268- self.js_array = "S3.gis.layers_kml"
3269- self.record = record
3270- self._refresh()
3271- self.scripts = []
3272-
3273+ table_name = "gis_layer_kml"
3274+ js_array = "S3.gis.layers_kml"
3275+
3276+ def __init__(self, gis):
3277+ super(KMLLayer, self).__init__(gis)
3278+
3279+        # Set up the KML cache; this should be done once per request
3280 # Can we cache downloaded KML feeds?
3281 # Needed for unzipping & filtering as well
3282 # @ToDo: Should we move this folder to static to speed up access to cached content?
3283 # Do we need to secure it?
3284 cachepath = os.path.join(current.request.folder, "uploads", "gis_cache")
3285- if os.access(cachepath, os.W_OK):
3286- cacheable = True
3287+
3288+ if os.path.exists(cachepath):
3289+ cacheable = os.access(cachepath, os.W_OK)
3290 else:
3291 try:
3292 os.mkdir(cachepath)
3293+ except OSError, os_error:
3294+ self.gis.debug(
3295+ "GIS: KML layers cannot be cached: %s %s" % (
3296+ cachepath,
3297+ os_error
3298+ )
3299+ )
3300+ cacheable = False
3301+ else:
3302 cacheable = True
3303+<<<<<<< TREE
3304 except:
3305 cacheable = False
3306 self.cacheable = cacheable
3307@@ -4720,209 +4576,164 @@
3308 # Download file (async, if workers alive)
3309 current.s3task.async("download_kml",
3310 args=[record.id, filename])
3311+=======
3312+ # @ToDo: Migrate to gis_cache
3313+ KMLLayer.cachetable = current.db.gis_cache2
3314+ KMLLayer.cacheable = cacheable
3315+ KMLLayer.cachepath = cachepath
3316+
3317+
3318+ class SubLayer(MultiRecordLayer.SubLayer):
3319+ def as_dict(self):
3320+ gis = self.gis
3321+
3322+ T = current.T
3323+ db = current.db
3324+ request = current.request
3325+ response = current.response
3326+ public_url = gis.public_url
3327+
3328+ cachetable = KMLLayer.cachetable
3329+ cacheable = KMLLayer.cacheable
3330+ cachepath = KMLLayer.cachepath
3331+
3332+ name = self.name
3333+ if cacheable:
3334+ _name = urllib2.quote(name)
3335+ _name = _name.replace("%", "_")
3336+ filename = "%s.file.%s.kml" % (cachetable._tablename,
3337+ _name)
3338+
3339+ # Should we download a fresh copy of the source file?
3340+ download = True
3341+ query = (cachetable.name == name)
3342+ cached = db(query).select(cachetable.modified_on,
3343+ limitby=(0, 1)).first()
3344+ refresh = self.refresh or 900 # 15 minutes set if we have no data (legacy DB)
3345+>>>>>>> MERGE-SOURCE
3346 if cached:
3347- db(query).update(modified_on=request.utcnow)
3348- else:
3349- cachetable.insert(name=name, file=filename)
3350-
3351- url = URL(c="default", f="download",
3352- args=[filename])
3353- else:
3354- # No caching possible (e.g. GAE), display file direct from remote (using Proxy)
3355- # (Requires OpenLayers.Layer.KML to be available)
3356- url = record.url
3357-
3358- # Mandatory attributes
3359- output = {
3360- "name": name_safe,
3361- "url": url,
3362- "marker_image": marker.image,
3363- "marker_height": marker.height,
3364- "marker_width": marker.width,
3365- }
3366-
3367- # Attributes which are defaulted client-side if not set
3368- if record.title and record.title != "name":
3369- output["title"] = record.title
3370- if record.body and record.body != "description":
3371- output["body"] = record.body
3372- if record.refresh != 900:
3373- output["refresh"] = record.refresh
3374- if not record.visible:
3375- output["visibility"] = False
3376- if record.opacity != 1:
3377- output["opacity"] = "%.1f" % record.opacity
3378- if record.cluster_distance != cluster_distance:
3379- output["cluster_distance"] = record.cluster_distance
3380- if record.cluster_threshold != cluster_threshold:
3381- output["cluster_threshold"] = record.cluster_threshold
3382-
3383- return output
3384+ modified_on = cached.modified_on
3385+ cutoff = modified_on + timedelta(seconds=refresh)
3386+ if request.utcnow < cutoff:
3387+ download = False
3388+
3389+ if download:
3390+ # Download file
3391+ if response.s3.tasks_active():
3392+ # Async call
3393+ db.task_scheduled.insert(name="download_kml_%s" % uuid.uuid4(),
3394+ func="download_kml",
3395+                                             args=json.dumps([self.id,
3396+ filename]))
3397+ else:
3398+ # Sync call
3399+ gis.download_kml(self.id, filename)
3400+ if cached:
3401+ db(query).update(modified_on=request.utcnow)
3402+ else:
3403+ cachetable.insert(name=name, file=filename)
3404+
3405+ url = URL(r=request, c="default", f="download",
3406+ args=[filename])
3407+ else:
3408+ # No caching possible (e.g. GAE), display file direct from remote (using Proxy)
3409+ # (Requires OpenLayers.Layer.KML to be available)
3410+ url = self.url
3411+
3412+ output = dict(
3413+ name = self.safe_name,
3414+ url = url,
3415+ )
3416+ self.add_attributes_if_not_default(
3417+ output,
3418+ title = (self.title, ("name", None, "")),
3419+ body = (self.body, ("description", None)),
3420+ refresh = (self.refresh, (900,)),
3421+ )
3422+ self.setup_visibility_and_opacity(output)
3423+ self.setup_clustering(output)
3424+ self.marker.add_attributes_to_output(output)
3425+ return output
3426
3427 # -----------------------------------------------------------------------------
3428-class TMSLayer(Layer):
3429+class TMSLayer(MultiRecordLayer):
3430 """ TMS Layer from Catalogue """
3431- def __init__(self, gis, record=None):
3432- db = current.db
3433- tablename = "gis_layer_tms"
3434- try:
3435- table = db[tablename]
3436- except:
3437- current.manager.load(tablename)
3438- table = db[tablename]
3439-
3440- self.gis = gis
3441- self.table = table
3442- self.js_array = "S3.gis.layers_tms"
3443- self.record = record
3444- self._refresh()
3445- self.scripts = []
3446-
3447- def as_dict(self):
3448- #gis = self.gis
3449- record = self.record
3450-
3451- name_safe = self._name_safe(record.name)
3452-
3453- # Mandatory attributes
3454- output = {
3455- "name": name_safe,
3456- "url": record.url,
3457- "layername": record.layername
3458- }
3459-
3460- # Attributes which are defaulted client-side if not set
3461- if record.url2:
3462- output["url2"] = record.url2
3463- if record.url3:
3464- output["url3"] = record.url3
3465- if record.img_format != "png":
3466- output["format"] = record.img_format
3467- if record.zoom_levels != 9:
3468- output["zoomLevels"] = record.zoom_levels
3469- if record.attribution:
3470- output["attribution"] = record.attribution
3471-
3472- return output
3473+ table_name = "gis_layer_tms"
3474+ js_array = "S3.gis.layers_tms"
3475+
3476+ class SubLayer(MultiRecordLayer.SubLayer):
3477+ def as_dict(self):
3478+ output = {
3479+ "name": self.safe_name,
3480+ "url": self.url,
3481+ "layername": self.layername
3482+ }
3483+ self.add_attributes_if_not_default(
3484+ output,
3485+ url2 = (self.url2, (None,)),
3486+ url3 = (self.url3, (None,)),
3487+ format = (self.img_format, ("png", None)),
3488+ zoomLevels = (self.zoom_levels, (9,)),
3489+ attribution = (self.attribution, (None,)),
3490+ )
3491+ return output
3492
3493 # -----------------------------------------------------------------------------
3494-class WFSLayer(Layer):
3495+class WFSLayer(MultiRecordLayer):
3496 """ WFS Layer from Catalogue """
3497- def __init__(self, gis, record=None):
3498- db = current.db
3499- tablename = "gis_layer_wfs"
3500- try:
3501- table = db[tablename]
3502- except:
3503- current.manager.load(tablename)
3504- table = db[tablename]
3505-
3506- self.gis = gis
3507- self.table = table
3508- self.js_array = "S3.gis.layers_wfs"
3509- self.record = record
3510- self._refresh()
3511- self.scripts = []
3512-
3513- def as_dict(self):
3514- gis = self.gis
3515- cluster_distance = gis.cluster_distance
3516- cluster_threshold = gis.cluster_threshold
3517- record = self.record
3518- projection = self.projection
3519-
3520- name_safe = self._name_safe(record.name)
3521-
3522- # Mandatory attributes
3523- output = {
3524- "name": name_safe,
3525- "url": record.url,
3526- "title": record.title,
3527- "featureType": record.featureType,
3528- "featureNS": record.featureNS,
3529- "schema": record.wfs_schema,
3530+ table_name = "gis_layer_wfs"
3531+ js_array = "S3.gis.layers_wfs"
3532+
3533+ class SubLayer(MultiRecordLayer.SubLayer):
3534+ def as_dict(self):
3535+ output = dict(
3536+ name = self.safe_name,
3537+ url = self.url,
3538+ title = self.title,
3539+ featureType = self.featureType,
3540+ featureNS = self.featureNS,
3541+ schema = self.wfs_schema,
3542+ )
3543+ self.add_attributes_if_not_default(
3544+ output,
3545+ version = (self.version, ("1.1.0",)),
3546+ geometryName = (self.geometryName, ("the_geom",)),
3547+ styleField = (self.style_field, (None,)),
3548+ styleValues = (self.style_values, ("{}", None)),
3549+ projection = (self.projection.epsg, (4326,)),
3550 #editable
3551- }
3552-
3553- # Attributes which are defaulted client-side if not set
3554- if record.version != "1.1.0":
3555- output["version"] = record.version
3556- if record.geometryName != "the_geom":
3557- output["geometryName"] = record.geometryName
3558- if record.style_field:
3559- output["styleField"] = record.style_field
3560- if record.style_values and record.style_values != "{}":
3561- output["styleValues"] = record.style_values
3562- if projection.epsg != 4326:
3563- output["projection"] = projection.epsg
3564- if not record.visible:
3565- output["visibility"] = False
3566- if record.opacity != 1:
3567- output["opacity"] = "%.1f" % record.opacity
3568- if record.cluster_distance != cluster_distance:
3569- output["cluster_distance"] = record.cluster_distance
3570- if record.cluster_threshold != cluster_threshold:
3571- output["cluster_threshold"] = record.cluster_threshold
3572-
3573- return output
3574+ )
3575+ self.setup_visibility_and_opacity(output)
3576+ self.setup_clustering(output)
3577+ return output
3578
3579 # -----------------------------------------------------------------------------
3580-class WMSLayer(Layer):
3581+class WMSLayer(MultiRecordLayer):
3582 """ WMS Layer from Catalogue """
3583- def __init__(self, gis, record=None):
3584- db = current.db
3585- tablename = "gis_layer_wms"
3586- try:
3587- table = db[tablename]
3588- except:
3589- current.manager.load(tablename)
3590- table = db[tablename]
3591-
3592- self.gis = gis
3593- self.table = table
3594- self.js_array = "S3.gis.layers_wms"
3595- self.record = record
3596- self._refresh()
3597- self.scripts = []
3598-
3599- def as_dict(self):
3600- #gis = self.gis
3601- record = self.record
3602-
3603- name_safe = self._name_safe(record.name)
3604-
3605- # Mandatory attributes
3606- output = {
3607- "name": name_safe,
3608- "url": record.url,
3609- "layers": record.layers
3610- }
3611-
3612- # Attributes which are defaulted client-side if not set
3613- if not record.visible:
3614- output["visibility"] = False
3615- if record.opacity != 1:
3616- output["opacity"] = "%.1f" % record.opacity
3617- if not record.transparent:
3618- output["transparent"] = False
3619- if record.version != "1.1.1":
3620- output["version"] = record.version
3621- if record.img_format != "image/png":
3622- output["format"] = record.img_format
3623- if record.map:
3624- output["map"] = record.map
3625- if record.style:
3626- output["style"] = record.style
3627- if record.bgcolor:
3628- output["bgcolor"] = record.bgcolor
3629- if record.tiled:
3630- output["tiled"] = True
3631- if record.buffer:
3632- output["buffer"] = record.buffer
3633- if record.base:
3634- output["base"] = True
3635-
3636- return output
3637+ js_array = "S3.gis.layers_wms"
3638+ table_name = "gis_layer_wms"
3639+
3640+ class SubLayer(MultiRecordLayer.SubLayer):
3641+ def as_dict(self):
3642+ output = dict(
3643+ name = self.safe_name,
3644+ url = self.url,
3645+ layers = self.layers
3646+ )
3647+ self.add_attributes_if_not_default(
3648+ output,
3649+ transparent = (self.transparent, (True,)),
3650+ version = (self.version, ("1.1.1",)),
3651+ format = (self.img_format, ("image/png",)),
3652+ map = (self.map, (None,)),
3653+ buffer = (self.buffer, (0,)),
3654+ base = (self.base, (False,)),
3655+ style = (self.style, (None,)),
3656+ bgcolor = (self.bgcolor, (None,)),
3657+ tiled = (self.tiled, (False, )),
3658+ )
3659+ self.setup_visibility_and_opacity(output)
3660+ return output
3661
3662 # =============================================================================
3663 class S3MAP(S3Method):
3664@@ -5031,4 +4842,3 @@
3665 return page
3666
3667 # END =========================================================================
3668-
3669
3670=== added file 'modules/test_utils/AddedRole.py'
3671--- modules/test_utils/AddedRole.py 1970-01-01 00:00:00 +0000
3672+++ modules/test_utils/AddedRole.py 2011-09-06 11:51:25 +0000
3673@@ -0,0 +1,25 @@
3674+
3675+class AddedRole(object):
3676+ """Adds a role and removes it at the end of a test no matter what happens.
3677+
3678+ """
3679+ def __init__(self, session, role):
3680+ self.role = role
3681+ self.session = session
3682+
3683+ def __enter__(self):
3684+ roles = self.session.s3.roles
3685+ role = self.role
3686+ if not role in roles:
3687+ roles.append(role)
3688+
3689+ def __exit__(self, type, value, traceback):
3690+ session_s3_roles = self.session.s3.roles
3691+ roles = list(session_s3_roles)
3692+ for i in range(len(roles)):
3693+ session_s3_roles.pop(0)
3694+ add_role = session_s3_roles.append
3695+ added_role = self.role
3696+ for role in roles:
3697+ if role is not added_role:
3698+ add_role(role)
3699
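AddedRole is a context manager for tests: the role is appended to session.s3.roles on entry and stripped out again on exit, whatever the test does. A minimal usage sketch; "session" is the web2py session injected into test globals by Web2pyNosePlugin, and 1 stands in for a real role id:

    from test_utils import AddedRole

    def test_restricted_view():
        with AddedRole(session, 1):
            pass  # code under test sees the role in session.s3.roles
        # on exit the role has been removed again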
3700=== added file 'modules/test_utils/Change.py'
3701--- modules/test_utils/Change.py 1970-01-01 00:00:00 +0000
3702+++ modules/test_utils/Change.py 2011-09-06 11:51:25 +0000
3703@@ -0,0 +1,25 @@
3704+
3705+_UNDEFINED = object()
3706+
3707+class Change(object):
3708+ def __init__(self, target, changes):
3709+ self.changes = changes
3710+ self.target = target
3711+
3712+ def __enter__(self):
3713+ assert not hasattr(self, "originals")
3714+ self.originals = originals = {}
3715+ # store originals and set new values
3716+ for name, value in self.changes.iteritems():
3717+ originals[name] = getattr(self.target, name, _UNDEFINED)
3718+ setattr(self.target, name, value)
3719+
3720+ def __exit__(self, type, value, traceback):
3721+ # restore originals
3722+ for name, value in self.originals.iteritems():
3723+ if value is _UNDEFINED:
3724+ delattr(self.target, name)
3725+ else:
3726+ setattr(self.target, name, value)
3727+ del self.originals
3728+
3729
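Change temporarily overrides attributes on a target object and restores them afterwards, deleting any attribute that did not exist before. A usage sketch with hypothetical names:

    from test_utils import Change

    class Config(object):
        debug = False

    def test_with_debug_enabled():
        with Change(Config, {"debug": True, "extra": 1}):
            assert Config.debug is True        # overridden for the test
            assert Config.extra == 1           # added for the test
        assert Config.debug is False           # original value restored
        assert not hasattr(Config, "extra")    # added attribute deleted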
3730=== added file 'modules/test_utils/ExpectSessionWarning.py'
3731--- modules/test_utils/ExpectSessionWarning.py 1970-01-01 00:00:00 +0000
3732+++ modules/test_utils/ExpectSessionWarning.py 2011-09-06 11:51:25 +0000
3733@@ -0,0 +1,14 @@
3734+
3735+class ExpectSessionWarning(object):
3736+ def __init__(self, session, warning):
3737+ self.warning = warning
3738+ self.session = session
3739+
3740+ def __enter__(self):
3741+ session = self.session
3742+ warnings = []
3743+ self.warnings = session.warning = warnings
3744+
3745+ def __exit__(self, type, value, traceback):
3746+ if type is None:
3747+ assert self.warning in self.warnings
3748
3749=== added file 'modules/test_utils/ExpectedException.py'
3750--- modules/test_utils/ExpectedException.py 1970-01-01 00:00:00 +0000
3751+++ modules/test_utils/ExpectedException.py 2011-09-06 11:51:25 +0000
3752@@ -0,0 +1,13 @@
3753+
3754+class ExpectedException(object):
3755+ def __init__(self, ExceptionClass):
3756+ self.ExceptionClass = ExceptionClass
3757+
3758+ def __enter__(self):
3759+ pass
3760+
3761+ def __exit__(self, type, value, traceback):
3762+ assert type is not None and issubclass(type, self.ExceptionClass), (
3763+ "%s not raised" % self.ExceptionClass.__name__
3764+ )
3765+ return True
3766
3767=== added file 'modules/test_utils/InsertedRecord.py'
3768--- modules/test_utils/InsertedRecord.py 1970-01-01 00:00:00 +0000
3769+++ modules/test_utils/InsertedRecord.py 2011-09-06 11:51:25 +0000
3770@@ -0,0 +1,19 @@
3771+
3772+from clear_table import clear_table
3773+
3774+class InsertedRecord(object):
3775+ """Inserts and commits a record and removes it at the end of
3776+ a test no matter what happens.
3777+
3778+ """
3779+ def __init__(self, db, table, data):
3780+ self.db = db
3781+ self.table = table
3782+ self.data = data
3783+
3784+ def __enter__(self):
3785+ self.table.insert(**self.data)
3786+ self.db.commit()
3787+
3788+ def __exit__(self, type, value, traceback):
3789+ clear_table(self.db, self.table)
3790
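InsertedRecord commits one row for the duration of a test and clears the table afterwards (clear_table removes every row, not only the inserted one). A usage sketch, assuming the gis_location table accepts a bare name:

    from test_utils import InsertedRecord

    def test_with_location():
        with InsertedRecord(db, db.gis_location, dict(name="Test place")):
            assert db(db.gis_location.name == "Test place").count() == 1
        # the whole table has been cleared and the deletion committed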
3791=== added file 'modules/test_utils/Web2pyNosePlugin.py'
3792--- modules/test_utils/Web2pyNosePlugin.py 1970-01-01 00:00:00 +0000
3793+++ modules/test_utils/Web2pyNosePlugin.py 2011-09-06 11:51:25 +0000
3794@@ -0,0 +1,106 @@
3795+
3796+import nose
3797+import re
3798+from itertools import imap
3799+import unittest
3800+
3801+class Web2pyNosePlugin(nose.plugins.base.Plugin):
3802+ # see: http://somethingaboutorange.com/mrl/projects/nose/0.11.1/plugins/writing.html
3803+
3804+ """This plugin is designed to give the web2py environment to the tests.
3805+ """
3806+ score = 0
3807+ # always enable as this plugin can only
3808+ # be selected by running this script
3809+ enabled = True
3810+
3811+ def __init__(
3812+ self,
3813+ application_name,
3814+ environment,
3815+ directory_pattern,
3816+ test_folders
3817+ ):
3818+ super(Web2pyNosePlugin, self).__init__()
3819+ self.application_name = application_name
3820+ self.environment = environment
3821+ self.directory_pattern = directory_pattern
3822+ self.test_folders = test_folders
3823+
3824+ def options(self, parser, env):
3825+ """Register command line options"""
3826+ pass
3827+
3828+ def wantDirectory(self, dirname):
3829+ return bool(re.search(self.directory_pattern, dirname))
3830+
3831+ def wantFile(self, file_name):
3832+ print file_name
3833+ return file_name.endswith(".py") and any(
3834+ imap(file_name.__contains__, self.test_folders)
3835+ )
3836+
3837+ def wantModule(self, module):
3838+ return False
3839+
3840+ def loadTestsFromName(self, file_name, discovered):
3841+ """Sets up the unit-testing environment.
3842+
3843+ This involves loading modules as if by web2py.
3844+ Also we must have a test database.
3845+
3846+ If testing controllers, tests need to set up the request themselves.
3847+
3848+ """
3849+ if file_name.endswith(".py"):
3850+
3851+ # Is it possible that the module could load
3852+ # other code that is using the original db?
3853+
3854+ test_globals = self.environment
3855+
3856+ module_globals = dict(self.environment)
3857+ # execfile is used because it doesn't create a module
3858+ # or load the module from sys.modules if it exists.
3859+
3860+ execfile(file_name, module_globals)
3861+
3862+ import inspect
3863+ # we have to return something, otherwise nose
3864+ # will let others have a go, and they won't pass
3865+ # in the web2py environment, so we'll get errors
3866+ tests = []
3867+
3868+ for name, thing in module_globals.iteritems():
3869+ if (
3870+ # don't bother with globally imported things
3871+ name not in test_globals \
3872+ # unless they have been overridden
3873+ or test_globals[name] is not thing
3874+ ):
3875+ if (
3876+ isinstance(thing, type)
3877+ and issubclass(thing, unittest.TestCase)
3878+ ):
3879+ # look for test methods
3880+ for member_name in dir(thing):
3881+ if member_name.startswith("test"):
3882+ if callable(getattr(thing, member_name)):
3883+ tests.append(thing(member_name))
3884+ elif (
3885+ name.startswith("test")
3886+ or name.startswith("Test")
3887+ ):
3888+ if inspect.isfunction(thing):
3889+ function = thing
3890+ function_name = name
3891+ # things coming from execfile have no module
3892+ #print file_name, function_name, function.__module__
3893+ if function.__module__ in ("__main__", None):
3894+ tests.append(
3895+ nose.case.FunctionTestCase(function)
3896+ )
3897+ return tests
3898+ else:
3899+ return []
3900+
3901
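The plugin runs each selected test file with execfile in a copy of the captured web2py environment, then collects unittest.TestCase subclasses and module-level test_*/Test* functions. A test module can therefore use the web2py globals (db, session, request, local_import, ...) without importing them; a hypothetical example of a file it would pick up:

    # Hypothetical file under tests/unit_tests/...; db and request come from
    # the injected web2py environment rather than from imports.
    import unittest

    class LocationTests(unittest.TestCase):
        def test_table_is_defined(self):
            # assumes the gis models have been loaded
            self.assertTrue("gis_location" in db.tables)

    def test_request_is_available():
        assert request.application  # injected via run.py / Web2pyNosePlugin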
3902=== modified file 'modules/test_utils/__init__.py'
3903--- modules/test_utils/__init__.py 2011-08-11 19:25:52 +0000
3904+++ modules/test_utils/__init__.py 2011-09-06 11:51:25 +0000
3905@@ -1,2 +1,12 @@
3906
3907-from compare_lines import compare_lines
3908\ No newline at end of file
3909+from compare_lines import compare_lines
3910+from clear_table import clear_table
3911+from find_JSON_format_data_structure import *
3912+from Web2pyNosePlugin import Web2pyNosePlugin
3913+from assert_equal import *
3914+
3915+from InsertedRecord import *
3916+from AddedRole import *
3917+from ExpectedException import *
3918+from Change import *
3919+from ExpectSessionWarning import ExpectSessionWarning
3920
3921=== added file 'modules/test_utils/assert_equal.py'
3922--- modules/test_utils/assert_equal.py 1970-01-01 00:00:00 +0000
3923+++ modules/test_utils/assert_equal.py 2011-09-06 11:51:25 +0000
3924@@ -0,0 +1,60 @@
3925+
3926+
3927+def assert_same_type(expected, actual):
3928+ assert isinstance(actual, type(expected)), "%s vs. %s" % (type(expected), type(actual))
3929+
3930+def assert_equal_sequence(expected, actual):
3931+ assert len(expected) == len(actual), "length should be %i, not %i:\n%s" % (
3932+ len(expected), len(actual), actual
3933+ )
3934+ for i in range(len(expected)):
3935+ try:
3936+ assert_equal(expected[i], actual[i])
3937+ except AssertionError, assertion_error:
3938+ raise AssertionError(
3939+ "[%i] %s" % (i, str(assertion_error))
3940+ )
3941+
3942+def assert_equal_set(expected, actual):
3943+ missing = expected.difference(actual)
3944+ assert not missing, "Missing: %s" % ", ".join(missing)
3945+
3946+ extra = actual.difference(expected)
3947+ assert not extra, "Extra: %s" % ", ".join(extra)
3948+
3949+def assert_equal_dict(expected, actual):
3950+ assert_equal_set(
3951+ expected = set(expected.keys()),
3952+ actual = set(actual.keys())
3953+ )
3954+ for key in expected.iterkeys():
3955+ try:
3956+ assert_equal(expected[key], actual[key])
3957+ except AssertionError, assertion_error:
3958+ raise AssertionError(
3959+ "[%s] %s" % (
3960+ key,
3961+ str(assertion_error),
3962+ )
3963+ )
3964+
3965+def assert_equal_value(expected, actual):
3966+ assert expected == actual, "%s != %s" % (expected, actual)
3967+
3968+_compare_procs = {
3969+ list: assert_equal_sequence,
3970+ int: assert_equal_value,
3971+ float: assert_equal_value,
3972+ str: assert_equal_value,
3973+ unicode: assert_equal_value, #sequence,
3974+ dict: assert_equal_dict,
3975+ set: assert_equal_set,
3976+}
3977+
3978+def assert_equal(expected, actual):
3979+ assert_same_type(expected, actual)
3980+ compare_proc = _compare_procs.get(type(expected), assert_equal_value)
3981+ compare_proc(
3982+ expected,
3983+ actual
3984+ )
3985
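assert_equal dispatches on the type of the expected value and recurses into dicts, lists and sets, so a failure names the path to the first mismatch rather than dumping both structures. For example:

    from test_utils import assert_equal

    assert_equal({"a": [1, 2]}, {"a": [1, 2]})   # passes silently

    assert_equal({"a": [1, 2]}, {"a": [1, 3]})
    # raises AssertionError naming the offending key ("a") and the values 2 != 3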
3986=== added file 'modules/test_utils/clear_table.py'
3987--- modules/test_utils/clear_table.py 1970-01-01 00:00:00 +0000
3988+++ modules/test_utils/clear_table.py 2011-09-06 11:51:25 +0000
3989@@ -0,0 +1,4 @@
3990+
3991+def clear_table(db, db_table):
3992+ db(db_table.id).delete()
3993+ db.commit()
3994
3995=== added file 'modules/test_utils/find_JSON_format_data_structure.py'
3996--- modules/test_utils/find_JSON_format_data_structure.py 1970-01-01 00:00:00 +0000
3997+++ modules/test_utils/find_JSON_format_data_structure.py 2011-09-06 11:51:25 +0000
3998@@ -0,0 +1,54 @@
3999+
4000+import re
4001+from json.decoder import JSONDecoder
4002+
4003+__all__ = (
4004+ "not_found",
4005+ "cannot_parse_JSON",
4006+ "find_JSON_format_data_structure"
4007+)
4008+
4009+def not_found(name, string):
4010+ raise Exception(
4011+ u"Cannot find %s in %s" % (name, string)
4012+ )
4013+
4014+def cannot_parse_JSON(string):
4015+ raise Exception(
4016+ u"Cannot parse JSON: '%s'" % string
4017+ )
4018+
4019+def find_JSON_format_data_structure(
4020+ string,
4021+ name,
4022+ found,
4023+ not_found,
4024+ cannot_parse_JSON
4025+):
4026+ """Finds a named JSON-format data structure in the string.
4027+
4028+ The name can be any string.
4029+ The pattern "name = " is searched for in the string,
4030+ and the data structure following it is parsed and passed to the
4031+ found callback as a Python data structure.
4032+ """
4033+ try:
4034+ name_start = string.index(name)
4035+ except ValueError:
4036+ not_found(name, string)
4037+ else:
4038+ name_length = len(name)
4039+ name_end = name_start + name_length
4040+
4041+ _, remaining = re.Scanner([
4042+ (r"\s*=\s*", lambda scanner, token: None)
4043+ ]).scan(
4044+ string[name_end:]
4045+ )
4046+
4047+ try:
4048+ data, end_position = JSONDecoder().raw_decode(remaining)
4049+ except ValueError, value_error:
4050+ cannot_parse_JSON(remaining)
4051+ else:
4052+ found(data)
4053
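The function is callback-based rather than returning a value. A usage sketch that pulls a structure named S3.gis.layers_wms out of generated page script (the markup and URL here are illustrative only):

    from test_utils import (
        find_JSON_format_data_structure, not_found, cannot_parse_JSON
    )

    html = 'S3.gis.layers_wms = [{"name": "Test WMS", "url": "..."}];'
    results = []

    find_JSON_format_data_structure(
        html,
        "S3.gis.layers_wms",
        found = results.append,          # called with the parsed structure
        not_found = not_found,           # default raisers exported by the module
        cannot_parse_JSON = cannot_parse_JSON,
    )
    assert results == [[{"name": "Test WMS", "url": "..."}]]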
4054=== modified file 'modules/test_utils/run.py'
4055--- modules/test_utils/run.py 2011-08-11 19:25:52 +0000
4056+++ modules/test_utils/run.py 2011-09-06 11:51:25 +0000
4057@@ -1,5 +1,8 @@
4058 #!python
4059
4060+# capture web2py environment before doing anything else
4061+
4062+web2py_environment = dict(globals())
4063
4064 __doc__ = """This script is run from the nose command in the
4065 application being tested:
4066@@ -8,249 +11,81 @@
4067
4068 python2.6 ./applications/eden/tests/nose.py <nose arguments>
4069
4070+web2py runs a file which:
4071+1. Sets up a plugin. This plugin registers itself so nose can use it.
4072+2. Runs nose programmatically, giving it the plugin.
4073+Nose loads the tests via the plugin.
4074+When the plugin loads the tests, it injects the web2py environment.
4075+
4076 """
4077
4078-########################################
4079-#
4080-# web2py runs a file which:
4081-# 1. Sets up a plugin. This plugin registers itself so nose can use it.
4082-# 2. Runs nose programmatically giving it the plugin
4083-#
4084-# nose loads the tests via the plugin.
4085-# when the plugin loads the tests, it injects the web2py environment.
4086-#
4087-########################################
4088-
4089-
4090-# @ToDo:
4091-# in particular, haven't checked that db is being replaced with the test_db
4092-# (work in progress)
4093-
4094-# using --import_models is OK for a single test, but for running a suite,
4095-# we probably need the test_env function to set up an environment
4096-
4097-# @ToDo: Provide an ignore_warns mode so that we can tackle ERRORs 1st
4098-# but FAILs often give us clues that help us fix ERRORs
4099-# fixing an error might itself cause a failure, but not be spotted until later.
4100-
4101-def use_test_db(db):
4102- print "Creating test database..."
4103- try:
4104- test_db = use_test_db.db
4105- except AttributeError:
4106- # create test database by copying db
4107- test_db_name = "sqlite://testing.sqlite"
4108- print "Copying db tables into test database..."
4109- test_db = DAL(test_db_name) # Name and location of the test DB file
4110- # Copy tables!
4111- for tablename in db.tables:
4112- table_copy = []
4113- for data in db[tablename]:
4114- table_copy.append(
4115- copy.copy(data)
4116- )
4117- test_db.define_table(tablename, *table_copy)
4118- use_test_db.db = test_db
4119- return test_db
4120-
4121+import sys
4122+
4123+from types import ModuleType
4124 import nose
4125-
4126+import glob
4127+import os.path
4128+import os
4129 import copy
4130-def test_env(
4131- imports,
4132- application,
4133- controller,
4134- function,
4135- folder,
4136- globals,
4137- create_test_db = use_test_db,
4138- _module_root = "applications",
4139-):
4140- """Sets up the unit-testing environment.
4141-
4142- This involves loading modules as if by web2py.
4143- Also we must make a test database.
4144-
4145- """
4146- from gluon.globals import Request
4147- globals["Request"] = Request
4148- request = Request()
4149- request.application = application
4150- request.controller = controller
4151- request.function = function
4152- request.folder = folder
4153-
4154- globals["request"] = request
4155-
4156- test_db = db#create_test_db(db)
4157-
4158+from gluon.globals import Request
4159+import unittest
4160+
4161+
4162+def load_module(application_relative_module_path):
4163 import os
4164- for import_path, names in imports:
4165- #print import_path
4166- #module = {"db": test_db}
4167- #module.update(globals)
4168- module = dict(globals)
4169- path_components = [_module_root] + import_path.split(".")
4170- file_path = os.path.join(*path_components)+".py"
4171- # execfile is used because it doesn't create a module
4172- # and doesn't load the module if it exists.
4173- execfile(file_path, module)
4174- if names is "*":
4175- globals.update(module)
4176- else:
4177- for name in names:
4178- globals[name] = module[name]
4179-
4180-# -----------------------
4181-
4182-import os.path
4183+ web2py_relative_module_path = ".".join((
4184+ "applications", request.application, application_relative_module_path
4185+ ))
4186+ imported_module = __import__(web2py_relative_module_path)
4187+ for step in web2py_relative_module_path.split(".")[1:]:
4188+ imported_module = getattr(imported_module, step)
4189+ return imported_module
4190+
4191+web2py_environment["load_module"] = load_module
4192+web2py_env_module = ModuleType("web2py_env")
4193+web2py_env_module.__dict__.update(web2py_environment)
4194+sys.modules["web2py_env"] = web2py_env_module
4195+
4196
4197 application_name = request.application
4198-
4199-model_files_pattern = os.path.join("applications",application_name,"models","*.py")
4200-import glob
4201-
4202-test_env(
4203- globals = globals(),
4204- application = application_name,
4205- controller = "controller",
4206- function = "function",
4207- folder = "folder",
4208- imports = [
4209- (application_name+".models.%s" % (module_name[len(model_files_pattern)-4:-3]), "*")
4210- for module_name in glob.glob(model_files_pattern)
4211- ]
4212-)
4213-
4214-import sys
4215-log = sys.stderr.write
4216-
4217-import unittest
4218-from itertools import imap
4219+application_folder_path = os.path.join("applications",application_name)
4220+
4221+application = application_name
4222+controller = "controller"
4223+function = "function"
4224+folder = os.path.join(os.getcwd(), "applications", application_name)
4225+
4226+web2py_environment["Request"] = Request
4227+request = Request()
4228+request.application = application
4229+request.controller = controller
4230+request.function = function
4231+request.folder = folder
4232+
4233+web2py_environment["request"] = request
4234+current.request = request
4235+
4236+controller_configuration = OrderedDict()
4237+for controller_name in ["default"]+glob.glob(
4238+ os.path.join(application_folder_path, "controllers", "*.py")
4239+):
4240+ controller_configuration[controller_name] = Storage(
4241+ name_nice = controller_name,
4242+ description = controller_name,
4243+ restricted = False,
4244+ module_type = 0
4245+ )
4246+
4247+current.deployment_settings.modules = controller_configuration
4248
4249 test_folders = set()
4250-
4251-class Web2pyNosePlugin(nose.plugins.base.Plugin):
4252- # see: http://somethingaboutorange.com/mrl/projects/nose/0.11.1/plugins/writing.html
4253-
4254- """This plugin is designed to give the web2py environment to the tests.
4255- """
4256- score = 0
4257- # always enable as this plugin can only
4258- # be selected by running this script
4259- enabled = True
4260-
4261- def __init__(
4262- self,
4263- application_name,
4264- environment,
4265- create_test_db,
4266- directory_pattern
4267- ):
4268- super(Web2pyNosePlugin, self).__init__()
4269- self.application_name = application_name
4270- self.environment = dict(
4271- db = db#create_test_db(db)
4272- )
4273- self.environment.update(environment)
4274- self.directory_pattern = directory_pattern
4275-
4276- def options(self, parser, env):
4277- """Register command line options"""
4278- return
4279- parser.add_option(
4280- "--web2py",
4281- dest="web2py",
4282- action="append",
4283- metavar="ATTR",
4284- help="Use web2py environment when loading tests"
4285- )
4286-
4287- def wantDirectory(self, dirname):
4288- return bool(re.search(self.directory_pattern, dirname))
4289-
4290- def wantFile(self, file_name):
4291- return file_name.endswith(".py") and any(
4292- imap(file_name.__contains__, test_folders)
4293- )
4294-
4295- def wantModule(self, module):
4296- return False
4297-
4298- def loadTestsFromName(self, file_name, discovered):
4299- """Sets up the unit-testing environment.
4300-
4301- This involves loading modules as if by web2py.
4302- Also we must have a test database.
4303-
4304- If testing controllers, tests need to set up the request themselves.
4305-
4306- """
4307- if file_name.endswith(".py"):
4308-# log(file_name)
4309-
4310- # assert 0, file_name
4311- # stop
4312-
4313- # Is it possible that the module could load
4314- # other code that is using the original db?
4315-
4316- test_globals = self.environment
4317-
4318- # execfile is used because it doesn't create a module
4319- # and doesn't load the module into sys.modules if it exists.
4320- module_globals = dict(self.environment)
4321- execfile(file_name, module_globals)
4322-
4323- import inspect
4324- # we have to return something, otherwise nose
4325- # will let others have a go, and they won't pass
4326- # in the web2py environment, so we'll get errors
4327- tests = []
4328-
4329- for name, thing in module_globals.iteritems():
4330- if (
4331- # don't bother with globally imported things
4332- name not in test_globals \
4333- # unless they have been overridden
4334- or test_globals[name] is not thing
4335- ):
4336- if (
4337- isinstance(thing, type)
4338- and issubclass(thing, unittest.TestCase)
4339- ):
4340- # look for test methods
4341- for member_name in dir(thing):
4342- if member_name.startswith("test"):
4343- if callable(getattr(thing, member_name)):
4344- tests.append(thing(member_name))
4345- elif (
4346- name.startswith("test")
4347- or name.startswith("Test")
4348- ):
4349- if inspect.isfunction(thing):
4350- function = thing
4351- function_name = name
4352- # things coming from execfile have no module
4353- #print file_name, function_name, function.__module__
4354- if function.__module__ is None:
4355- tests.append(
4356- nose.case.FunctionTestCase(function)
4357- )
4358- return tests
4359- else:
4360- return
4361-
4362-import re
4363-
4364-argv = [
4365- #"--verbosity=2",
4366- #"--debug=nose"
4367-]
4368+argv = []
4369
4370 # folder in which tests are kept
4371 # non-option arguments (test paths) are made relative to this
4372-test_root = os.path.join("applications", application_name, "tests", "unit_tests")
4373+test_root = os.path.join(application_folder_path, "tests", "unit_tests")
4374+
4375+current_working_directory = os.getcwd()
4376
4377 disallowed_options = {}
4378 disallowed_options["-w"] = disallowed_options["--where"] = (
4379@@ -275,10 +110,10 @@
4380 argv.append(arg)
4381 else:
4382 test_path = arg
4383- test_fuller_path = os.path.join(test_root, test_path)
4384- test_folders.add(test_fuller_path)
4385- if not os.path.exists(test_fuller_path):
4386- print "\n", test_fuller_path, "not found"
4387+ test_folder_fuller_path = os.path.join(test_root, test_path)
4388+ test_folders.add(test_folder_fuller_path)
4389+ if not os.path.exists(test_folder_fuller_path):
4390+ print "\n", test_folder_fuller_path, "not found"
4391 #sys.exit(1)
4392
4393 # test paths in command line aren't passed, just added to test_folders
4394@@ -293,14 +128,15 @@
4395
4396 sys.argv[1:] = argv
4397
4398+test_utils = local_import("test_utils")
4399+
4400 nose.main(
4401 # seems at least this version of nose ignores passed in argv
4402 # argv = argv,
4403 addplugins = nose.plugins.PluginManager([
4404- Web2pyNosePlugin(
4405+ test_utils.Web2pyNosePlugin(
4406 application_name,
4407- globals(),
4408- use_test_db,
4409+ web2py_environment,
4410 re.compile(
4411 re.escape(os.path.sep).join(
4412 (
4413@@ -311,7 +147,8 @@
4414 "[^","]*)*)?)?)?$"
4415 )
4416 )
4417- )
4418+ ),
4419+ test_folders
4420 )
4421 ])
4422 )
4423
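With this rewrite the invocation documented in the module docstring is unchanged; non-option arguments are resolved relative to tests/unit_tests, so, for example,

    python2.6 ./applications/eden/tests/nose.py modules/s3/s3gis

run from the web2py root would restrict discovery to the s3gis unit-test folder.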
4424=== added file 'private/prepopulate/default/tasks.cfg'
4425--- private/prepopulate/default/tasks.cfg 1970-01-01 00:00:00 +0000
4426+++ private/prepopulate/default/tasks.cfg 2011-09-06 11:51:25 +0000
4427@@ -0,0 +1,18 @@
4428+##########################################################################
4429+# Add a list of csv files to import into the system
4430+# the list of import files is a comma separated list as follows:
4431+# "prefix","tablename","csv file name","stylesheet"
4432+#
4433+# The csv file is assumed to be in the same directory as this file
4434+# The style sheet is assumed to be in either of the following directories:
4435+# static/format/s3csv/"prefix"/
4436+# static/format/s3csv/
4437+#
4438+# For details on how to import data into the system see the following:
4439+# zzz_1st_run
4440+# s3Tools::S3BulkImporter
4441+##########################################################################
4442+"supply","catalog_item","DefaultItems.csv","supply_items.xsl"
4443+"supply","catalog_item","StandardItems.csv","supply_items.xsl"
4444+"hrm","skill","DefaultSkillList.csv","skill.xsl"
4445+"hrm","competency_rating",DefaultSkillCompetency.csv,competency_rating.xsl
4446\ No newline at end of file
4447
4448=== removed file 'private/prepopulate/default/tasks.cfg'
4449--- private/prepopulate/default/tasks.cfg 2011-08-17 15:13:43 +0000
4450+++ private/prepopulate/default/tasks.cfg 1970-01-01 00:00:00 +0000
4451@@ -1,18 +0,0 @@
4452-##########################################################################
4453-# Add a list of csv files to import into the system
4454-# the list of import files is a comma separated list as follows:
4455-# "prefix","tablename","csv file name","stylesheet"
4456-#
4457-# The csv file is assumed to be in the same directory as this file
4458-# The style sheet is assumed to be in either of the following directories:
4459-# static/format/s3csv/"prefix"/
4460-# static/format/s3csv/
4461-#
4462-# For details on how to import data into the system see the following:
4463-# zzz_1st_run
4464-# s3Tools::S3BulkImporter
4465-##########################################################################
4466-"supply","catalog_item","DefaultItems.csv","supply_items.xsl"
4467-"supply","catalog_item","StandardItems.csv","supply_items.xsl"
4468-"hrm","skill","DefaultSkillList.csv","skill.xsl"
4469-"hrm","competency_rating",DefaultSkillCompetency.csv,competency_rating.xsl
4470
4471=== modified file 'static/scripts/S3/s3.gis.climate.js'
4472--- static/scripts/S3/s3.gis.climate.js 2011-06-15 09:47:54 +0000
4473+++ static/scripts/S3/s3.gis.climate.js 2011-09-06 11:51:25 +0000
4474@@ -9,154 +9,372 @@
4475 }
4476 }
4477
4478-
4479 ClimateDataMapPlugin = function (config) {
4480- var self = this // so no this-clobbering
4481- self.data_type_option_names = config.data_type_option_names
4482- self.parameter_names = config.parameter_names
4483- self.projected_option_type_names = config.projected_option_type_names
4484- self.year_min = config.year_min
4485- self.year_max = config.year_max
4486-
4487- self.data_type_label = config.data_type_label
4488- self.projected_option_type_label = config.projected_option_type_label
4489-
4490- self.setup = function () {
4491- var graphic = new OpenLayers.Layer.Image(
4492- 'Test Data',
4493- '/eden/climate/climate_image_overlay',
4494- new OpenLayers.Bounds(8900000, 3020000, 9850000, 3580000),
4495-// new OpenLayers.Bounds(-180, -88.759, 180, 88.759),
4496- new OpenLayers.Size(249, 139),
4497- {
4498- // numZoomLevels: 3,
4499- isBaseLayer:false,
4500- opacity: 0.5,
4501- transparent:true
4502- }
4503- );
4504- graphic.events.on({
4505- loadstart: function() {
4506- OpenLayers.Console.log("loadstart");
4507- },
4508- loadend: function() {
4509- OpenLayers.Console.log("loadend");
4510- }
4511- });
4512- map.addLayer(graphic);
4513+ var plugin = this // let's be explicit!
4514+ plugin.data_type_option_names = config.data_type_option_names
4515+ plugin.parameter_names = config.parameter_names
4516+ plugin.year_min = config.year_min
4517+ plugin.year_max = config.year_max
4518+
4519+ plugin.data_type_label = config.data_type_label
4520+ plugin.overlay_data_URL = config.overlay_data_URL
4521+ plugin.chart_URL = config.chart_URL
4522+ delete config
4523+
4524+ plugin.setup = function () {
4525+ var overlay_layer = plugin.overlay_layer = new OpenLayers.Layer.Vector(
4526+ 'Climate data map overlay',
4527+ {
4528+ isBaseLayer:false,
4529+ }
4530+ );
4531+ map.addLayer(overlay_layer);
4532+
4533+ // selection
4534+ OpenLayers.Feature.Vector.style['default']['strokeWidth'] = '2'
4535+ var selectCtrl = new OpenLayers.Control.SelectFeature(
4536+ overlay_layer,
4537+ {
4538+ clickout: true,
4539+ toggle: false,
4540+ multiple: false,
4541+ hover: false,
4542+ toggleKey: 'altKey',
4543+ multipleKey: 'shiftKey',
4544+ box: true,
4545+ onSelect: function (feature) {
4546+ feature.style.strokeColor = 'black'
4547+ feature.style.strokeDashstyle = 'dash'
4548+ overlay_layer.drawFeature(feature)
4549+ },
4550+ onUnselect: function (feature) {
4551+ feature.style.strokeColor = 'none'
4552+ overlay_layer.drawFeature(feature)
4553+ },
4554+ }
4555+ );
4556+
4557+ map.addControl(selectCtrl);
4558+
4559+ selectCtrl.activate();
4560 }
4561- self.addToMapWindow = function (items) {
4562- function toggle_projected_options() {
4563- $('#projected-options').toggle(
4564- $('#id_Projected').attr('checked') == 'checked'
4565+ plugin.addToMapWindow = function (items) {
4566+ var combo_box_size = {
4567+ width: 120,
4568+ heigth:25
4569+ }
4570+
4571+ function make_combo_box(
4572+ data,
4573+ fieldLabel,
4574+ hiddenName
4575+ ) {
4576+ var options = []
4577+ each(
4578+ data,
4579+ function (option) {
4580+ options.push([option, option])
4581+ }
4582 )
4583- }
4584- var climate_data_type_options = [];
4585- each(
4586- self.data_type_option_names,
4587- function (option_name) {
4588- var radio_button = new Ext.form.Radio({
4589- name: "data-type",
4590- id: "id_%s" % option_name,
4591- boxLabel: option_name,
4592- checked: option_name == self.data_type_option_names[0],
4593- })
4594- radio_button.on({
4595- change: toggle_projected_options
4596- })
4597- climate_data_type_options.push(radio_button)
4598- }
4599- )
4600- var projected_options = [];
4601- each(
4602- self.projected_option_type_names,
4603- function (projected_option_type_name) {
4604- projected_options.push(
4605- new Ext.form.Radio({
4606- name: "projected-option-type",
4607- id: "id_%s" % projected_option_type_name,
4608- boxLabel: projected_option_type_name,
4609- })
4610- )
4611- }
4612- )
4613- var projected_options_widget = new Ext.form.FieldSet({
4614- title: self.projected_option_type_label,
4615- items: [
4616- new Ext.form.CheckboxGroup({
4617- items: projected_options,
4618- xtype: 'checkboxgroup',
4619- columns: 1
4620- })
4621- ]
4622- })
4623-
4624- var climate_data_type_options = new Ext.form.FieldSet({
4625- title: self.data_type_label,
4626- items: [
4627- new Ext.form.RadioGroup({
4628- items: climate_data_type_options,
4629- columns: 1,
4630- })
4631- ]
4632- })
4633-
4634- var parameter_options = [];
4635- each(
4636- self.parameter_names,
4637- function (parameter_name) {
4638- var checkbox = new Ext.form.Checkbox({
4639- name: parameter_name,
4640- id: "id_%s" % parameter_name,
4641- boxLabel: parameter_name,
4642- })
4643- parameter_options.push(checkbox)
4644- }
4645- )
4646-
4647- var parameters_widget = new Ext.form.FieldSet({
4648- title: "Parameters",
4649- items: [
4650- new Ext.form.CheckboxGroup({
4651- items: parameter_options,
4652- xtype: 'checkboxgroup',
4653- columns: 1
4654- })
4655- ]
4656- })
4657-
4658- var period_widget = new Ext.form.FieldSet({
4659- title: "Period",
4660- items: [
4661- new Ext.form.NumberField({
4662- fieldLabel: "From",
4663- minValue: self.year_min,
4664- maxValue: self.year_max,
4665- value: self.year_min
4666+ var combo_box = new Ext.form.ComboBox({
4667+ fieldLabel: fieldLabel,
4668+ hiddenName: hiddenName,
4669+ store: new Ext.data.SimpleStore({
4670+ fields: ['name', 'option'],
4671+ data: options
4672 }),
4673- new Ext.form.NumberField({
4674- fieldLabel: "To",
4675- minValue: self.year_min,
4676- maxValue: self.year_max,
4677- value: self.year_max
4678- })
4679- ]
4680- })
4681+ displayField: 'name',
4682+ typeAhead: true,
4683+ mode: 'local',
4684+ triggerAction: 'all',
4685+ emptyText:'Choose...',
4686+ selectOnFocus:true
4687+ })
4688+ combo_box.setSize(combo_box_size)
4689+ return combo_box
4690+ }
4691+ var data_type_combo_box = make_combo_box(
4692+ plugin.data_type_option_names,
4693+ 'Data type',
4694+ 'data_type'
4695+ )
4696+
4697+ var variable_combo_box = make_combo_box(
4698+ plugin.parameter_names,
4699+ 'Variable',
4700+ 'parameter'
4701+ )
4702+
4703+ var statistic_combo_box = make_combo_box(
4704+ ['Minimum','Maximum','Average'],
4705+ 'Aggregate values',
4706+ 'statistic'
4707+ )
4708+
4709 var climate_data_panel = new Ext.FormPanel({
4710 id: 'climate_data_panel',
4711- title: 'Climate data',
4712+ title: 'Climate data map overlay',
4713 collapsible: true,
4714 collapseMode: 'mini',
4715 items: [{
4716 region: 'center',
4717 items: [
4718- climate_data_type_options,
4719- projected_options_widget,
4720- parameters_widget,
4721- period_widget
4722+ new Ext.form.FieldSet({
4723+ title: 'Data set',
4724+ items: [
4725+ data_type_combo_box,
4726+ variable_combo_box
4727+ ]
4728+ }),
4729+ new Ext.form.FieldSet({
4730+ title: 'Period',
4731+ items: [
4732+ new Ext.form.NumberField({
4733+ fieldLabel: 'From',
4734+ name: 'from_date',
4735+ minValue: plugin.year_min,
4736+ maxValue: plugin.year_max,
4737+ value: plugin.year_min
4738+ }),
4739+ new Ext.form.NumberField({
4740+ fieldLabel: 'To',
4741+ name: 'to_date',
4742+ minValue: plugin.year_min,
4743+ maxValue: plugin.year_max,
4744+ value: plugin.year_max,
4745+ size: combo_box_size
4746+ })
4747+ ]
4748+ }),
4749+ new Ext.form.FieldSet({
4750+ title: 'Map overlay colours',
4751+ items: [
4752+ statistic_combo_box,
4753+ ]
4754+ })
4755 ]
4756 }]
4757 });
4758+
4759+ var update_map_layer_button = new Ext.Button({
4760+ text: 'Update map layer',
4761+ disabled: true,
4762+ handler: function() {
4763+ plugin.overlay_layer.destroyFeatures()
4764+
4765+ // request new features
4766+ var form_values = climate_data_panel.getForm().getValues()
4767+
4768+ // add new features
4769+ $.ajax({
4770+ url: plugin.overlay_data_URL,
4771+ data: {
4772+ data_type: form_values.data_type,
4773+ statistic: form_values.statistic,
4774+ parameter: form_values.parameter,
4775+ from_date: form_values.from_date,
4776+ to_date: form_values.to_date
4777+ },
4778+ success: function(feature_data, status_code) {
4779+ function Vector(geometry, attributes, style) {
4780+ style.strokeColor= 'none'
4781+ style.fillOpacity= 0.8
4782+ style.strokeWidth = 1
4783+
4784+ return new OpenLayers.Feature.Vector(
4785+ geometry, attributes, style
4786+ )
4787+ }
4788+ function Polygon(components) {
4789+ return new OpenLayers.Geometry.Polygon(components)
4790+ }
4791+ function Point(lon, lat) {
4792+ var point = new OpenLayers.Geometry.Point(lat, lon)
4793+ return point.transform(
4794+ S3.gis.proj4326,
4795+ S3.gis.projection_current
4796+ )
4797+ }
4798+ function LinearRing(point_list) {
4799+ point_list.push(point_list[0])
4800+ return new OpenLayers.Geometry.LinearRing(point_list)
4801+ }
4802+ eval('var data = '+feature_data)
4803+ $('#id_key_min_value').html(data.min)
4804+ $('#id_key_max_value').html(data.max)
4805+ plugin.overlay_layer.addFeatures(data.features)
4806+ }
4807+ });
4808+ }
4809+ });
4810+
4811+ function enable_update_layer_button_if_form_complete(
4812+ box, record, index
4813+ ) {
4814+ if (
4815+ !!data_type_combo_box.getValue() &&
4816+ !!variable_combo_box.getValue() &&
4817+ !!statistic_combo_box.getValue()
4818+ ) {
4819+ update_map_layer_button.enable()
4820+ }
4821+ }
4822+ data_type_combo_box.on(
4823+ 'change',
4824+ enable_update_layer_button_if_form_complete
4825+ );
4826+ variable_combo_box.on(
4827+ 'change',
4828+ enable_update_layer_button_if_form_complete
4829+ );
4830+ statistic_combo_box.on(
4831+ 'change',
4832+ enable_update_layer_button_if_form_complete
4833+ );
4834+ climate_data_panel.addButton(update_map_layer_button)
4835+
4836+ var show_chart_button = new Ext.Button({
4837+ text: 'Show chart',
4838+ disabled: true,
4839+ handler: function() {
4840+ // create URL
4841+ var place_ids = []
4842+ each(
4843+ plugin.overlay_layer.selectedFeatures,
4844+ function (feature) {
4845+ place_ids.push(feature.data.id)
4846+ }
4847+ )
4848+ var form_values = climate_data_panel.getForm().getValues(),
4849+ data_type = form_values.data_type,
4850+ parameter = form_values.parameter,
4851+ from_date = form_values.from_date,
4852+ to_date = form_values.to_date,
4853+ place_ids = place_ids;
4854+
4855+ var spec = JSON.stringify({
4856+ data_type: data_type,
4857+ parameter: parameter,
4858+ from_date: from_date,
4859+ to_date: to_date,
4860+ place_ids: place_ids
4861+ })
4862+
4863+ var chart_name = [
4864+ data_type, parameter,
4865+ 'from', from_date,
4866+ 'to', to_date,
4867+ 'for', (
4868+ place_ids.length < 3?
4869+ 'places: '+ place_ids:
4870+ place_ids.length+' places'
4871+ )
4872+ ].join(' ')
4873+
4874+ // get hold of a chart manager instance
4875+ if (!plugin.chart_window) {
4876+ var chart_window = plugin.chart_window = window.open(
4877+ 'climate/chart_popup.html',
4878+ 'chart',
4879+ 'width=660,height=600,toolbar=0,resizable=0'
4880+ )
4881+ chart_window.onload = function () {
4882+ chart_window.chart_manager = new chart_window.ChartManager(plugin.chart_URL)
4883+ chart_window.chart_manager.addChartSpec(spec, chart_name)
4884+ }
4885+ chart_window.onbeforeunload = function () {
4886+ delete plugin.chart_window
4887+ }
4888+ } else {
4889+ // some duplication here:
4890+ plugin.chart_window.chart_manager.addChartSpec(spec, chart_name)
4891+ }
4892+
4893+ }
4894+ });
4895+
4896+
4897+ function enable_show_chart_button_if_data_and_variable_selected(
4898+ box, record, index
4899+ ) {
4900+ if (
4901+ !!data_type_combo_box.getValue() &&
4902+ !!variable_combo_box.getValue()
4903+ ) {
4904+ show_chart_button.enable()
4905+ }
4906+ }
4907+
4908+ data_type_combo_box.on(
4909+ 'change',
4910+ enable_show_chart_button_if_data_and_variable_selected
4911+ );
4912+
4913+ variable_combo_box.on(
4914+ 'change',
4915+ enable_show_chart_button_if_data_and_variable_selected
4916+ );
4917+
4918+
4919+
4920+ climate_data_panel.addButton(show_chart_button)
4921+
4922 items.push(climate_data_panel)
4923+
4924+ var key_panel = new Ext.Panel({
4925+ id: 'key_panel',
4926+ title: 'Key',
4927+ collapsible: true,
4928+ collapseMode: 'mini',
4929+ items: [
4930+ {
4931+ layout: {
4932+ type: 'table',
4933+ columns: 3,
4934+ },
4935+ defaults: {
4936+ width: '100%',
4937+ height: 20,
4938+ style: 'margin: 10px'
4939+ },
4940+ items: [
4941+ {
4942+ tag: 'span',
4943+ id: 'id_key_min_value',
4944+ style: 'margin: 5px; text-align: center;',
4945+ border: false,
4946+ items: [
4947+ {
4948+ html:'Min',
4949+ border: false
4950+ }
4951+ ]
4952+ },
4953+ new Ext.BoxComponent({
4954+ autoEl: {
4955+ tag: 'img',
4956+ width: 128,
4957+ height: 15,
4958+ src: 'data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAIAAAAABCAYAAAAW0qa2AAAAc0lEQVQoz42QSw6AIAxEX40JEO9/VlBjxg3gD9TFpG06aTrPJjYlYBsNvCAALvce8JZrVuA6X6Snv+kpdwXBwAlsBiIoAQksV536Mr/te3rxDay4r2iNmAFwBcsdVwdfRagDwbC031M8op5j96L8RVEVYQf3hFgEX0OMvQAAAABJRU5ErkJggg=='
4959+ }
4960+ }),
4961+ {
4962+ tag: 'span',
4963+ id: 'id_key_max_value',
4964+ style: 'margin: 5px; text-align: center',
4965+ border: false,
4966+ items: [
4967+ {
4968+ html:'Max',
4969+ border: false
4970+ }
4971+ ]
4972+ }
4973+ ]
4974+ }
4975+ ]
4976+ })
4977+
4978+ items.push(key_panel)
4979 }
4980 }
4981
4982=== added file 'tests/__init__.py'
4983=== added directory 'tests/climate'
4984=== added file 'tests/climate/__init__.py'
4985--- tests/climate/__init__.py 1970-01-01 00:00:00 +0000
4986+++ tests/climate/__init__.py 2011-09-06 11:51:25 +0000
4987@@ -0,0 +1,101 @@
4988+
4989+ClimateDataPortal = local_import('ClimateDataPortal')
4990+
4991+def clear_tables():
4992+ ClimateDataPortal.place.truncate()
4993+ ClimateDataPortal.rainfall_mm.truncate()
4994+ ClimateDataPortal.temperature_celsius.truncate()
4995+ db.commit()
4996+#clear_tables()
4997+
4998+def frange(start, end, inc=1.0):
4999+ value = start
5000+ i = 0
The diff has been truncated for viewing.