Merge lp:~sidnei/charms/precise/squid-reverseproxy/trunk into lp:charms/squid-reverseproxy

Proposed by Sidnei da Silva
Status: Superseded
Proposed branch: lp:~sidnei/charms/precise/squid-reverseproxy/trunk
Merge into: lp:charms/squid-reverseproxy
Diff against target: 2126 lines (+1526/-237)
19 files modified
.bzrignore (+9/-1)
Makefile (+36/-0)
cm.py (+172/-0)
config-manager.txt (+8/-0)
config.yaml (+25/-3)
hooks/_pythonpath.py (+40/-0)
hooks/hooks.py (+228/-213)
hooks/install (+23/-0)
hooks/shelltoolbox/__init__.py (+2/-0)
hooks/tests/test_cached_website_hooks.py (+95/-0)
hooks/tests/test_config_changed_hooks.py (+58/-0)
hooks/tests/test_helpers.py (+273/-0)
hooks/tests/test_nrpe_hooks.py (+217/-0)
hooks/tests/test_website_hooks.py (+267/-0)
metadata.yaml (+7/-1)
revision (+0/-1)
setup.cfg (+4/-0)
tarmac_tests.sh (+6/-0)
templates/main_config.template (+56/-18)
To merge this branch: bzr merge lp:~sidnei/charms/precise/squid-reverseproxy/trunk
Reviewer: Mark Mims (community), status: Needs Resubmitting
Review via email: mp+149708@code.launchpad.net

This proposal has been superseded by a proposal from 2013-04-18.

Commit message

Greatly improved test coverage. Better interaction with upstream and downstream services, namely apache2 and haproxy.

Description of the change

* Greatly improve test coverage

* Allow the use of an X-Balancer-Name header to select which cache_peer backend will be used for a specific request.

  This is useful when multiple applications sit behind a single hostname and are directed to the same Squid from an Apache service via the balancer relation. Since 'dstdomain' can no longer be relied on in that setup, the Apache charm sets this special header in the balancer configuration and Squid matches on it instead (a squid.conf sketch follows this list).

* Support an 'all_services' setting on the relation, in the format that haproxy publishes it, in addition to the previously supported 'sitenames' setting. This makes the charm compatible with the haproxy charm (an illustrative example of the relation data follows this list).

* When the list of supported 'sitenames' (computed from the dstdomain ACLs) changes, notify services related via the 'cached-website' relation. This allows new services to be added to the haproxy service (or any other related service), which notifies the squid service, which in turn propagates the change to services related via 'cached-website' (see the sketch of the published relation data below).
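
For reference, here is a condensed sketch of the squid.conf stanzas generated for a site matched via the X-Balancer-Name header when x_balancer_name_allowed is true (adapted from the expected output asserted in hooks/tests/test_helpers.py below; the ACL and cache_peer names are illustrative):

    acl s_1_acl dstdomain foo.com
    acl s_1_balancer req_header X-Balancer-Name foo\.com
    http_access allow accel_ports s_1_acl
    http_access allow accel_ports s_1_balancer
    never_direct allow s_1_acl
    never_direct allow s_1_balancer
    cache_peer 1.2.3.4 parent 4242 0 name=foo_com__website_1__foo_1 no-query no-digest originserver round-robin login=PASS
    cache_peer_access foo_com__website_1__foo_1 allow s_1_acl
    cache_peer_access foo_com__website_1__foo_1 allow s_1_balancer
    cache_peer_access foo_com__website_1__foo_1 deny all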
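
The YAML carried in the 'all_services' relation value looks roughly like the following (the service names and ports are made up; the keys are the ones get_relation_sites() in hooks/hooks.py reads):

    - {service_name: example_proxy, service_port: 80}
    - {service_name: api_backend, service_port: 8080}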
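
And a sketch of what notify_cached_website() publishes on each 'cached-website' relation when the sitenames change (the values are hypothetical; the keys match the relation_set() call in hooks/hooks.py):

    port: 3128
    hostname: squid-unit-0.example.internal
    sitenames: example_proxy foo.com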

33. By Haw Loeung

[jjo] Don't use absolute paths to call juju commands; this is needed to allow /usr/local/bin/ overrides, e.g. when using a branched juju (until LP#1118945 is fixed)

Revision history for this message
Mark Mims (mark-mims) wrote :

same as apache... please reserve /tests for charm tests

review: Needs Resubmitting
34. By David Ames

[dames,r=wedgwood] Set hostname on cached-website relation hook. Needed for haproxy charm

35. By David Ames

[dames,r=wedgwood] Add missing nagios_check_http_params to config.yaml

36. By David Ames

[sidnei,r=dames] - Improve test coverage (from 0% to 51%)
- Add helper scripts to run coverage/lint/tests
- Add support for relating with haproxy, which exports multiple (service_name, port) combos via 'all_services', assuming that the service_name exported is to be used as the dstdomain or balancer name in Apache
- If enabled, match incoming requests on the X-Balancer-Name header, set by the modified apache2 charm, instead of matching on dstdomain (i.e. the Host header, which isn't set by the apache2 charm when using balancer:).

37. By David Ames

[dames,trivial] Fixes to previous merge

38. By Liam Young

Merged from ~ubuntuone-pqm-team. Bump charmsupport, and add extra options to refresh_pattern.

39. By Tom Haddon

Merge from U1 version to get the two in sync

40. By Matthias Arnason

[bloodearnest r=tiaz] add ability to disable caching and just proxy

41. By Matthias Arnason

[sidnei r=tiaz] Expose 'via' config setting, default to on.

42. By JuanJo Ciarlante

[sidnei, r=jjo] When setting cache_size_mb to 0 disable only *disk* cache, not all caching.

43. By JuanJo Ciarlante

[sidnei,r=jjo] If there's a query string in the path, use GET instead of HEAD
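
A minimal sketch of the intent, since that revision's diff is not part of the preview below: the nrpe check command built in update_nrpe_checks() uses --method=HEAD, and the assumed change switches to GET when the configured check path carries a query string (the helper name here is hypothetical):

    def check_http_method(path):
        # Some applications mishandle HEAD for parameterised URLs, so
        # fall back to GET when a query string is present.
        return 'GET' if '?' in path else 'HEAD'

    command = 'check_http -I 127.0.0.1 -p 3128 --method=%s ' % check_http_method(path)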

44. By Sidnei da Silva

Merge from alexlist's https support branch

45. By Sidnei da Silva

- Merge from trunk

Unmerged revisions

Preview Diff

1=== modified file '.bzrignore'
2--- .bzrignore 2013-02-15 02:21:16 +0000
3+++ .bzrignore 2013-04-16 21:22:23 +0000
4@@ -1,1 +1,9 @@
5-basenode/
6+revision
7+_trial_temp
8+.coverage
9+coverage.xml
10+*.crt
11+*.key
12+lib/*
13+*.pyc
14+exec.d
15
16=== added file 'Makefile'
17--- Makefile 1970-01-01 00:00:00 +0000
18+++ Makefile 2013-04-16 21:22:23 +0000
19@@ -0,0 +1,36 @@
20+PWD := $(shell pwd)
21+SOURCEDEPS_DIR ?= $(shell dirname $(PWD))/.sourcecode
22+HOOKS_DIR := $(PWD)/hooks
23+TEST_PREFIX := PYTHONPATH=$(HOOKS_DIR)
24+TEST_DIR := $(PWD)/hooks/tests
25+CHARM_DIR := $(PWD)
26+PYTHON := /usr/bin/env python
27+
28+
29+build: sourcedeps test lint proof
30+
31+revision:
32+ @test -f revision || echo 0 > revision
33+
34+proof: revision
35+ @echo Proofing charm...
36+ @charm proof $(PWD) && echo OK
37+ @test `cat revision` = 0 && rm revision
38+
39+test:
40+ @echo Starting tests...
41+ @CHARM_DIR=$(CHARM_DIR) $(TEST_PREFIX) nosetests $(TEST_DIR)
42+
43+lint:
44+ @echo Checking for Python syntax...
45+ @flake8 $(HOOKS_DIR) --ignore=E123 && echo OK
46+
47+sourcedeps: $(PWD)/config-manager.txt
48+ @echo Updating source dependencies...
49+ @$(PYTHON) cm.py -c $(PWD)/config-manager.txt \
50+ -p $(SOURCEDEPS_DIR) \
51+ -t $(PWD)
52+
53+charm-payload: sourcedeps
54+
55+.PHONY: revision proof test lint sourcedeps charm-payload
56
57=== added file 'cm.py'
58--- cm.py 1970-01-01 00:00:00 +0000
59+++ cm.py 2013-04-16 21:22:23 +0000
60@@ -0,0 +1,172 @@
61+# Copyright 2010-2012 Canonical Ltd. All rights reserved.
62+
63+import os
64+import re
65+import sys
66+import errno
67+import hashlib
68+import subprocess
69+import optparse
70+import urllib
71+
72+from os import curdir
73+from bzrlib.branch import Branch
74+from bzrlib.plugin import load_plugins
75+load_plugins()
76+from bzrlib.plugins.launchpad import account as lp_account
77+
78+if 'GlobalConfig' in dir(lp_account):
79+ from bzrlib.config import LocationConfig as LocationConfiguration
80+ _ = LocationConfiguration
81+else:
82+ from bzrlib.config import LocationStack as LocationConfiguration
83+ _ = LocationConfiguration
84+
85+
86+def get_ubunet_branch_config(config_file):
87+ """
88+ Retrieves the sourcedeps configuration for an ubunet source dir.
89+ Returns a dict of (branch, revspec) tuples, keyed by branch name.
90+ """
91+ branches = {}
92+ with open(config_file, 'r') as stream:
93+ for line in stream:
94+ line = line.split('#')[0].strip()
95+ match = re.match(r'(\S+)\s+'
96+ 'bzr\+ssh://([^/]+)/([^;]+)'
97+ '(?:;revno=(\d+))?', line)
98+ if match:
99+ name, host, branch, revno = match.group(1, 2, 3, 4)
100+ if revno is None:
101+ revspec = "-1"
102+ else:
103+ revspec = revno
104+ branches[name] = (host, branch, revspec)
105+ return branches
106+
107+
108+def main(config_file, parent_dir, target_dir, verbose):
109+ """Do the deed."""
110+
111+ try:
112+ os.makedirs(parent_dir)
113+ except OSError, e:
114+ if e.errno != errno.EEXIST:
115+ raise
116+
117+ bzr_config = LocationConfiguration(parent_dir)
118+ get_lp_login = lp_account.get_lp_login
119+ username = get_lp_login(bzr_config) or get_lp_login()
120+ if username is None:
121+ raise RuntimeError("Unable to determine launchpad login")
122+ quoted_username = urllib.quote(username)
123+
124+ branches = sorted(get_ubunet_branch_config(config_file).iteritems())
125+ for branch_name, (host, quoted_branch_spec, revspec) in branches:
126+ revno = int(revspec)
127+
128+ # qualify mirror branch name with hash of remote repo path to deal
129+ # with changes to the remote branch URL over time
130+ branch_spec_digest = hashlib.sha1(quoted_branch_spec).hexdigest()
131+ branch_directory = branch_spec_digest
132+
133+ source_path = os.path.join(parent_dir, branch_directory)
134+ destination_path = os.path.join(target_dir, branch_name)
135+
136+ # Remove leftover symlinks/stray files.
137+ try:
138+ os.remove(destination_path)
139+ except OSError, e:
140+ if e.errno != errno.EISDIR and e.errno != errno.ENOENT:
141+ raise
142+
143+ branch_url = ("bzr+ssh://%s@%s/%s" %
144+ (quoted_username, host, quoted_branch_spec))
145+ lp_url = "lp:" + quoted_branch_spec.replace("+branch/", "")
146+
147+ # Create the local mirror branch if it doesn't already exist
148+ if verbose:
149+ sys.stderr.write('%30s: ' % (branch_name,))
150+ sys.stderr.flush()
151+
152+ fresh = False
153+ if not os.path.exists(source_path):
154+ subprocess.check_call(['bzr', 'branch', '-q',
155+ '--', branch_url, source_path])
156+ fresh = True
157+
158+ source_branch = Branch.open(source_path)
159+
160+ # Freshen the source branch if required (-1 means we want tip).
161+ if not fresh and (revno == -1 or revno > source_branch.revno()):
162+ subprocess.check_call(['bzr', 'pull', '-q', '--overwrite', '-r',
163+ str(revno), '-d', source_path,
164+ '--', branch_url])
165+
166+ if os.path.exists(destination_path):
167+ # Overwrite the destination with the appropriate revision.
168+ subprocess.check_call(['bzr', 'clean-tree', '--force', '-q',
169+ '--ignored', '-d', destination_path])
170+ subprocess.check_call(['bzr', 'pull', '-q', '--overwrite',
171+ '-r', str(revno),
172+ '-d', destination_path, '--', source_path])
173+ else:
174+ # Create a new branch.
175+ subprocess.check_call(['bzr', 'branch', '-q', '--hardlink',
176+ '-r', str(revno),
177+ '--', source_path, destination_path])
178+
179+ # Check the state of the destination branch.
180+ destination_branch = Branch.open(destination_path)
181+ destination_revno = destination_branch.revno()
182+
183+ if verbose:
184+ sys.stderr.write('checked out %4s of %s\n' %
185+ ("r" + str(destination_revno), lp_url))
186+ sys.stderr.flush()
187+
188+ if revno != -1 and destination_revno != revno:
189+ raise RuntimeError("Expected revno %d but got revno %d" %
190+ (revno, destination_revno))
191+
192+if __name__ == '__main__':
193+ parser = optparse.OptionParser(
194+ usage="%prog [options]",
195+ description=(
196+ "Add a lightweight checkout in <target> for each "
197+ "corresponding file in <parent>."),
198+ add_help_option=False)
199+ parser.add_option(
200+ '-p', '--parent', dest='parent', default=None,
201+ help=("The directory of the parent tree."),
202+ metavar="DIR")
203+ parser.add_option(
204+ '-t', '--target', dest='target', default=curdir,
205+ help=("The directory of the target tree."),
206+ metavar="DIR")
207+ parser.add_option(
208+ '-c', '--config', dest='config', default=None,
209+ help=("The config file to be used for config-manager."),
210+ metavar="DIR")
211+ parser.add_option(
212+ '-q', '--quiet', dest='verbose', action='store_false',
213+ help="Be less verbose.")
214+ parser.add_option(
215+ '-h', '--help', action='help',
216+ help="Show this help message and exit.")
217+ parser.set_defaults(verbose=True)
218+
219+ options, args = parser.parse_args()
220+
221+ if options.parent is None:
222+ parser.error(
223+ "Parent directory not specified.")
224+
225+ if options.target is None:
226+ parser.error(
227+ "Target directory not specified.")
228+
229+ sys.exit(main(config_file=options.config,
230+ parent_dir=options.parent,
231+ target_dir=options.target,
232+ verbose=options.verbose))
233
234=== added file 'config-manager.txt'
235--- config-manager.txt 1970-01-01 00:00:00 +0000
236+++ config-manager.txt 2013-04-16 21:22:23 +0000
237@@ -0,0 +1,8 @@
238+# After making changes to this file, to ensure that your sourcedeps are
239+# up-to-date do:
240+#
241+# make sourcedeps
242+
243+./lib/charmsupport bzr+ssh://bazaar.launchpad.net/~ubuntuone-pqm-team/charmsupport/trunk;revno=49
244+./lib/charm-tools bzr+ssh://bazaar.launchpad.net/+branch/charm-tools;revno=172
245+./lib/shelltoolbox bzr+ssh://bazaar.launchpad.net/+branch/python-shelltoolbox;revno=17
246
247=== modified file 'config.yaml'
248--- config.yaml 2013-01-11 16:49:35 +0000
249+++ config.yaml 2013-04-16 21:22:23 +0000
250@@ -3,10 +3,18 @@
251 type: int
252 default: 3128
253 description: Squid listening port.
254+ port_options:
255+ type: string
256+ default: accel vhost
257+ description: Squid listening port options
258 log_format:
259 type: string
260 default: '%>a %ui %un [%tl] "%rm %ru HTTP/%rv" %>Hs %<st "%{Referer}>h" "%{User-Agent}>h" %Ss:%Sh'
261 description: Format of the squid log.
262+ x_balancer_name_allowed:
263+ type: boolean
264+ default: false
265+ description: Route based on X-Balancer-Name header set by Apache charm.
266 cache_mem_mb:
267 type: int
268 default: 256
269@@ -51,20 +59,34 @@
270 juju-squid-0
271 If you're running multiple environments with the same services in them
272 this allows you to differentiate between them.
273- nagios_check_url:
274+ nagios_check_http_params:
275 default: ""
276 type: string
277 description: >
278- The URL to check squid has access to, most likely inside your web server farm
279+ The parameters to pass to the nrpe plugin check_http.
280 nagios_service_type:
281 default: "generic"
282 type: string
283 description: >
284- What service this component forms part of, e.g. supermassive-squid-cluster. Used by nrpe.
285+ What service this component forms part of, e.g. supermassive-squid-cluster. Used by nrpe.
286 refresh_patterns:
287 type: string
288 default: ''
289 description: >
290 JSON-formatted list of refresh patterns. For example:
291 '{"http://www.ubuntu.com": {"min": 0, "percent": 20, "max": 60}, "http://www.canonical.com": {"min": 0, "percent": 20, "max": 120}}'
292+ services:
293+ default: ''
294+ type: string
295+ description: |
296+ Services definition(s). Although the variable type is a string, this is
297+ interpreted by the charm as yaml. To use multiple services within the
298+ same instance, specify all of the variables (service_name,
299+ service_host, service_port) with a "-" before the first variable,
300+ service_name, as below.
301
302+ - service_name: example_proxy
303+ service_domain: example.com
304+ servers:
305+ - [foo.internal, 80]
306+ - [bar.internal, 80]
307
308=== added file 'hooks/_pythonpath.py'
309--- hooks/_pythonpath.py 1970-01-01 00:00:00 +0000
310+++ hooks/_pythonpath.py 2013-04-16 21:22:23 +0000
311@@ -0,0 +1,40 @@
312+import sys
313+import os
314+import os.path
315+
316+# Make sure that charmsupport is importable, or bail out.
317+local_copy = os.path.join(
318+ os.path.dirname(os.path.dirname(__file__)),
319+ "lib", "charmsupport")
320+if os.path.exists(local_copy) and os.path.isdir(local_copy):
321+ sys.path.insert(0, local_copy)
322+
323+try:
324+ import charmsupport
325+ _ = charmsupport
326+except ImportError:
327+ sys.exit("Could not find required 'charmsupport' library.")
328+
329+# Make sure that shelltoolbox is importable, or bail out.
330+local_copy = os.path.join(
331+ os.path.dirname(os.path.dirname(__file__)),
332+ "lib", "shelltoolbox")
333+if os.path.exists(local_copy) and os.path.isdir(local_copy):
334+ sys.path.insert(0, local_copy)
335+try:
336+ import shelltoolbox
337+ _ = shelltoolbox
338+except ImportError:
339+ sys.exit("Could not find required 'shelltoolbox' library.")
340+
341+# Make sure that charmhelpers is importable, or bail out.
342+local_copy = os.path.join(
343+ os.path.dirname(os.path.dirname(__file__)),
344+ "lib", "charm-tools", "helpers", "python")
345+if os.path.exists(local_copy) and os.path.isdir(local_copy):
346+ sys.path.insert(0, local_copy)
347+try:
348+ import charmhelpers
349+ _ = charmhelpers
350+except ImportError:
351+ sys.exit("Could not find required 'charmhelpers' library.")
352
353=== modified file 'hooks/hooks.py'
354--- hooks/hooks.py 2013-01-11 17:00:40 +0000
355+++ hooks/hooks.py 2013-04-16 21:22:23 +0000
356@@ -1,17 +1,29 @@
357 #!/usr/bin/env python
358
359-import glob
360-import grp
361 import json
362 import math
363 import os
364-import pwd
365 import random
366 import re
367 import shutil
368 import string
369 import subprocess
370 import sys
371+import yaml
372+import collections
373+import socket
374+
375+import _pythonpath
376+_ = _pythonpath
377+
378+from charmsupport.hookenv import (
379+ config as config_get,
380+ log,
381+ relation_set,
382+ relation_ids as get_relation_ids,
383+ relations_of_type,
384+)
385+from charmsupport.nrpe import NRPE
386
387 ###############################################################################
388 # Global variables
389@@ -19,124 +31,87 @@
390 default_squid3_config_dir = "/etc/squid3"
391 default_squid3_config = "%s/squid.conf" % default_squid3_config_dir
392 default_squid3_config_cache_dir = "/var/run/squid3"
393-hook_name = os.path.basename(sys.argv[0])
394+Server = collections.namedtuple("Server", "name address port options")
395
396 ###############################################################################
397 # Supporting functions
398 ###############################################################################
399
400
401-#------------------------------------------------------------------------------
402-# config_get: Returns a dictionary containing all of the config information
403-# Optional parameter: scope
404-# scope: limits the scope of the returned configuration to the
405-# desired config item.
406-#------------------------------------------------------------------------------
407-def config_get(scope=None):
408- try:
409- config_cmd_line = ['config-get']
410- if scope is not None:
411- config_cmd_line.append(scope)
412- config_cmd_line.append('--format=json')
413- config_data = json.loads(subprocess.check_output(config_cmd_line))
414- except Exception, e:
415- subprocess.call(['juju-log', str(e)])
416- config_data = None
417- finally:
418- return(config_data)
419-
420-
421-#------------------------------------------------------------------------------
422-# relation_json: Returns json-formatted relation data
423-# Optional parameters: scope, relation_id
424-# scope: limits the scope of the returned data to the
425-# desired item.
426-# unit_name: limits the data ( and optionally the scope )
427-# to the specified unit
428-# relation_id: specify relation id for out of context usage.
429-#------------------------------------------------------------------------------
430-def relation_json(scope=None, unit_name=None, relation_id=None):
431- try:
432- relation_cmd_line = ['relation-get', '--format=json']
433- if relation_id is not None:
434- relation_cmd_line.extend(('-r', relation_id))
435- if scope is not None:
436- relation_cmd_line.append(scope)
437- else:
438- relation_cmd_line.append('-')
439- relation_cmd_line.append(unit_name)
440- relation_data = subprocess.check_output(relation_cmd_line)
441- except Exception, e:
442- subprocess.call(['juju-log', str(e)])
443- relation_data = None
444- finally:
445- return(relation_data)
446-
447-
448-#------------------------------------------------------------------------------
449-# relation_get: Returns a dictionary containing the relation information
450-# Optional parameters: scope, relation_id
451-# scope: limits the scope of the returned data to the
452-# desired item.
453-# unit_name: limits the data ( and optionally the scope )
454-# to the specified unit
455-# relation_id: specify relation id for out of context usage.
456-#------------------------------------------------------------------------------
457-def relation_get(scope=None, unit_name=None, relation_id=None):
458- try:
459- relation_data = json.loads(relation_json())
460- except Exception, e:
461- subprocess.call(['juju-log', str(e)])
462- relation_data = None
463- finally:
464- return(relation_data)
465-
466-
467-#------------------------------------------------------------------------------
468-# relation_ids: Returns a list of relation ids
469-# optional parameters: relation_type
470-# relation_type: return relations only of this type
471-#------------------------------------------------------------------------------
472-def relation_ids(relation_types=['website']):
473- # accept strings or iterators
474- if isinstance(relation_types, basestring):
475- reltypes = [relation_types]
476- else:
477- reltypes = relation_types
478- relids = []
479- for reltype in reltypes:
480- relid_cmd_line = ['relation-ids', '--format=json', reltype]
481- relids.extend(json.loads(subprocess.check_output(relid_cmd_line)))
482- return relids
483-
484-
485-#------------------------------------------------------------------------------
486-# relation_get_all: Returns a dictionary containing the relation information
487-# optional parameters: relation_type
488-# relation_type: limits the scope of the returned data to the
489-# desired item.
490-#------------------------------------------------------------------------------
491-def relation_get_all():
492- reldata = {}
493- try:
494- relids = relation_ids()
495- for relid in relids:
496- units_cmd_line = ['relation-list', '--format=json', '-r', relid]
497- units = json.loads(subprocess.check_output(units_cmd_line))
498- for unit in units:
499- reldata[unit] = \
500- json.loads(relation_json(
501- relation_id=relid, unit_name=unit))
502- if 'sitenames' in reldata[unit]:
503- reldata[unit]['sitenames'] = \
504- reldata[unit]['sitenames'].split()
505- reldata[unit]['relation-id'] = relid
506- reldata[unit]['name'] = unit.replace("/", "_")
507- except Exception, e:
508- subprocess.call(['juju-log', str(e)])
509- reldata = []
510- finally:
511- return(reldata)
512+def get_id(sitename, relation_id, unit_name):
513+ unit_name = unit_name.replace('/', '_').replace('.', '_')
514+ relation_id = relation_id.replace(':', '_').replace('.', '_')
515+ if sitename is None:
516+ return relation_id + '__' + unit_name
517+ return sitename.replace('.', '_') + '__' + relation_id + '__' + unit_name
518+
519+
520+def get_relation_sites():
521+ all_sites = {}
522+
523+ config_data = config_get()
524+ config_services = yaml.safe_load(config_data.get("services", "")) or ()
525+ server_options_by_site = {}
526+
527+ for service_item in config_services:
528+ site = service_item["service_name"]
529+ servers = all_sites.setdefault(site, [])
530+ options = " ".join(service_item.get("server_options", []))
531+ server_options_by_site[site] = options
532+ for host, port in service_item["servers"]:
533+ servers.append(
534+ Server(name=get_id(None, site, host),
535+ address=host, port=port,
536+ options=options))
537+
538+ relations = relations_of_type("website")
539+
540+ for relation_data in relations:
541+ unit = relation_data["__unit__"]
542+ relation_id = relation_data["__relid__"]
543+ if not ("port" in relation_data or "all_services" in relation_data):
544+ log("No port in relation data for '%s', skipping." % unit)
545+ continue
546+
547+ if not "private-address" in relation_data:
548+ log("No private-address in relation data "
549+ "for '%s', skipping." % unit)
550+ continue
551+
552+ for site in relation_data.get("sitenames", "").split():
553+ servers = all_sites.setdefault(site, [])
554+ servers.append(
555+ Server(name=get_id(site, relation_id, unit),
556+ address=relation_data["private-address"],
557+ port=relation_data["port"],
558+ options=server_options_by_site.get(site, '')))
559+
560+ services = yaml.safe_load(relation_data.get("all_services", "")) or ()
561+ for service_item in services:
562+ site = service_item["service_name"]
563+ servers = all_sites.setdefault(site, [])
564+ servers.append(
565+ Server(name=get_id(site, relation_id, unit),
566+ address=relation_data["private-address"],
567+ port=service_item["service_port"],
568+ options=server_options_by_site.get(site, '')))
569+
570+ if not ("sitenames" in relation_data or
571+ "all_services" in relation_data):
572+ servers = all_sites.setdefault(None, [])
573+ servers.append(
574+ Server(name=get_id(None, relation_id, unit),
575+ address=relation_data["private-address"],
576+ port=relation_data["port"],
577+ options=server_options_by_site.get(None, '')))
578+
579+ if len(all_sites) == 0:
580+ return
581+
582+ for site, servers in all_sites.iteritems():
583+ servers.sort()
584+
585+ return all_sites
586
587
588 #------------------------------------------------------------------------------
589@@ -144,13 +119,13 @@
590 #------------------------------------------------------------------------------
591 def apt_get_install(packages=None):
592 if packages is None:
593- return(False)
594+ return False
595 cmd_line = ['apt-get', '-y', 'install', '-qq']
596 try:
597 cmd_line.extend(packages.split())
598 except AttributeError:
599 cmd_line.extend(packages)
600- return(subprocess.call(cmd_line))
601+ return subprocess.call(cmd_line)
602
603
604 #------------------------------------------------------------------------------
605@@ -169,9 +144,8 @@
606 #------------------------------------------------------------------------------
607 def load_squid3_config(squid3_config_file="/etc/squid3/squid.conf"):
608 if os.path.isfile(squid3_config_file):
609- return(open(squid3_config_file).read())
610- else:
611- return(None)
612+ return open(squid3_config_file).read()
613+ return
614
615
616 #------------------------------------------------------------------------------
617@@ -183,8 +157,8 @@
618 def get_service_ports(squid3_config_file="/etc/squid3/squid.conf"):
619 squid3_config = load_squid3_config(squid3_config_file)
620 if squid3_config is None:
621- return(None)
622- return(re.findall("http_port ([0-9]+)", squid3_config))
623+ return
624+ return re.findall("http_port ([0-9]+)", squid3_config)
625
626
627 #------------------------------------------------------------------------------
628@@ -195,9 +169,9 @@
629 def get_sitenames(squid3_config_file="/etc/squid3/squid.conf"):
630 squid3_config = load_squid3_config(squid3_config_file)
631 if squid3_config is None:
632- return(None)
633- sitenames = re.findall("cache_peer_domain \w+ ([^!].*)", squid3_config)
634- return(list(set(sitenames)))
635+ return
636+ sitenames = re.findall("acl [\w_-]+ dstdomain ([^!].*)", squid3_config)
637+ return list(set(sitenames))
638
639
640 #------------------------------------------------------------------------------
641@@ -206,9 +180,9 @@
642 #------------------------------------------------------------------------------
643 def open_port(port=None, protocol="TCP"):
644 if port is None:
645- return(None)
646- return(subprocess.call(['/usr/bin/open-port', "%d/%s" %
647- (int(port), protocol)]))
648+ return
649+ return subprocess.call(['open-port', "%d/%s" %
650+ (int(port), protocol)])
651
652
653 #------------------------------------------------------------------------------
654@@ -217,9 +191,9 @@
655 #------------------------------------------------------------------------------
656 def close_port(port=None, protocol="TCP"):
657 if port is None:
658- return(None)
659- return(subprocess.call(['/usr/bin/close-port', "%d/%s" %
660- (int(port), protocol)]))
661+ return
662+ return subprocess.call(['close-port', "%d/%s" %
663+ (int(port), protocol)])
664
665
666 #------------------------------------------------------------------------------
667@@ -229,7 +203,7 @@
668 #------------------------------------------------------------------------------
669 def update_service_ports(old_service_ports=None, new_service_ports=None):
670 if old_service_ports is None or new_service_ports is None:
671- return(None)
672+ return
673 for port in old_service_ports:
674 if port not in new_service_ports:
675 close_port(port)
676@@ -248,7 +222,7 @@
677 if l not in 'Iil0oO1']
678 random_chars = [random.choice(alphanumeric_chars)
679 for i in range(pwd_length)]
680- return(''.join(random_chars))
681+ return ''.join(random_chars)
682
683
684 #------------------------------------------------------------------------------
685@@ -257,27 +231,54 @@
686 def construct_squid3_config():
687 from jinja2 import Environment, FileSystemLoader
688 config_data = config_get()
689- relations = relation_get_all()
690+ sites = get_relation_sites()
691+ only_direct = set()
692+ if sites is not None:
693+ for site, servers in sites.iteritems():
694+ if not servers:
695+ only_direct.add(site)
696+
697 if config_data['refresh_patterns']:
698- refresh_patterns = json.loads(config_data['refresh_patterns'])
699+ ## Try originally supported JSON formatted config
700+ try:
701+ refresh_patterns = json.loads(config_data['refresh_patterns'])
702+ ## else use YAML (current):
703+ except ValueError:
704+ refresh_patterns = yaml.load(config_data['refresh_patterns'])
705 else:
706 refresh_patterns = {}
707+ # Use default refresh pattern if specified.
708+ if '.' in refresh_patterns:
709+ default_refresh_pattern = refresh_patterns.pop('.')
710+ else:
711+ default_refresh_pattern = {
712+ 'min': 30,
713+ 'percent': 20,
714+ 'max': 4320,
715+ }
716+
717 template_env = Environment(loader=FileSystemLoader(
718 os.path.join(os.environ['CHARM_DIR'], 'templates')))
719 config_data['cache_l1'] = int(math.ceil(math.sqrt(
720- int(config_data['cache_size_mb']) * 1024 / (16 *
721- int(config_data['target_objs_per_dir']) * int(
722- config_data['avg_obj_size_kb'])))))
723+ int(config_data['cache_size_mb']) * 1024 / (
724+ 16 * int(config_data['target_objs_per_dir']) * int(
725+ config_data['avg_obj_size_kb'])))))
726 config_data['cache_l2'] = config_data['cache_l1'] * 16
727 templ_vars = {
728 'config': config_data,
729- 'relations': relations,
730+ 'sites': sites,
731+ 'only_direct': only_direct,
732 'refresh_patterns': refresh_patterns,
733+ 'default_refresh_pattern': default_refresh_pattern,
734 }
735 template = template_env.get_template('main_config.template').\
736 render(templ_vars)
737+ write_squid3_config(str(template))
738+
739+
740+def write_squid3_config(contents):
741 with open(default_squid3_config, 'w') as squid3_config:
742- squid3_config.write(str(template))
743+ squid3_config.write(contents)
744
745
746 #------------------------------------------------------------------------------
747@@ -286,139 +287,146 @@
748 #------------------------------------------------------------------------------
749 def service_squid3(action=None, squid3_config=default_squid3_config):
750 if action is None or squid3_config is None:
751- return(None)
752+ return
753 elif action == "check":
754 check_cmd = ['/usr/sbin/squid3', '-f', squid3_config, '-k', 'parse']
755 retVal = subprocess.call(check_cmd)
756 if retVal == 1:
757- return(False)
758+ return False
759 elif retVal == 0:
760- return(True)
761+ return True
762 else:
763- return(False)
764+ return False
765 elif action == 'status':
766 status = subprocess.check_output(['status', 'squid3'])
767 if re.search('running', status) is not None:
768- return(True)
769+ return True
770 else:
771- return(False)
772+ return False
773 elif action in ('start', 'stop', 'reload', 'restart'):
774 retVal = subprocess.call([action, 'squid3'])
775 if retVal == 0:
776- return(True)
777+ return True
778 else:
779- return(False)
780+ return False
781
782
783 def update_nrpe_checks():
784- config_data = config_get()
785- try:
786- nagios_uid = pwd.getpwnam('nagios').pw_uid
787- nagios_gid = grp.getgrnam('nagios').gr_gid
788- except:
789- subprocess.call(['juju-log', "Nagios user not setup, exiting"])
790- return
791-
792- unit_name = os.environ['JUJU_UNIT_NAME'].replace('/', '-')
793- nrpe_check_file = '/etc/nagios/nrpe.d/check_squidrp.cfg'
794- nagios_hostname = "%s-%s" % (config_data['nagios_context'], unit_name)
795- nagios_logdir = '/var/log/nagios'
796- nagios_exportdir = '/var/lib/nagios/export'
797- nrpe_service_file = \
798- '/var/lib/nagios/export/service__%s_check_squidrp.cfg' % \
799- (nagios_hostname)
800- if not os.path.exists(nagios_logdir):
801- os.mkdir(nagios_logdir)
802- os.chown(nagios_logdir, nagios_uid, nagios_gid)
803- if not os.path.exists(nagios_exportdir):
804- subprocess.call(['juju-log', 'Exiting as %s is not accessible' %
805- (nagios_exportdir)])
806- return
807- for f in os.listdir(nagios_exportdir):
808- if re.search('.*check_squidrp.cfg', f):
809- os.remove(os.path.join(nagios_exportdir, f))
810- from jinja2 import Environment, FileSystemLoader
811- template_env = Environment(
812- loader=FileSystemLoader(os.path.join(os.environ['CHARM_DIR'], 'templates')))
813- templ_vars = {
814- 'nagios_hostname': nagios_hostname,
815- 'nagios_servicegroup': config_data['nagios_context'],
816- }
817- template = template_env.get_template('nrpe_service.template').\
818- render(templ_vars)
819- with open(nrpe_service_file, 'w') as nrpe_service_config:
820- nrpe_service_config.write(str(template))
821- with open(nrpe_check_file, 'w') as nrpe_check_config:
822- nrpe_check_config.write("# check squid\n")
823- nrpe_check_config.write(
824- "command[check_squidrp]=/usr/lib/nagios/plugins/check_http %s\n" %
825- (config_data['nagios_check_http_params']))
826- if os.path.isfile('/etc/init.d/nagios-nrpe-server'):
827- subprocess.call(['service', 'nagios-nrpe-server', 'reload'])
828+ nrpe_compat = NRPE()
829+ conf = nrpe_compat.config
830+ check_http_params = conf.get('nagios_check_http_params')
831+ if check_http_params:
832+ nrpe_compat.add_check(
833+ shortname='squidrp',
834+ description='Check Squid',
835+ check_cmd='check_http %s' % check_http_params
836+ )
837+ config_services_str = conf.get('services', '')
838+ config_services = yaml.safe_load(config_services_str) or ()
839+ for service in config_services:
840+ path = service.get('nrpe_check_path', '/')
841+ command = 'check_http -I 127.0.0.1 -p 3128 --method=HEAD '
842+ service_name = service['service_name']
843+ if conf.get('x_balancer_name_allowed'):
844+ command += "-u http://localhost%s -k \\'X-Balancer-Name: %s\\'" % (
845+ path, service_name)
846+ else:
847+ command += '-u http://%s%s' % (service_name, path)
848+ nrpe_compat.add_check(
849+ shortname='squid-%s' % service_name.replace(".", "_"),
850+ description='Check Squid for site %s' % service_name,
851+ check_cmd=command,
852+ )
853+ nrpe_compat.write()
854
855
856 ###############################################################################
857 # Hook functions
858 ###############################################################################
859 def install_hook():
860- for f in glob.glob('exec.d/*/charm-pre-install'):
861- if os.path.isfile(f) and os.access(f, os.X_OK):
862- subprocess.check_call(['sh', '-c', f])
863 if not os.path.exists(default_squid3_config_dir):
864 os.mkdir(default_squid3_config_dir, 0600)
865 if not os.path.exists(default_squid3_config_cache_dir):
866 os.mkdir(default_squid3_config_cache_dir, 0600)
867- shutil.copy2('%s/files/default.squid3' % (os.environ['CHARM_DIR']), '/etc/default/squid3')
868+ shutil.copy2('%s/files/default.squid3' % (
869+ os.environ['CHARM_DIR']), '/etc/default/squid3')
870 return (apt_get_install(
871 "squid3 python-jinja2") == enable_squid3() is not True)
872
873
874 def config_changed():
875- current_service_ports = get_service_ports()
876+ old_service_ports = get_service_ports()
877+ old_sitenames = get_sitenames()
878 construct_squid3_config()
879 update_nrpe_checks()
880
881 if service_squid3("check"):
882 updated_service_ports = get_service_ports()
883- update_service_ports(current_service_ports, updated_service_ports)
884+ update_service_ports(old_service_ports, updated_service_ports)
885 service_squid3("reload")
886+ if not (old_sitenames == get_sitenames()):
887+ notify_cached_website()
888 else:
889+ # XXX Ideally the config should be restored to a working state if the
890+ # check fails, otherwise an inadvertent reload will cause the service
891+ # to be broken.
892 sys.exit(1)
893
894
895 def start_hook():
896 if service_squid3("status"):
897- return(service_squid3("restart"))
898+ return service_squid3("restart")
899 else:
900- return(service_squid3("start"))
901+ return service_squid3("start")
902
903
904 def stop_hook():
905 if service_squid3("status"):
906- return(service_squid3("stop"))
907+ return service_squid3("stop")
908
909
910 def website_interface(hook_name=None):
911 if hook_name is None:
912- return(None)
913- if hook_name in ["joined", "changed", "broken", "departed"]:
914+ return
915+ if hook_name in ("joined", "changed", "broken", "departed"):
916 config_changed()
917
918
919 def cached_website_interface(hook_name=None):
920 if hook_name is None:
921- return(None)
922- if hook_name in ["joined", "changed"]:
923- sitenames = ' '.join(get_sitenames())
924- # passing only one port - the first one defined
925- subprocess.call(['relation-set', 'port=%s' % get_service_ports()[0],
926- 'sitenames=%s' % sitenames])
927+ return
928+ if hook_name in ("joined", "changed"):
929+ notify_cached_website(relation_ids=(None,))
930+
931+
932+def get_hostname(host=None):
933+ my_host = socket.gethostname()
934+ if host is None or host == "0.0.0.0":
935+ # If the listen ip has been set to 0.0.0.0 then pass back the hostname
936+ return socket.getfqdn(my_host)
937+ elif host == "localhost":
938+ # If the fqdn lookup has returned localhost (lxc setups) then return
939+ # hostname
940+ return my_host
941+ return host
942+
943+
944+def notify_cached_website(relation_ids=None):
945+ hostname = get_hostname()
946+ # passing only one port - the first one defined
947+ port = get_service_ports()[0]
948+ sitenames = ' '.join(get_sitenames())
949+
950+ for rid in relation_ids or get_relation_ids("cached-website"):
951+ relation_set(relation_id=rid, port=port,
952+ hostname=hostname,
953+ sitenames=sitenames)
954
955
956 ###############################################################################
957 # Main section
958 ###############################################################################
959-def main():
960+def main(hook_name):
961 if hook_name == "install":
962 install_hook()
963 elif hook_name == "config-changed":
964@@ -446,14 +454,21 @@
965 elif hook_name == "cached-website-relation-departed":
966 cached_website_interface("departed")
967
968- elif hook_name == "nrpe-external-master-relation-changed":
969+ elif hook_name in ("nrpe-external-master-relation-changed",
970+ "local-monitors-relation-changed"):
971 update_nrpe_checks()
972
973 elif hook_name == "env-dump":
974- print relation_get_all()
975+ print relations_of_type()
976 else:
977 print "Unknown hook"
978 sys.exit(1)
979
980 if __name__ == '__main__':
981- main()
982+ hook_name = os.path.basename(sys.argv[0])
983+ # Also support being invoked directly with hook as argument name.
984+ if hook_name == "hooks.py":
985+ if len(sys.argv) < 2:
986+ sys.exit("Missing required hook name argument.")
987+ hook_name = sys.argv[1]
988+ main(hook_name)
989
990=== modified symlink 'hooks/install' (properties changed: -x to +x)
991=== target was u'hooks.py'
992--- hooks/install 1970-01-01 00:00:00 +0000
993+++ hooks/install 2013-04-16 21:22:23 +0000
994@@ -0,0 +1,23 @@
995+#!/bin/sh
996+
997+set -eu
998+
999+apt_get_install() {
1000+ DEBIAN_FRONTEND=noninteractive apt-get -y -qq -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install $1
1001+}
1002+
1003+juju-log 'Invoking charm-pre-install hooks'
1004+[ -d exec.d ] && ( for f in exec.d/*/charm-pre-install; do [ -x $f ] && /bin/sh -c "$f"; done )
1005+
1006+juju-log 'Install dependencies'
1007+apt_get_install python-software-properties
1008+
1009+juju-log 'Add charm tools ppa'
1010+apt-add-repository -y ppa:ubuntuone/charm-tools
1011+apt-get -qq update
1012+
1013+juju-log 'Install charmsupport and dependencies'
1014+apt_get_install charm-tools python-charmsupport python-shelltoolbox
1015+
1016+juju-log 'Invoking python-based install hook'
1017+python hooks/hooks.py install
1018
1019=== added symlink 'hooks/local-monitors-relation-changed'
1020=== target is u'hooks.py'
1021=== modified file 'hooks/shelltoolbox/__init__.py'
1022--- hooks/shelltoolbox/__init__.py 2012-10-02 09:42:19 +0000
1023+++ hooks/shelltoolbox/__init__.py 2013-04-16 21:22:23 +0000
1024@@ -603,10 +603,12 @@
1025 def changed(self):
1026 return set(key for key in self.intersect
1027 if self.past_dict[key] != self.current_dict[key])
1028+
1029 @property
1030 def unchanged(self):
1031 return set(key for key in self.intersect
1032 if self.past_dict[key] == self.current_dict[key])
1033+
1034 @property
1035 def modified(self):
1036 return self.current_dict != self.past_dict
1037
1038=== added directory 'hooks/tests'
1039=== added file 'hooks/tests/__init__.py'
1040=== added file 'hooks/tests/test_cached_website_hooks.py'
1041--- hooks/tests/test_cached_website_hooks.py 1970-01-01 00:00:00 +0000
1042+++ hooks/tests/test_cached_website_hooks.py 2013-04-16 21:22:23 +0000
1043@@ -0,0 +1,95 @@
1044+from testtools import TestCase
1045+from mock import patch, call
1046+
1047+import hooks
1048+
1049+
1050+class CachedWebsiteRelationTest(TestCase):
1051+
1052+ def setUp(self):
1053+ super(CachedWebsiteRelationTest, self).setUp()
1054+ self.notify_cached_website = self.patch_hook("notify_cached_website")
1055+
1056+ def patch_hook(self, hook_name):
1057+ mock_controller = patch.object(hooks, hook_name)
1058+ mock = mock_controller.start()
1059+ self.addCleanup(mock_controller.stop)
1060+ return mock
1061+
1062+ def test_website_interface_none(self):
1063+ self.assertEqual(None, hooks.cached_website_interface(hook_name=None))
1064+ self.notify_cached_website.assert_not_called()
1065+
1066+ def test_website_interface_joined(self):
1067+ hooks.cached_website_interface(hook_name="joined")
1068+ self.notify_cached_website.assert_called_once_with(
1069+ relation_ids=(None,))
1070+
1071+ def test_website_interface_changed(self):
1072+ hooks.cached_website_interface(hook_name="changed")
1073+ self.notify_cached_website.assert_called_once_with(
1074+ relation_ids=(None,))
1075+
1076+
1077+class NotifyCachedWebsiteRelationTest(TestCase):
1078+
1079+ def setUp(self):
1080+ super(NotifyCachedWebsiteRelationTest, self).setUp()
1081+
1082+ self.get_service_ports = self.patch_hook("get_service_ports")
1083+ self.get_sitenames = self.patch_hook("get_sitenames")
1084+ self.get_relation_ids = self.patch_hook("get_relation_ids")
1085+ self.get_hostname = self.patch_hook("get_hostname")
1086+ self.relation_set = self.patch_hook("relation_set")
1087+
1088+ def patch_hook(self, hook_name):
1089+ mock_controller = patch.object(hooks, hook_name)
1090+ mock = mock_controller.start()
1091+ self.addCleanup(mock_controller.stop)
1092+ return mock
1093+
1094+ def test_notify_cached_website_no_relation_ids(self):
1095+ self.get_relation_ids.return_value = ()
1096+
1097+ hooks.notify_cached_website()
1098+
1099+ self.relation_set.assert_not_called()
1100+ self.get_relation_ids.assert_called_once_with(
1101+ "cached-website")
1102+
1103+ def test_notify_cached_website_with_default_relation(self):
1104+ self.get_relation_ids.return_value = ()
1105+ self.get_hostname.return_value = "foo.local"
1106+ self.get_service_ports.return_value = (3128,)
1107+ self.get_sitenames.return_value = ("foo.internal", "bar.internal")
1108+
1109+ hooks.notify_cached_website(relation_ids=(None,))
1110+
1111+ self.get_hostname.assert_called_once_with()
1112+ self.relation_set.assert_called_once_with(
1113+ relation_id=None, port=3128,
1114+ hostname="foo.local",
1115+ sitenames="foo.internal bar.internal")
1116+ self.get_relation_ids.assert_not_called()
1117+
1118+ def test_notify_cached_website_with_relations(self):
1119+ self.get_relation_ids.return_value = ("cached-website:1",
1120+ "cached-website:2")
1121+ self.get_hostname.return_value = "foo.local"
1122+ self.get_service_ports.return_value = (3128,)
1123+ self.get_sitenames.return_value = ("foo.internal", "bar.internal")
1124+
1125+ hooks.notify_cached_website()
1126+
1127+ self.get_hostname.assert_called_once_with()
1128+ self.get_relation_ids.assert_called_once_with("cached-website")
1129+ self.relation_set.assert_has_calls([
1130+ call.relation_set(
1131+ relation_id="cached-website:1", port=3128,
1132+ hostname="foo.local",
1133+ sitenames="foo.internal bar.internal"),
1134+ call.relation_set(
1135+ relation_id="cached-website:2", port=3128,
1136+ hostname="foo.local",
1137+ sitenames="foo.internal bar.internal"),
1138+ ])
1139
1140=== added file 'hooks/tests/test_config_changed_hooks.py'
1141--- hooks/tests/test_config_changed_hooks.py 1970-01-01 00:00:00 +0000
1142+++ hooks/tests/test_config_changed_hooks.py 2013-04-16 21:22:23 +0000
1143@@ -0,0 +1,58 @@
1144+from testtools import TestCase
1145+from mock import patch
1146+
1147+import hooks
1148+
1149+
1150+class ConfigChangedTest(TestCase):
1151+
1152+ def setUp(self):
1153+ super(ConfigChangedTest, self).setUp()
1154+ self.get_service_ports = self.patch_hook("get_service_ports")
1155+ self.get_sitenames = self.patch_hook("get_sitenames")
1156+ self.construct_squid3_config = self.patch_hook(
1157+ "construct_squid3_config")
1158+ self.update_nrpe_checks = self.patch_hook(
1159+ "update_nrpe_checks")
1160+ self.update_service_ports = self.patch_hook(
1161+ "update_service_ports")
1162+ self.service_squid3 = self.patch_hook(
1163+ "service_squid3")
1164+ self.notify_cached_website = self.patch_hook("notify_cached_website")
1165+
1166+ def patch_hook(self, hook_name):
1167+ mock_controller = patch.object(hooks, hook_name)
1168+ mock = mock_controller.start()
1169+ self.addCleanup(mock_controller.stop)
1170+ return mock
1171+
1172+ def test_config_changed_notify_cached_website_changed_stanzas(self):
1173+ self.service_squid3.return_value = True
1174+ self.get_sitenames.side_effect = (
1175+ ('foo.internal',),
1176+ ('foo.internal', 'bar.internal'))
1177+
1178+ hooks.config_changed()
1179+
1180+ self.notify_cached_website.assert_called_once_with()
1181+
1182+ def test_config_changed_no_notify_cached_website_not_changed(self):
1183+ self.service_squid3.return_value = True
1184+ self.get_sitenames.side_effect = (
1185+ ('foo.internal',),
1186+ ('foo.internal',))
1187+
1188+ hooks.config_changed()
1189+
1190+ self.notify_cached_website.assert_not_called()
1191+
1192+ @patch("sys.exit")
1193+ def test_config_changed_no_notify_cached_website_failed_check(self, exit):
1194+ self.service_squid3.return_value = False
1195+ self.get_sitenames.side_effect = (
1196+ ('foo.internal',),
1197+ ('foo.internal', 'bar.internal'))
1198+
1199+ hooks.config_changed()
1200+
1201+ self.notify_cached_website.assert_not_called()
1202
1203=== added file 'hooks/tests/test_helpers.py'
1204--- hooks/tests/test_helpers.py 1970-01-01 00:00:00 +0000
1205+++ hooks/tests/test_helpers.py 2013-04-16 21:22:23 +0000
1206@@ -0,0 +1,273 @@
1207+import os
1208+from os.path import dirname
1209+
1210+from testtools import TestCase
1211+from testtools.matchers import AfterPreprocessing, Contains
1212+from mock import patch
1213+
1214+import hooks
1215+from charmsupport.hookenv import Serializable
1216+
1217+
1218+def normalize_whitespace(data):
1219+ return ' '.join(chunk for chunk in data.split())
1220+
1221+
1222+@patch.dict(os.environ, {"CHARM_DIR": dirname(dirname(dirname(__file__)))})
1223+class SquidConfigTest(TestCase):
1224+
1225+ def _assert_contents(self, *expected):
1226+ def side_effect(got):
1227+ for chunk in expected:
1228+ self.assertThat(got, AfterPreprocessing(
1229+ normalize_whitespace,
1230+ Contains(normalize_whitespace(chunk))))
1231+ return side_effect
1232+
1233+ @patch("hooks.write_squid3_config")
1234+ @patch("hooks.get_relation_sites")
1235+ @patch("hooks.config_get")
1236+ def test_squid_config_no_sites(self, config_get, get_relation_sites,
1237+ write_squid3_config):
1238+ config_get.return_value = Serializable({
1239+ "refresh_patterns": "",
1240+ "cache_size_mb": 1024,
1241+ "target_objs_per_dir": 1024,
1242+ "avg_obj_size_kb": 1024,
1243+ })
1244+ get_relation_sites.return_value = None
1245+ hooks.construct_squid3_config()
1246+ write_squid3_config.side_effect = self._assert_contents(
1247+ """
1248+ always_direct deny all
1249+ """,
1250+ )
1251+
1252+ @patch("hooks.write_squid3_config")
1253+ @patch("hooks.get_relation_sites")
1254+ @patch("hooks.config_get")
1255+ def test_squid_config_no_sitenames(self, config_get, get_relation_sites,
1256+ write_squid3_config):
1257+ config_get.return_value = Serializable({
1258+ "refresh_patterns": "",
1259+ "cache_size_mb": 1024,
1260+ "target_objs_per_dir": 1024,
1261+ "avg_obj_size_kb": 1024,
1262+ })
1263+ get_relation_sites.return_value = {
1264+ None: [
1265+ hooks.Server("website_1__foo_1", "1.2.3.4", 4242, ''),
1266+ hooks.Server("website_1__foo_2", "1.2.3.5", 4242, ''),
1267+ ],
1268+ }
1269+ write_squid3_config.side_effect = self._assert_contents(
1270+ """
1271+ acl no_sitename_acl myport
1272+ http_access allow accel_ports no_sitename_acl
1273+ never_direct allow no_sitename_acl
1274+ """,
1275+ """
1276+ cache_peer 1.2.3.4 parent 4242 0 name=website_1__foo_1 no-query
1277+ no-digest originserver round-robin login=PASS
1278+ cache_peer_access website_1__foo_1 allow no_sitename_acl
1279+ cache_peer_access website_1__foo_1 deny all
1280+ """,
1281+ """
1282+ cache_peer 1.2.3.5 parent 4242 0 name=website_1__foo_2 no-query
1283+ no-digest originserver round-robin login=PASS
1284+ cache_peer_access website_1__foo_2 allow no_sitename_acl
1285+ cache_peer_access website_1__foo_2 deny all
1286+ """
1287+ )
1288+ hooks.construct_squid3_config()
1289+
1290+ @patch("hooks.write_squid3_config")
1291+ @patch("hooks.get_relation_sites")
1292+ @patch("hooks.config_get")
1293+ def test_squid_config_with_domain(self, config_get, get_relation_sites,
1294+ write_squid3_config):
1295+ config_get.return_value = Serializable({
1296+ "refresh_patterns": "",
1297+ "cache_size_mb": 1024,
1298+ "target_objs_per_dir": 1024,
1299+ "avg_obj_size_kb": 1024,
1300+ })
1301+ get_relation_sites.return_value = {
1302+ "foo.com": [
1303+ hooks.Server("foo_com__website_1__foo_1",
1304+ "1.2.3.4", 4242, "forceddomain=example.com"),
1305+ hooks.Server("foo_com__website_1__foo_2",
1306+ "1.2.3.5", 4242, "forceddomain=example.com"),
1307+ ],
1308+ }
1309+ write_squid3_config.side_effect = self._assert_contents(
1310+ """
1311+ acl s_1_acl dstdomain foo.com
1312+ http_access allow accel_ports s_1_acl
1313+ http_access allow CONNECT SSL_ports s_1_acl
1314+ always_direct allow SSL_ports s_1_acl
1315+ never_direct allow s_1_acl
1316+ """,
1317+ """
1318+ cache_peer 1.2.3.4 parent 4242 0 name=foo_com__website_1__foo_1
1319+ no-query no-digest originserver round-robin login=PASS
1320+ forceddomain=example.com
1321+ cache_peer_access foo_com__website_1__foo_1 allow s_1_acl
1322+ cache_peer_access foo_com__website_1__foo_1 deny all
1323+ """,
1324+ """
1325+ cache_peer 1.2.3.5 parent 4242 0 name=foo_com__website_1__foo_2
1326+ no-query no-digest originserver round-robin login=PASS
1327+ forceddomain=example.com
1328+ cache_peer_access foo_com__website_1__foo_2 allow s_1_acl
1329+ cache_peer_access foo_com__website_1__foo_2 deny all
1330+ """
1331+ )
1332+ hooks.construct_squid3_config()
1333+
1334+ @patch("hooks.write_squid3_config")
1335+ @patch("hooks.get_relation_sites")
1336+ @patch("hooks.config_get")
1337+ def test_with_domain_no_servers_only_direct(self, config_get,
1338+ get_relation_sites,
1339+ write_squid3_config):
1340+ config_get.return_value = Serializable({
1341+ "refresh_patterns": "",
1342+ "cache_size_mb": 1024,
1343+ "target_objs_per_dir": 1024,
1344+ "avg_obj_size_kb": 1024,
1345+ })
1346+ get_relation_sites.return_value = {
1347+ "foo.com": [
1348+ ],
1349+ }
1350+ write_squid3_config.side_effect = self._assert_contents(
1351+ """
1352+ acl s_1_acl dstdomain foo.com
1353+ http_access allow accel_ports s_1_acl
1354+ http_access allow CONNECT SSL_ports s_1_acl
1355+ always_direct allow s_1_acl
1356+ """,
1357+ )
1358+ hooks.construct_squid3_config()
1359+
1360+ @patch("hooks.write_squid3_config")
1361+ @patch("hooks.get_relation_sites")
1362+ @patch("hooks.config_get")
1363+ def test_with_balancer_no_servers_only_direct(self, config_get,
1364+ get_relation_sites,
1365+ write_squid3_config):
1366+ config_get.return_value = Serializable({
1367+ "refresh_patterns": "",
1368+ "cache_size_mb": 1024,
1369+ "target_objs_per_dir": 1024,
1370+ "avg_obj_size_kb": 1024,
1371+ "x_balancer_name_allowed": True,
1372+ })
1373+ get_relation_sites.return_value = {
1374+ "foo.com": [
1375+ ],
1376+ }
1377+ write_squid3_config.side_effect = self._assert_contents(
1378+ """
1379+ acl s_1_acl dstdomain foo.com
1380+ http_access allow accel_ports s_1_acl
1381+ http_access allow CONNECT SSL_ports s_1_acl
1382+ always_direct allow s_1_acl
1383+ """,
1384+ """
1385+ acl s_1_balancer req_header X-Balancer-Name foo\.com
1386+ http_access allow accel_ports s_1_balancer
1387+ http_access allow CONNECT SSL_ports s_1_balancer
1388+ always_direct allow s_1_balancer
1389+ """,
1390+ )
1391+ hooks.construct_squid3_config()
1392+
1393+ @patch("hooks.write_squid3_config")
1394+ @patch("hooks.get_relation_sites")
1395+ @patch("hooks.config_get")
1396+ def test_with_balancer_name(self, config_get, get_relation_sites,
1397+ write_squid3_config):
1398+ config_get.return_value = Serializable({
1399+ "refresh_patterns": "",
1400+ "cache_size_mb": 1024,
1401+ "target_objs_per_dir": 1024,
1402+ "avg_obj_size_kb": 1024,
1403+ "x_balancer_name_allowed": True,
1404+ })
1405+ get_relation_sites.return_value = {
1406+ "foo.com": [
1407+ hooks.Server("foo_com__website_1__foo_1",
1408+ "1.2.3.4", 4242, ''),
1409+ hooks.Server("foo_com__website_1__foo_2",
1410+ "1.2.3.5", 4242, ''),
1411+ ],
1412+ }
1413+ write_squid3_config.side_effect = self._assert_contents(
1414+ """
1415+ acl s_1_acl dstdomain foo.com
1416+ http_access allow accel_ports s_1_acl
1417+ http_access allow CONNECT SSL_ports s_1_acl
1418+ always_direct allow SSL_ports s_1_acl
1419+ never_direct allow s_1_acl
1420+ """,
1421+ """
1422+ acl s_1_balancer req_header X-Balancer-Name foo\.com
1423+ http_access allow accel_ports s_1_balancer
1424+ http_access allow CONNECT SSL_ports s_1_balancer
1425+ always_direct allow SSL_ports s_1_balancer
1426+ never_direct allow s_1_balancer
1427+ """,
1428+ """
1429+ cache_peer 1.2.3.4 parent 4242 0 name=foo_com__website_1__foo_1
1430+ no-query no-digest originserver round-robin login=PASS
1431+ cache_peer_access foo_com__website_1__foo_1 allow s_1_acl
1432+ cache_peer_access foo_com__website_1__foo_1 allow s_1_balancer
1433+ cache_peer_access foo_com__website_1__foo_1 deny all
1434+ """,
1435+ """
1436+ cache_peer 1.2.3.5 parent 4242 0 name=foo_com__website_1__foo_2
1437+ no-query no-digest originserver round-robin login=PASS
1438+ cache_peer_access foo_com__website_1__foo_2 allow s_1_acl
1439+ """
1440+ )
1441+ hooks.construct_squid3_config()
1442+
1443+
1444+class HelpersTest(TestCase):
1445+ def test_gets_config(self):
1446+ json_string = '{"foo": "BAR"}'
1447+ with patch('subprocess.check_output') as check_output:
1448+ check_output.return_value = json_string
1449+
1450+ result = hooks.config_get()
1451+
1452+ self.assertEqual(result['foo'], 'BAR')
1453+ check_output.assert_called_with(['config-get', '--format=json'])
1454+
1455+ def test_gets_config_with_scope(self):
1456+ json_string = '{"foo": "BAR"}'
1457+ with patch('subprocess.check_output') as check_output:
1458+ check_output.return_value = json_string
1459+
1460+ result = hooks.config_get(scope='baz')
1461+
1462+ self.assertEqual(result['foo'], 'BAR')
1463+ check_output.assert_called_with(['config-get', 'baz',
1464+ '--format=json'])
1465+
1466+ @patch('subprocess.call')
1467+ def test_installs_packages(self, mock_call):
1468+ mock_call.return_value = 'some result'
1469+
1470+ result = hooks.apt_get_install('foo bar')
1471+
1472+ self.assertEqual(result, 'some result')
1473+ mock_call.assert_called_with(['apt-get', '-y', 'install', '-qq',
1474+ 'foo', 'bar'])
1475+
1476+ @patch('subprocess.call')
1477+ def test_installs_nothing_if_package_not_provided(self, mock_call):
1478+ self.assertFalse(hooks.apt_get_install())
1479+ self.assertFalse(mock_call.called)
1480
1481=== added file 'hooks/tests/test_nrpe_hooks.py'
1482--- hooks/tests/test_nrpe_hooks.py 1970-01-01 00:00:00 +0000
1483+++ hooks/tests/test_nrpe_hooks.py 2013-04-16 21:22:23 +0000
1484@@ -0,0 +1,217 @@
1485+import os
1486+import grp
1487+import pwd
1488+import subprocess
1489+from testtools import TestCase
1490+from mock import patch, call
1491+
1492+import hooks
1493+from charmsupport import nrpe
1494+from charmsupport.hookenv import Serializable
1495+
1496+
1497+class NRPERelationTest(TestCase):
1498+ """Tests for the update_nrpe_checks hook.
1499+
1500+ Half of this is already tested in the tests for charmsupport.nrpe, but
1501+ as the hook in the charm pre-dates that, the tests are left here to ensure
1502+ backwards-compatibility.
1503+
1504+ """
1505+ patches = {
1506+ 'config': {'object': nrpe},
1507+ 'log': {'object': nrpe},
1508+ 'getpwnam': {'object': pwd},
1509+ 'getgrnam': {'object': grp},
1510+ 'mkdir': {'object': os},
1511+ 'chown': {'object': os},
1512+ 'exists': {'object': os.path},
1513+ 'listdir': {'object': os},
1514+ 'remove': {'object': os},
1515+ 'open': {'object': nrpe, 'create': True},
1516+ 'isfile': {'object': os.path},
1517+ 'call': {'object': subprocess},
1518+ 'relation_ids': {'object': nrpe},
1519+ 'relation_set': {'object': nrpe},
1520+ }
1521+
1522+ def setUp(self):
1523+ super(NRPERelationTest, self).setUp()
1524+ self.patched = {}
1525+ # Mock the universe.
1526+ for attr, data in self.patches.items():
1527+ create = data.get('create', False)
1528+ patcher = patch.object(data['object'], attr, create=create)
1529+ self.patched[attr] = patcher.start()
1530+ self.addCleanup(patcher.stop)
1531+        if 'JUJU_UNIT_NAME' not in os.environ:
1532+ os.environ['JUJU_UNIT_NAME'] = 'test'
1533+
1534+ def check_call_counts(self, **kwargs):
1535+ for attr, expected in kwargs.items():
1536+ patcher = self.patched[attr]
1537+ self.assertEqual(expected, patcher.call_count, attr)
1538+
1539+ def test_update_nrpe_no_nagios_bails(self):
1540+ config = {'nagios_context': 'test'}
1541+ self.patched['config'].return_value = Serializable(config)
1542+ self.patched['getpwnam'].side_effect = KeyError
1543+
1544+ self.assertEqual(None, hooks.update_nrpe_checks())
1545+
1546+ expected = 'Nagios user not set up, nrpe checks not updated'
1547+ self.patched['log'].assert_called_once_with(expected)
1548+ self.check_call_counts(log=1, config=1, getpwnam=1)
1549+
1550+ def test_update_nrpe_removes_existing_config(self):
1551+ config = {
1552+ 'nagios_context': 'test',
1553+ 'nagios_check_http_params': '-u http://example.com/url',
1554+ }
1555+ self.patched['config'].return_value = Serializable(config)
1556+ self.patched['exists'].return_value = True
1557+ self.patched['listdir'].return_value = [
1558+ 'foo', 'bar.cfg', 'check_squidrp.cfg']
1559+
1560+ self.assertEqual(None, hooks.update_nrpe_checks())
1561+
1562+ expected = '/var/lib/nagios/export/check_squidrp.cfg'
1563+ self.patched['remove'].assert_called_once_with(expected)
1564+ self.check_call_counts(config=1, getpwnam=1, getgrnam=1,
1565+ exists=4, remove=1, open=2, listdir=1)
1566+
1567+ def test_update_nrpe_with_check_url(self):
1568+ config = {
1569+ 'nagios_context': 'test',
1570+ 'nagios_check_http_params': '-u foo -H bar',
1571+ }
1572+ self.patched['config'].return_value = Serializable(config)
1573+ self.patched['exists'].return_value = True
1574+ self.patched['isfile'].return_value = False
1575+
1576+ self.assertEqual(None, hooks.update_nrpe_checks())
1577+ self.assertEqual(2, self.patched['open'].call_count)
1578+ filename = 'check_squidrp.cfg'
1579+
1580+ service_file_contents = """
1581+#---------------------------------------------------
1582+# This file is Juju managed
1583+#---------------------------------------------------
1584+define service {
1585+ use active-service
1586+ host_name test-test
1587+ service_description test-test[squidrp] Check Squid
1588+ check_command check_nrpe!check_squidrp
1589+ servicegroups test
1590+}
1591+"""
1592+ self.patched['open'].assert_has_calls(
1593+ [call('/etc/nagios/nrpe.d/%s' % filename, 'w'),
1594+ call('/var/lib/nagios/export/service__test-test_%s' %
1595+ filename, 'w'),
1596+ call().__enter__().write(service_file_contents),
1597+ call().__enter__().write('# check squidrp\n'),
1598+ call().__enter__().write(
1599+ 'command[check_squidrp]=/check_http -u foo -H bar\n')],
1600+ any_order=True)
1601+
1602+ self.check_call_counts(config=1, getpwnam=1, getgrnam=1,
1603+ exists=4, open=2, listdir=1)
1604+
1605+ def test_update_nrpe_with_services_and_host_header(self):
1606+ config = {
1607+ 'nagios_context': 'test',
1608+ 'services': '- {service_name: i_ytimg_com}',
1609+ }
1610+ self.patched['config'].return_value = Serializable(config)
1611+ self.patched['exists'].return_value = True
1612+
1613+ self.assertEqual(None, hooks.update_nrpe_checks())
1614+
1615+ self.check_call_counts(config=1, getpwnam=1, getgrnam=1,
1616+ exists=4, open=2, listdir=1)
1617+ expected = ('command[check_squid-i_ytimg_com]=/check_http '
1618+ '-I 127.0.0.1 -p 3128 --method=HEAD '
1619+ '-u http://i_ytimg_com/\n')
1620+ self.patched['open'].assert_has_calls(
1621+ [call('/etc/nagios/nrpe.d/check_squid-i_ytimg_com.cfg', 'w'),
1622+ call().__enter__().write(expected)],
1623+ any_order=True)
1624+
1625+ def test_update_nrpe_with_dotted_service_name_and_host_header(self):
1626+ config = {
1627+ 'nagios_context': 'test',
1628+ 'services': '- {service_name: i.ytimg.com}',
1629+ }
1630+ self.patched['config'].return_value = Serializable(config)
1631+ self.patched['exists'].return_value = True
1632+
1633+ self.assertEqual(None, hooks.update_nrpe_checks())
1634+
1635+ self.check_call_counts(config=1, getpwnam=1, getgrnam=1,
1636+ exists=4, open=2, listdir=1)
1637+ expected = ('command[check_squid-i_ytimg_com]=/check_http '
1638+ '-I 127.0.0.1 -p 3128 --method=HEAD '
1639+ '-u http://i.ytimg.com/\n')
1640+ self.patched['open'].assert_has_calls(
1641+ [call('/etc/nagios/nrpe.d/check_squid-i_ytimg_com.cfg', 'w'),
1642+ call().__enter__().write(expected)],
1643+ any_order=True)
1644+
1645+ def test_update_nrpe_with_services_and_balancer_name_header(self):
1646+ config = {
1647+ 'nagios_context': 'test',
1648+ 'x_balancer_name_allowed': True,
1649+ 'services': '- {service_name: i_ytimg_com}',
1650+ }
1651+ self.patched['config'].return_value = Serializable(config)
1652+ self.patched['exists'].return_value = True
1653+
1654+ self.assertEqual(None, hooks.update_nrpe_checks())
1655+
1656+ self.check_call_counts(config=1, getpwnam=1, getgrnam=1,
1657+ exists=4, open=2, listdir=1)
1658+
1659+ expected = ('command[check_squid-i_ytimg_com]=/check_http '
1660+ '-I 127.0.0.1 -p 3128 --method=HEAD -u http://localhost/ '
1661+ "-k 'X-Balancer-Name: i_ytimg_com'\n")
1662+ self.patched['open'].assert_has_calls(
1663+ [call('/etc/nagios/nrpe.d/check_squid-i_ytimg_com.cfg', 'w'),
1664+ call().__enter__().write(expected)],
1665+ any_order=True)
1666+
1667+ def test_update_nrpe_with_services_and_optional_path(self):
1668+ services = '- {nrpe_check_path: /foo.jpg, service_name: foo_com}\n'
1669+ config = {
1670+ 'nagios_context': 'test',
1671+ 'services': services,
1672+ }
1673+ self.patched['config'].return_value = Serializable(config)
1674+ self.patched['exists'].return_value = True
1675+
1676+ self.assertEqual(None, hooks.update_nrpe_checks())
1677+
1678+ self.check_call_counts(config=1, getpwnam=1, getgrnam=1,
1679+ exists=4, open=2, listdir=1)
1680+ expected = ('command[check_squid-foo_com]=/check_http '
1681+ '-I 127.0.0.1 -p 3128 --method=HEAD '
1682+ '-u http://foo_com/foo.jpg\n')
1683+ self.patched['open'].assert_has_calls(
1684+ [call('/etc/nagios/nrpe.d/check_squid-foo_com.cfg', 'w'),
1685+ call().__enter__().write(expected)],
1686+ any_order=True)
1687+
1688+ def test_update_nrpe_restarts_service(self):
1689+ config = {
1690+ 'nagios_context': 'test',
1691+ 'nagios_check_http_params': '-u foo -p 3128'
1692+ }
1693+ self.patched['config'].return_value = Serializable(config)
1694+ self.patched['exists'].return_value = True
1695+
1696+ self.assertEqual(None, hooks.update_nrpe_checks())
1697+
1698+ expected = ['initctl', 'restart', 'nagios-nrpe-server']
1699+ self.assertEqual(expected, self.patched['call'].call_args[0][0])
1700+ self.check_call_counts(config=1, getpwnam=1, getgrnam=1,
1701+ exists=4, open=2, listdir=1, call=1)
1702
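
The command strings asserted above follow a simple pattern: the service_name from the 'services' config is slugified into the check name, and the probe either targets the site name directly or, when x_balancer_name_allowed is set, targets localhost with an X-Balancer-Name header. A rough sketch of that mapping, using illustrative names only (not the charm's implementation):

    import yaml

    def build_check_commands(services_yaml, x_balancer_name_allowed=False,
                             squid_port=3128):
        # Reproduce the command[check_squid-*] lines asserted in the tests.
        commands = {}
        for service in yaml.safe_load(services_yaml):
            name = service['service_name'].replace('.', '_')
            path = service.get('nrpe_check_path', '/')
            if x_balancer_name_allowed:
                url = ("-u http://localhost%s -k 'X-Balancer-Name: %s'"
                       % (path, name))
            else:
                url = '-u http://%s%s' % (service['service_name'], path)
            commands['check_squid-%s' % name] = (
                '/check_http -I 127.0.0.1 -p %d --method=HEAD %s'
                % (squid_port, url))
        return commands

    # build_check_commands('- {nrpe_check_path: /foo.jpg, service_name: foo_com}')
    # == {'check_squid-foo_com': '/check_http -I 127.0.0.1 -p 3128 '
    #                            '--method=HEAD -u http://foo_com/foo.jpg'}
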
1703=== added file 'hooks/tests/test_website_hooks.py'
1704--- hooks/tests/test_website_hooks.py 1970-01-01 00:00:00 +0000
1705+++ hooks/tests/test_website_hooks.py 2013-04-16 21:22:23 +0000
1706@@ -0,0 +1,267 @@
1707+import yaml
1708+
1709+from testtools import TestCase
1710+from mock import patch
1711+
1712+import hooks
1713+from charmsupport.hookenv import Serializable
1714+
1715+
1716+class WebsiteRelationTest(TestCase):
1717+
1718+ def setUp(self):
1719+ super(WebsiteRelationTest, self).setUp()
1720+
1721+ relations_of_type = patch.object(hooks, "relations_of_type")
1722+ self.relations_of_type = relations_of_type.start()
1723+ self.addCleanup(relations_of_type.stop)
1724+
1725+ config_get = patch.object(hooks, "config_get")
1726+ self.config_get = config_get.start()
1727+ self.addCleanup(config_get.stop)
1728+
1729+ log = patch.object(hooks, "log")
1730+ self.log = log.start()
1731+ self.addCleanup(log.stop)
1732+
1733+ def test_relation_data_returns_no_relations(self):
1734+ self.config_get.return_value = Serializable({})
1735+ self.relations_of_type.return_value = []
1736+ self.assertEqual(None, hooks.get_relation_sites())
1737+
1738+ def test_no_port_in_relation_data(self):
1739+ self.config_get.return_value = Serializable({})
1740+ self.relations_of_type.return_value = [
1741+ {"private-address": "1.2.3.4",
1742+ "__unit__": "foo/1",
1743+ "__relid__": "website:1"},
1744+ ]
1745+ self.assertIs(None, hooks.get_relation_sites())
1746+ self.log.assert_called_once_with(
1747+ "No port in relation data for 'foo/1', skipping.")
1748+
1749+ def test_empty_relation_services(self):
1750+ self.config_get.return_value = Serializable({})
1751+ self.relations_of_type.return_value = [
1752+ {"private-address": "1.2.3.4",
1753+ "__unit__": "foo/1",
1754+ "__relid__": "website:1",
1755+ "all_services": "",
1756+ },
1757+ ]
1758+
1759+ self.assertEqual(None, hooks.get_relation_sites())
1760+        self.assertFalse(self.log.called)
1761+
1762+ def test_no_port_in_relation_data_ok_with_all_services(self):
1763+ self.config_get.return_value = Serializable({})
1764+ self.relations_of_type.return_value = [
1765+ {"private-address": "1.2.3.4",
1766+ "__unit__": "foo/1",
1767+ "__relid__": "website:1",
1768+ "all_services": yaml.dump([
1769+ {"service_name": "foo.internal",
1770+ "service_port": 4242},
1771+ ]),
1772+ },
1773+ ]
1774+
1775+ expected = {
1776+ "foo.internal": [
1777+ ("foo_internal__website_1__foo_1", "1.2.3.4", 4242, ''),
1778+ ],
1779+ }
1780+ self.assertEqual(expected, hooks.get_relation_sites())
1781+        self.assertFalse(self.log.called)
1782+
1783+ def test_no_private_address_in_relation_data(self):
1784+ self.config_get.return_value = Serializable({})
1785+ self.relations_of_type.return_value = [
1786+ {"port": 4242,
1787+ "__unit__": "foo/0",
1788+ "__relid__": "website:1",
1789+ },
1790+ ]
1791+ self.assertIs(None, hooks.get_relation_sites())
1792+ self.log.assert_called_once_with(
1793+ "No private-address in relation data for 'foo/0', skipping.")
1794+
1795+ def test_sitenames_in_relation_data(self):
1796+ self.config_get.return_value = Serializable({})
1797+ self.relations_of_type.return_value = [
1798+ {"private-address": "1.2.3.4",
1799+ "port": 4242,
1800+ "__unit__": "foo/1",
1801+ "__relid__": "website:1",
1802+ "sitenames": "foo.internal bar.internal"},
1803+ {"private-address": "1.2.3.5",
1804+ "port": 4242,
1805+ "__unit__": "foo/2",
1806+ "__relid__": "website:1",
1807+ "sitenames": "foo.internal bar.internal"},
1808+ ]
1809+ expected = {
1810+ "foo.internal": [
1811+ ("foo_internal__website_1__foo_1", "1.2.3.4", 4242, ''),
1812+ ("foo_internal__website_1__foo_2", "1.2.3.5", 4242, ''),
1813+ ],
1814+ "bar.internal": [
1815+ ("bar_internal__website_1__foo_1", "1.2.3.4", 4242, ''),
1816+ ("bar_internal__website_1__foo_2", "1.2.3.5", 4242, ''),
1817+ ],
1818+ }
1819+ self.assertEqual(expected, hooks.get_relation_sites())
1820+
1821+ def test_all_services_in_relation_data(self):
1822+ self.config_get.return_value = Serializable({})
1823+ self.relations_of_type.return_value = [
1824+ {"private-address": "1.2.3.4",
1825+ "__unit__": "foo/1",
1826+ "__relid__": "website:1",
1827+ "all_services": yaml.dump([
1828+ {"service_name": "foo.internal",
1829+ "service_port": 4242},
1830+ {"service_name": "bar.internal",
1831+ "service_port": 4243}
1832+ ]),
1833+ },
1834+ {"private-address": "1.2.3.5",
1835+ "__unit__": "foo/2",
1836+ "__relid__": "website:1",
1837+ "all_services": yaml.dump([
1838+ {"service_name": "foo.internal",
1839+ "service_port": 4242},
1840+ {"service_name": "bar.internal",
1841+ "service_port": 4243}
1842+ ]),
1843+ },
1844+ ]
1845+ expected = {
1846+ "foo.internal": [
1847+ ("foo_internal__website_1__foo_1", "1.2.3.4", 4242, ''),
1848+ ("foo_internal__website_1__foo_2", "1.2.3.5", 4242, ''),
1849+ ],
1850+ "bar.internal": [
1851+ ("bar_internal__website_1__foo_1", "1.2.3.4", 4243, ''),
1852+ ("bar_internal__website_1__foo_2", "1.2.3.5", 4243, ''),
1853+ ],
1854+ }
1855+ self.assertEqual(expected, hooks.get_relation_sites())
1856+
1857+ def test_unit_names_in_relation_data(self):
1858+ self.config_get.return_value = Serializable({})
1859+ self.relations_of_type.return_value = [
1860+ {"private-address": "1.2.3.4",
1861+ "__relid__": "website_1",
1862+ "__unit__": "foo/1",
1863+ "port": 4242},
1864+ {"private-address": "1.2.3.5",
1865+ "__relid__": "website_1",
1866+ "__unit__": "foo/2",
1867+ "port": 4242},
1868+ ]
1869+ expected = {
1870+ None: [
1871+ ("website_1__foo_1", "1.2.3.4", 4242, ''),
1872+ ("website_1__foo_2", "1.2.3.5", 4242, ''),
1873+ ],
1874+ }
1875+ self.assertEqual(expected, hooks.get_relation_sites())
1876+
1877+ def test_sites_from_config_no_domain(self):
1878+ self.relations_of_type.return_value = []
1879+ self.config_get.return_value = Serializable({
1880+ "services": yaml.dump([
1881+ {"service_name": "foo.internal",
1882+ "servers": [
1883+ ["1.2.3.4", 4242],
1884+ ["1.2.3.5", 4242],
1885+ ]
1886+ }
1887+ ])})
1888+ expected = {
1889+ "foo.internal": [
1890+ ("foo_internal__1_2_3_4", "1.2.3.4", 4242, ''),
1891+ ("foo_internal__1_2_3_5", "1.2.3.5", 4242, ''),
1892+ ],
1893+ }
1894+ self.assertEqual(expected, hooks.get_relation_sites())
1895+
1896+ def test_sites_from_config_with_domain(self):
1897+ self.relations_of_type.return_value = []
1898+ self.config_get.return_value = Serializable({
1899+ "services": yaml.dump([
1900+ {"service_name": "foo.internal",
1901+ "server_options": ["forceddomain=example.com"],
1902+ "servers": [
1903+ ["1.2.3.4", 4242],
1904+ ["1.2.3.5", 4242],
1905+ ]
1906+ }
1907+ ])})
1908+ expected = {
1909+ "foo.internal": [
1910+ ("foo_internal__1_2_3_4", "1.2.3.4", 4242,
1911+ "forceddomain=example.com"),
1912+ ("foo_internal__1_2_3_5", "1.2.3.5", 4242,
1913+ "forceddomain=example.com"),
1914+ ],
1915+ }
1916+ self.assertEqual(expected, hooks.get_relation_sites())
1917+
1918+ def test_sites_from_config_and_relation_with_domain(self):
1919+ self.relations_of_type.return_value = [
1920+ {"private-address": "1.2.3.4",
1921+ "__unit__": "foo/1",
1922+ "__relid__": "website:1",
1923+ "all_services": yaml.dump([
1924+ {"service_name": "foo.internal",
1925+ "service_port": 4242},
1926+ {"service_name": "bar.internal",
1927+ "service_port": 4243}
1928+ ]),
1929+ },
1930+ {"private-address": "1.2.3.5",
1931+ "__unit__": "foo/2",
1932+ "__relid__": "website:1",
1933+ "all_services": yaml.dump([
1934+ {"service_name": "foo.internal",
1935+ "service_port": 4242},
1936+ {"service_name": "bar.internal",
1937+ "service_port": 4243}
1938+ ]),
1939+ },
1940+ ]
1941+ self.config_get.return_value = Serializable({
1942+ "services": yaml.dump([
1943+ {"service_name": "foo.internal",
1944+ "server_options": ["forceddomain=example.com"],
1945+ "servers": [
1946+ ["1.2.4.4", 4242],
1947+ ["1.2.4.5", 4242],
1948+ ]
1949+ }
1950+ ])})
1951+ expected = {
1952+ "foo.internal": [
1953+ ("foo_internal__1_2_4_4", "1.2.4.4", 4242,
1954+ "forceddomain=example.com"),
1955+ ("foo_internal__1_2_4_5", "1.2.4.5", 4242,
1956+ "forceddomain=example.com"),
1957+ ("foo_internal__website_1__foo_1", "1.2.3.4", 4242,
1958+ "forceddomain=example.com"),
1959+ ("foo_internal__website_1__foo_2", "1.2.3.5", 4242,
1960+ "forceddomain=example.com"),
1961+ ],
1962+ "bar.internal": [
1963+ ("bar_internal__website_1__foo_1", "1.2.3.4", 4243, ''),
1964+ ("bar_internal__website_1__foo_2", "1.2.3.5", 4243, ''),
1965+ ],
1966+ }
1967+ self.assertEqual(expected, hooks.get_relation_sites())
1968+
1969+ def test_empty_sites_from_config_no_domain(self):
1970+ self.relations_of_type.return_value = []
1971+ self.config_get.return_value = Serializable({
1972+ "services": ""})
1973+ self.assertEqual(None, hooks.get_relation_sites())
1974
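
The shape that get_relation_sites() is expected to return, as these tests assert, is a mapping from sitename (or None) to a list of (peer_name, private-address, port, peer_options) tuples, with peer_name derived from the slugified sitename, relation id and unit name. An illustrative reconstruction of one 'all_services' relation entry (the names and helpers here are assumptions, not the hook code):

    import yaml

    relation = {
        "private-address": "1.2.3.4",
        "__unit__": "foo/1",
        "__relid__": "website:1",
        "all_services": yaml.dump(
            [{"service_name": "foo.internal", "service_port": 4242}]),
    }

    sites = {}
    unit = relation["__unit__"].replace("/", "_")
    relid = relation["__relid__"].replace(":", "_")
    for service in yaml.safe_load(relation["all_services"]):
        sitename = service["service_name"]
        peer_name = "%s__%s__%s" % (sitename.replace(".", "_"), relid, unit)
        sites.setdefault(sitename, []).append(
            (peer_name, relation["private-address"],
             service["service_port"], ""))

    # sites == {"foo.internal":
    #           [("foo_internal__website_1__foo_1", "1.2.3.4", 4242, "")]}
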
1975=== added directory 'lib'
1976=== modified file 'metadata.yaml'
1977--- metadata.yaml 2013-01-11 16:56:09 +0000
1978+++ metadata.yaml 2013-04-16 21:22:23 +0000
1979@@ -1,6 +1,6 @@
1980 name: squid-reverseproxy
1981 summary: Full featured Web Proxy cache (HTTP proxy)
1982-maintainer: "Matthew Wedgwood <matthew.wedgwood@canonical.com>"
1983+maintainer: [Matthew Wedgwood <matthew.wedgwood@canonical.com>, Alexander List <alexander.list@canonical.com>]
1984 description: >
1985 Squid is a high-performance proxy caching server for web clients,
1986 supporting FTP, gopher, and HTTP data objects. Squid version 3 is a
1987@@ -21,6 +21,12 @@
1988 nrpe-external-master:
1989 interface: nrpe-external-master
1990 scope: container
1991+ nrpe-external-master:
1992+ interface: nrpe-external-master
1993+ scope: container
1994+ local-monitors:
1995+ interface: local-monitors
1996+ scope: container
1997 requires:
1998 website:
1999 interface: http
2000
2001=== removed file 'revision'
2002--- revision 2013-02-15 07:04:29 +0000
2003+++ revision 1970-01-01 00:00:00 +0000
2004@@ -1,1 +0,0 @@
2005-0
2006
2007=== added file 'setup.cfg'
2008--- setup.cfg 1970-01-01 00:00:00 +0000
2009+++ setup.cfg 2013-04-16 21:22:23 +0000
2010@@ -0,0 +1,4 @@
2011+[nosetests]
2012+with-coverage=1
2013+cover-erase=1
2014+cover-package=hooks
2015\ No newline at end of file
2016
2017=== added file 'tarmac_tests.sh'
2018--- tarmac_tests.sh 1970-01-01 00:00:00 +0000
2019+++ tarmac_tests.sh 2013-04-16 21:22:23 +0000
2020@@ -0,0 +1,6 @@
2021+#!/bin/sh
2022+# How the tests are run in Jenkins by Tarmac
2023+
2024+set -e
2025+
2026+make build
2027
2028=== modified file 'templates/main_config.template'
2029--- templates/main_config.template 2013-01-11 16:50:54 +0000
2030+++ templates/main_config.template 2013-04-16 21:22:23 +0000
2031@@ -1,10 +1,11 @@
2032-http_port {{ config.port }} accel vport=443
2033+http_port {{ config.port }} {{ config.port_options }}
2034
2035 acl manager proto cache_object
2036 acl localhost src 127.0.0.1/32
2037 acl to_localhost dst 127.0.0.0/8
2038 acl PURGE method PURGE
2039 acl CONNECT method CONNECT
2040+acl SSL_ports port 443
2041
2042 {% if config.snmp_community -%}
2043 acl snmp_access snmp_community {{ config.snmp_community }}
2044@@ -37,16 +38,40 @@
2045 {% for rp in refresh_patterns.keys() -%}
2046 refresh_pattern {{ rp }} {{ refresh_patterns[rp]['min'] }} {{ refresh_patterns[rp]['percent'] }}% {{ refresh_patterns[rp]['max'] }}
2047 {% endfor -%}
2048-refresh_pattern . 30 20% 4320
2049-
2050-{% for relid in relations.keys() -%}
2051-{% for sitename in relations[relid].sitenames -%}
2052-acl {{ relations[relid].name }}_acl dstdomain {{ sitename }}
2053-{% else -%}
2054-acl {{ relations[relid].name }}_acl myport {{ config.port }}
2055-{% endfor -%}
2056-http_access allow accel_ports {{ relations[relid].name }}_acl
2057-{% endfor -%}
2058+refresh_pattern . {{default_refresh_pattern.min}} {{default_refresh_pattern.percent}}% {{default_refresh_pattern.max}}
2059+
2060+{% if sites -%}
2061+{% for sitename in sites.keys() -%}
2062+{% if sitename -%}
2063+acl s_{{ loop.index }}_acl dstdomain {{ sitename }}
2064+http_access allow accel_ports s_{{ loop.index }}_acl
2065+http_access allow CONNECT SSL_ports s_{{ loop.index }}_acl
2066+{% if sitename in only_direct -%}
2067+always_direct allow s_{{ loop.index }}_acl
2068+{% else -%}
2069+always_direct allow SSL_ports s_{{ loop.index }}_acl
2070+never_direct allow s_{{ loop.index }}_acl
2071+{% endif -%}
2072+
2073+{% if config['x_balancer_name_allowed'] -%}
2074+acl s_{{ loop.index }}_balancer req_header X-Balancer-Name {{ sitename.replace('.', '\.') }}
2075+http_access allow accel_ports s_{{ loop.index }}_balancer
2076+http_access allow CONNECT SSL_ports s_{{ loop.index }}_balancer
2077+{% if sitename in only_direct -%}
2078+always_direct allow s_{{ loop.index }}_balancer
2079+{% else -%}
2080+always_direct allow SSL_ports s_{{ loop.index }}_balancer
2081+never_direct allow s_{{ loop.index }}_balancer
2082+{% endif -%}
2083+{% endif -%}
2084+
2085+{% else -%}
2086+acl no_sitename_acl myport {{ config.port }}
2087+http_access allow accel_ports no_sitename_acl
2088+never_direct allow no_sitename_acl
2089+{% endif -%}
2090+{% endfor -%}
2091+{% endif -%}
2092
2093 http_access allow manager localhost
2094 http_access deny manager
2095@@ -57,11 +82,24 @@
2096 http_access deny accel_ports all
2097 http_access deny all
2098 icp_access deny all
2099+always_direct deny all
2100
2101-{% for relid in relations -%}
2102-{% if relations[relid].port -%}
2103-cache_peer {{ relations[relid]['private-address'] }} parent {{ relations[relid].port }} 0 name={{ relations[relid].name }} no-query no-digest originserver round-robin login=PASS
2104-cache_peer_access {{ relations[relid].name }} allow {{ relations[relid].name }}_acl
2105-cache_peer_access {{ relations[relid].name }} deny !{{ relations[relid].name }}_acl
2106-{% endif -%}
2107-{% endfor -%}
2108+{% if sites -%}
2109+{% for sitename in sites.keys() -%}
2110+{% set sites_loop = loop %}
2111+{% for peer in sites[sitename] -%}
2112+{% if sitename -%}
2113+cache_peer {{ peer.address }} parent {{ peer.port }} 0 name={{ peer.name }} no-query no-digest originserver round-robin login=PASS {{ peer.options }}
2114+cache_peer_access {{ peer.name }} allow s_{{ sites_loop.index }}_acl
2115+{% if config['x_balancer_name_allowed'] -%}
2116+cache_peer_access {{ peer.name }} allow s_{{ sites_loop.index }}_balancer
2117+{% endif -%}
2118+cache_peer_access {{ peer.name }} deny all
2119+{% else %}
2120+cache_peer {{ peer.address }} parent {{ peer.port }} 0 name={{ peer.name }} no-query no-digest originserver round-robin login=PASS
2121+cache_peer_access {{ peer.name }} allow no_sitename_acl
2122+cache_peer_access {{ peer.name }} deny all
2123+{% endif -%}
2124+{% endfor -%}
2125+{% endfor -%}
2126+{% endif -%}
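
To see how the per-site blocks above come together, here is a cut-down Jinja2 rendering of just the dstdomain ACL and cache_peer parts, with a 'sites' value shaped like the peers the hooks build. This is a sketch under those assumptions, not the charm's full template or context:

    from jinja2 import Template

    TEMPLATE = Template(
        "{% for sitename in sites.keys() %}"
        "{% set n = loop.index %}"
        "acl s_{{ n }}_acl dstdomain {{ sitename }}\n"
        "http_access allow accel_ports s_{{ n }}_acl\n"
        "{% for peer in sites[sitename] %}"
        "cache_peer {{ peer.address }} parent {{ peer.port }} 0"
        " name={{ peer.name }} originserver\n"
        "cache_peer_access {{ peer.name }} allow s_{{ n }}_acl\n"
        "cache_peer_access {{ peer.name }} deny all\n"
        "{% endfor %}{% endfor %}")

    sites = {"foo.com": [{"name": "foo_com__website_1__foo_1",
                          "address": "1.2.3.4", "port": 4242}]}
    print(TEMPLATE.render(sites=sites))
    # acl s_1_acl dstdomain foo.com
    # http_access allow accel_ports s_1_acl
    # cache_peer 1.2.3.4 parent 4242 0 name=foo_com__website_1__foo_1 originserver
    # cache_peer_access foo_com__website_1__foo_1 allow s_1_acl
    # cache_peer_access foo_com__website_1__foo_1 deny all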
