Merge lp:~bcsaller/charm-tools/composer into lp:charm-tools/1.6

Proposed by Benjamin Saller
Status: Merged
Merged at revision: 359
Proposed branch: lp:~bcsaller/charm-tools/composer
Merge into: lp:charm-tools/1.6
Diff against target: 12668 lines (+11533/-368)
117 files modified
.bzrignore (+3/-1)
MANIFEST.in (+1/-0)
Makefile (+14/-37)
charmtools/compose/__init__.py (+466/-0)
charmtools/compose/config.py (+113/-0)
charmtools/compose/diff_match_patch.py (+1919/-0)
charmtools/compose/fetchers.py (+117/-0)
charmtools/compose/inspector.py (+101/-0)
charmtools/compose/tactics.py (+449/-0)
charmtools/utils.py (+518/-0)
doc/source/compose-intro.md (+18/-0)
doc/source/composer.md (+123/-0)
ez_setup.py (+0/-272)
helpers/python/charmhelpers/tests/test_charmhelpers.py (+1/-1)
requirements.txt (+18/-13)
scripts/packages.sh (+19/-0)
scripts/test (+0/-7)
setup.cfg (+8/-0)
setup.py (+16/-16)
tests/interfaces/mysql/interface.yaml (+1/-0)
tests/interfaces/mysql/provides.py (+1/-0)
tests/interfaces/mysql/requires.py (+1/-0)
tests/test_charm_generate.py (+0/-7)
tests/test_compose.py (+239/-0)
tests/test_config.py (+29/-0)
tests/test_juju_test.py (+0/-9)
tests/test_utils.py (+43/-0)
tests/trusty/a/README.md (+1/-0)
tests/trusty/a/a (+1/-0)
tests/trusty/b/README.md (+1/-0)
tests/trusty/b/composer.yaml (+1/-0)
tests/trusty/b/metadata.yaml (+11/-0)
tests/trusty/c-reactive/README.md (+1/-0)
tests/trusty/c-reactive/composer.yaml (+1/-0)
tests/trusty/c-reactive/hooks/reactive/main.py (+6/-0)
tests/trusty/c/README.md (+1/-0)
tests/trusty/c/composer.yaml (+1/-0)
tests/trusty/c/metadata.yaml (+11/-0)
tests/trusty/chlayer/hooks/charmhelpers.pypi (+1/-0)
tests/trusty/mysql/.bzrignore (+2/-0)
tests/trusty/mysql/Makefile (+24/-0)
tests/trusty/mysql/README.md (+133/-0)
tests/trusty/mysql/charm-helpers.yaml (+9/-0)
tests/trusty/mysql/config.yaml (+141/-0)
tests/trusty/mysql/copyright (+17/-0)
tests/trusty/mysql/hooks/charmhelpers/__init__.py (+38/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/__init__.py (+15/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/charmsupport/nrpe.py (+219/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/charmsupport/volumes.py (+156/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/database/mysql.py (+385/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/network/__init__.py (+15/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/network/ip.py (+450/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/peerstorage/__init__.py (+148/-0)
tests/trusty/mysql/hooks/charmhelpers/core/__init__.py (+15/-0)
tests/trusty/mysql/hooks/charmhelpers/core/decorators.py (+41/-0)
tests/trusty/mysql/hooks/charmhelpers/core/fstab.py (+134/-0)
tests/trusty/mysql/hooks/charmhelpers/core/hookenv.py (+568/-0)
tests/trusty/mysql/hooks/charmhelpers/core/host.py (+446/-0)
tests/trusty/mysql/hooks/charmhelpers/core/services/__init__.py (+18/-0)
tests/trusty/mysql/hooks/charmhelpers/core/services/base.py (+329/-0)
tests/trusty/mysql/hooks/charmhelpers/core/services/helpers.py (+267/-0)
tests/trusty/mysql/hooks/charmhelpers/core/strutils.py (+42/-0)
tests/trusty/mysql/hooks/charmhelpers/core/sysctl.py (+56/-0)
tests/trusty/mysql/hooks/charmhelpers/core/templating.py (+69/-0)
tests/trusty/mysql/hooks/charmhelpers/core/unitdata.py (+477/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/__init__.py (+439/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/archiveurl.py (+161/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/bzrurl.py (+78/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/giturl.py (+71/-0)
tests/trusty/mysql/hooks/common.py (+109/-0)
tests/trusty/mysql/hooks/config-changed (+414/-0)
tests/trusty/mysql/hooks/data-relation.py (+31/-0)
tests/trusty/mysql/hooks/db-relation-broken (+21/-0)
tests/trusty/mysql/hooks/db-relation-joined (+87/-0)
tests/trusty/mysql/hooks/ha_relations.py (+163/-0)
tests/trusty/mysql/hooks/install (+49/-0)
tests/trusty/mysql/hooks/master-relation-changed (+95/-0)
tests/trusty/mysql/hooks/monitors-relation-broken (+8/-0)
tests/trusty/mysql/hooks/monitors-relation-departed (+3/-0)
tests/trusty/mysql/hooks/monitors-relation-joined (+9/-0)
tests/trusty/mysql/hooks/monitors.common.bash (+8/-0)
tests/trusty/mysql/hooks/munin-relation-changed (+26/-0)
tests/trusty/mysql/hooks/munin-relation-joined (+6/-0)
tests/trusty/mysql/hooks/nrpe_relations.py (+91/-0)
tests/trusty/mysql/hooks/shared_db_relations.py (+153/-0)
tests/trusty/mysql/hooks/slave-relation-broken (+11/-0)
tests/trusty/mysql/hooks/slave-relation-changed (+89/-0)
tests/trusty/mysql/hooks/slave-relation-joined (+2/-0)
tests/trusty/mysql/hooks/start (+5/-0)
tests/trusty/mysql/hooks/stop (+3/-0)
tests/trusty/mysql/hooks/upgrade-charm (+27/-0)
tests/trusty/mysql/icon.svg (+335/-0)
tests/trusty/mysql/keys/repo.percona.com (+30/-0)
tests/trusty/mysql/metadata.yaml (+43/-0)
tests/trusty/mysql/monitors.yaml (+13/-0)
tests/trusty/mysql/revision (+1/-0)
tests/trusty/mysql/scripts/add_to_cluster (+13/-0)
tests/trusty/mysql/scripts/charm_helpers_sync.py (+225/-0)
tests/trusty/mysql/scripts/mysql_backup.sh (+30/-0)
tests/trusty/mysql/scripts/remove_from_cluster (+4/-0)
tests/trusty/mysql/templates/apparmor.j2 (+15/-0)
tests/trusty/mysql/templates/mysql_backup.j2 (+12/-0)
tests/trusty/mysql/tests/00-setup (+12/-0)
tests/trusty/mysql/tests/15-configs (+77/-0)
tests/trusty/mysql/unit_tests/test_mysql_common.py (+18/-0)
tests/trusty/tester/README.md (+1/-0)
tests/trusty/tester/composer.yaml (+8/-0)
tests/trusty/tester/generate/custom.py (+17/-0)
tests/trusty/tester/hooks/start (+1/-0)
tests/trusty/tester/metadata.yaml (+14/-0)
tests/trusty/use-layers/README.md (+1/-0)
tests/trusty/use-layers/composer.yaml (+1/-0)
tests/trusty/use-layers/hooks/reactive/main.py (+6/-0)
tests_functional/add/test.sh (+2/-2)
tests_functional/create/test.sh (+4/-2)
tests_functional/proof/record.sh (+1/-1)
tox.ini (+21/-0)
To merge this branch: bzr merge lp:~bcsaller/charm-tools/composer
Reviewer Review Type Date Requested Status
Tim Van Steenburgh (community) Approve
Cory Johns (community) Needs Fixing
Marco Ceppi Pending
Review via email: mp+266281@code.launchpad.net

Description of the change

This adds the charm composer functionality
and ports the project to use tox

Revision history for this message
Cory Johns (johnsca) wrote :

Need to add blessings, ruamel.yaml, pathspec, and bundletester to the install_requires in setup.py

Cory Johns (johnsca) :
review: Needs Fixing
Adam Israel (aisrael) wrote :

Hi Ben,

Per our earlier conversation, I'd also like to see some documentation on how a charm author would approach and use composer. I'm very excited to see that myself, and to give composer a spin.

lp:~bcsaller/charm-tools/composer updated
358. By Benjamin Saller

merge lp:~johnsca/charm-tools/compose

359. By Benjamin Saller

various composer fixes

360. By Benjamin Saller

without the pdb

361. By Benjamin Saller

show paths and so on with -l DEBUG, tests do less actual remote work

362. By Benjamin Saller

compose cli help a little better

363. By Benjamin Saller

fix bzrignore

364. By Benjamin Saller

fix typo

Charles Butler (lazypower) wrote :

Sorry this took me so long to circle back, but I've tried this branch of charm tools and it appears the manifest is not fetching all the dependencies.

What I did:

bzr branch lp:~bcsaller/charm-tools/composer/
virtualenv .venv
source .venv/bin/activate

pip install ./

charm compose -h
ImportError: No module named path

pip install path.py
charm compose -h
ImportError: No module named otherstuf

pip install otherstuf
charm compose -h

OSError: [Errno 2] No such file or directory: '/home/charles/projects/work/composer/.venv/local/lib/python2.7/site-packages/charmtools/compose/../../doc/source/compose-intro.md'

Once I ran through that dependency hoop, compose appears to be sorted and available.

Cory Johns (johnsca) wrote :

Fix for the missing deps and help error here: https://code.launchpad.net/~johnsca/charm-tools/compose/+merge/268164

Cory Johns (johnsca) wrote :

Updated my most recent MP to fix the default value for --name when composing current dir (e.g., "charm compose ." or just "charm compose")
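
The defaulting behaviour being fixed can be sketched as follows. This is a hedged illustration of the intended precedence, not the branch's actual code; the helper name and signature are hypothetical:

```python
import os

def default_charm_name(charm_dir, explicit_name=None, metadata_name=None):
    """Illustrative precedence for the composed charm's name:
    -n/--name on the command line first, then the name field from
    metadata.yaml, then the charm directory's basename (so that
    `charm compose .` names the charm after the current directory)."""
    if explicit_name:
        return explicit_name
    if metadata_name:
        return metadata_name
    # Resolve "." (and relative paths) before taking the basename
    return os.path.basename(os.path.abspath(charm_dir))
```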

lp:~bcsaller/charm-tools/composer updated
365. By Benjamin Saller

fix deps in setup, better searching for charm.name

366. By Benjamin Saller

change default interface address to public one

367. By Benjamin Saller

update to work with real DNS and service

368. By Benjamin Saller

remove find name call

369. By Benjamin Saller

various fixes around naming and patch to install to ignore existing deps, also depend on modern pip

370. By Benjamin Saller

repair tests

371. By Benjamin Saller

fix/remove broken tests w/updated bundletester

372. By Benjamin Saller

patch for inspect to work with more varied naming

373. By Benjamin Saller

force key order on metadata.yaml merges

374. By Benjamin Saller

fix tests (to reflect remote changes to basic layer) and ordered metadata rendering

375. By Benjamin Saller

update notes on workflow

376. By Benjamin Saller

various fixes and cleanups. Change installer to do better signing

377. By Benjamin Saller

merge trunk

Tim Van Steenburgh (tvansteenburgh) wrote :

I'm eager to merge this but can't get the tests to run. I always run `make check` on charm-tools since that includes the integration tests as well. According to the Makefile that should still work, but I get:

tvansteenburgh@trusty-vm:/tmp/charm-tools> make clean
find . -name '*.py[co]' -delete
find . -type f -name '*~' -delete
find . -name '*.bak' -delete
rm -rf bin include lib local man dependencies
tvansteenburgh@trusty-vm:/tmp/charm-tools> make check
bzr checkout lp:~juju-jitsu/charm-tools/dependencies
tox --develop
py27 create: /tmp/charm-tools/.tox/py27
py27 installdeps: -r/tmp/charm-tools/requirements.txt
ERROR: invocation failed, logfile: /tmp/charm-tools/.tox/py27/log/py27-1.log
ERROR: actionid=py27
msg=getenv
cmdargs=[local('/tmp/charm-tools/.tox/py27/bin/pip'), 'install', '--no-index', '-f', 'dependencies/python', '-r/tmp/charm-tools/requirements.txt']
env={'BYOBU_TTY': '/dev/pts/1', 'UPSTART_EVENTS': 'started starting', 'SHELL': '/bin/bash', ...}


review: Needs Fixing
Tim Van Steenburgh (tvansteenburgh) wrote :

I got integration tests to run by changing this in the Makefile:

-build: deps develop
+build: deps

But then I got a bunch of legit failures from the tests:

2 tvansteenburgh@trusty-vm:/tmp/charm-tools> make check
tests_functional/helpers/helpers.sh || sh -x tests_functional/helpers/helpers.sh timeout
Fri Aug 28 11:04:24 EDT 2015: Testing ch_apparmor_load...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing ch_type_hash...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing ch_is_url...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing Testing ch_is_ip...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing Testing ch_get_ip...PASS
Fri Aug 28 11:04:24 EDT 2015: Starting SimpleHTTPServer in /tmp/charm-helper-srv.BlmbCS on port 8999 to test fetching files.
Fri Aug 28 11:04:24 EDT 2015: Looping wget until webserver responds...
Fri Aug 28 11:04:25 EDT 2015: Attempt 1 succeeded.
Fri Aug 28 11:04:25 EDT 2015: Creating temp data file
Fri Aug 28 11:04:25 EDT 2015: creating gzipped test data
Fri Aug 28 11:04:25 EDT 2015: Testing ch_get_file...PASS
Fri Aug 28 11:04:25 EDT 2015: Shutting down webserver...DONE
Fri Aug 28 11:04:25 EDT 2015: Printing server log
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
Fri Aug 28 11:04:25 EDT 2015: Testing ch_unit_name...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_unit_id...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_my_unit_id...PASS
Test shell helpers with bash
bash tests_functional/helpers/helpers.sh \
            || bash -x tests_functional/helpers/helpers.sh timeout
Fri Aug 28 11:04:25 EDT 2015: Testing ch_apparmor_load...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_type_hash...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_is_url...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing Testing ch_is_ip...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing Testing ch_get_ip...PASS
Fri Aug 28 11:04:25 EDT 2015: Starting SimpleHTTPServer in /tmp/charm-helper-srv.TOq187 on port 8999 to test fetching files.
Fri Aug 28 11:04:25 EDT 2015: Looping wget until webserver responds...
Fri Aug 28 11:04:26 EDT 2015: Attempt 1 succeeded.
Fri Aug 28 11:04:26 EDT 2015: Creating temp data file
Fri Aug 28 11:04:26 EDT 2015: creating gzipped test data
Fri Aug 28 11:04:26 EDT 2015: Testing ch_get_file...PASS
Fri Aug 28 11:04:26 EDT 2015: Shutting down webserver...DONE
Fri Aug 28 11:04:26 EDT 2015: Printing server log
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0...

review: Needs Fixing
lp:~bcsaller/charm-tools/composer updated
378. By Benjamin Saller

merge trunk

379. By Benjamin Saller

path fixes on integration tests

Benjamin Saller (bcsaller) wrote :

Should be fixed; these were very minor pathing issues in the tests, which now point at the tox virtualenv.

lp:~bcsaller/charm-tools/composer updated
380. By Benjamin Saller

notest on develop

381. By Benjamin Saller

pass term env

382. By Benjamin Saller

move passenv

Tim Van Steenburgh (tvansteenburgh) wrote :

Tests pass after upgrading tox to 2.1.1 (1.6.0 failed), LGTM.

review: Approve
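
For reference, a minimal tox.ini consistent with the logs above (a py27 env, requirements.txt installed from the local dependencies/ mirror with --no-index, TERM passed through per the "pass term env" revision) might look like the sketch below. The actual 21-line file added by the branch is not shown in this excerpt, and the test command here is an assumption:

```ini
[tox]
envlist = py27

[testenv]
usedevelop = True
passenv = TERM
deps = -r{toxinidir}/requirements.txt
install_command = pip install --no-index -f dependencies/python {opts} {packages}
commands = nosetests charmtools
```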

Preview Diff

1=== modified file '.bzrignore'
2--- .bzrignore 2014-01-14 03:23:17 +0000
3+++ .bzrignore 2015-08-31 19:32:56 +0000
4@@ -1,4 +1,4 @@
5-tests/proof/results/*
6+tests_functional/proof/results/*
7 .coverage
8 .noseids
9 charm_tools.egg-info
10@@ -13,3 +13,5 @@
11 dist
12 tests/.ropeproject/
13 charmtools/.ropeproject/
14+.ropeproject/
15+.tox/
16
17=== modified file 'MANIFEST.in'
18--- MANIFEST.in 2014-06-19 23:18:53 +0000
19+++ MANIFEST.in 2015-08-31 19:32:56 +0000
20@@ -1,3 +1,4 @@
21 include *.py README*
22+include doc/source/composer.md
23 recursive-include charmtools *
24 recursive-exclude charmtools *.pyc
25
26=== modified file 'Makefile'
27--- Makefile 2014-06-20 19:46:59 +0000
28+++ Makefile 2015-08-31 19:32:56 +0000
29@@ -17,44 +17,23 @@
30 confdir = $(DESTDIR)/etc
31 INSTALL = install
32
33-# We use a "canary" file to tell us if the package has been installed in
34-# "develop" mode.
35-DEVELOP_CANARY := lib/__develop_canary
36-develop: $(DEVELOP_CANARY)
37-$(DEVELOP_CANARY): | python-deps
38- bin/python setup.py develop
39- touch $(DEVELOP_CANARY)
40+develop:
41+ tox --develop --notest
42
43-build: deps develop bin/test
44+build: deps develop
45
46 dependencies:
47 bzr checkout lp:~juju-jitsu/charm-tools/dependencies
48
49-# We use a "canary" file to tell us if the Python packages have been installed.
50-PYTHON_PACKAGE_CANARY := lib/python2.7/site-packages/___canary
51-python-deps: $(PYTHON_PACKAGE_CANARY)
52-$(PYTHON_PACKAGE_CANARY): requirements.txt | dependencies
53- sudo apt-get update
54- sudo apt-get install -y build-essential bzr python-dev \
55- python-virtualenv
56- virtualenv .
57- bin/pip install --no-index --no-dependencies --find-links \
58- file:///$(WD)/dependencies/python -r requirements.txt
59- touch $(PYTHON_PACKAGE_CANARY)
60+PYTHON_DEPS=build-essential bzr python-dev python-tox
61+python-deps: scripts/packages.sh
62+ $(if $(shell ./scripts/packages.sh $(PYTHON_DEPS)), \
63+ tox -r --notest)
64
65 deps: python-deps | dependencies
66
67-bin/nosetests: python-deps
68-
69-bin/test: | bin/nosetests
70- ln scripts/test bin/test
71-
72-test: build bin/test
73- bin/test
74-
75-lint: sources = setup.py charmtools
76-lint: build
77- @find $(sources) -name '*.py' -print0 | xargs -r0 bin/flake8
78+test:
79+ tox
80
81 tags:
82 ctags --tag-relative --python-kinds=-iv -Rf tags --sort=yes \
83@@ -93,12 +72,11 @@
84 tests_functional/proof/test.sh
85 tests_functional/create/test.sh
86 tests_functional/add/test.sh
87-# PYTHONPATH=helpers/python python helpers/python/charmhelpers/tests/test_charmhelpers.py
88-
89-coverage: build bin/test
90- bin/test --with-coverage --cover-package=charmtools --cover-tests
91-
92-check: build integration test lint
93+
94+coverage: build
95+ tox
96+
97+check: build integration test
98
99 define phony
100 build
101@@ -106,7 +84,6 @@
102 clean
103 deps
104 install
105- lint
106 tags
107 test
108 endef
109
110=== added directory 'charmtools/compose'
111=== added file 'charmtools/compose/__init__.py'
112--- charmtools/compose/__init__.py 1970-01-01 00:00:00 +0000
113+++ charmtools/compose/__init__.py 2015-08-31 19:32:56 +0000
114@@ -0,0 +1,466 @@
115+#!/usr/bin/env python
116+# -*- coding: utf-8 -*-
117+import argparse
118+import json
119+import logging
120+import os
121+import sys
122+
123+import blessings
124+from collections import OrderedDict
125+from path import path
126+import yaml
127+from charmtools.compose import inspector
128+import charmtools.compose.tactics
129+from charmtools.compose.config import (ComposerConfig, DEFAULT_IGNORES)
130+from charmtools.compose.fetchers import (InterfaceFetcher,
131+ LayerFetcher,
132+ get_fetcher,
133+ FetchError)
134+from charmtools import utils
135+
136+log = logging.getLogger("composer")
137+
138+
139+class Configable(object):
140+ CONFIG_FILE = None
141+
142+ def __init__(self):
143+ self._config = ComposerConfig()
144+ self.config_file = None
145+
146+ @property
147+ def config(self):
148+ if self._config.configured:
149+ return self._config
150+ if self.config_file and self.config_file.exists():
151+ self._config.configure(self.config_file)
152+ return self._config
153+
154+ @property
155+ def configured(self):
156+ return bool(self.config is not None and self.config.configured)
157+
158+
159+class Fetched(Configable):
160+ def __init__(self, url, target_repo, name=None):
161+ super(Fetched, self).__init__()
162+ self.url = url
163+ self.target_repo = target_repo
164+ self.directory = None
165+ self._name = name
166+
167+ @property
168+ def name(self):
169+ if self._name:
170+ return self._name
171+ if self.url.startswith(self.NAMESPACE):
172+ return self.url[len(self.NAMESPACE):]
173+ return self.url
174+
175+ def __repr__(self):
176+ return "<{} {}:{}>".format(self.__class__.__name__,
177+ self.url, self.directory)
178+
179+ def __div__(self, other):
180+ return self.directory / other
181+
182+ def fetch(self):
183+ try:
184+ fetcher = get_fetcher(self.url)
185+ except FetchError:
186+ # We might be passing a local dir path directly
187+ # which fetchers don't currently support
188+ self.directory = path(self.url)
189+ else:
190+ if hasattr(fetcher, "path") and fetcher.path.exists():
191+ self.directory = path(fetcher.path)
192+ else:
193+ if not self.target_repo.exists():
194+ self.target_repo.makedirs_p()
195+ self.directory = path(fetcher.fetch(self.target_repo))
196+
197+ if not self.directory.exists():
198+ raise OSError(
199+ "Unable to locate {}. "
200+ "Do you need to set {}?".format(
201+ self.url, self.ENVIRON))
202+
203+ self.config_file = self.directory / self.CONFIG_FILE
204+ self._name = self.config.name
205+ return self
206+
207+
208+class Interface(Fetched):
209+ CONFIG_FILE = "interface.yaml"
210+ NAMESPACE = "interface"
211+ ENVIRON = "INTERFACE_PATH"
212+
213+
214+class Layer(Fetched):
215+ CONFIG_FILE = "composer.yaml"
216+ NAMESPACE = "layer"
217+ ENVIRON = "COMPOSER_PATH"
218+
219+
220+class Composer(object):
221+ """
222+ Handle the processing of overrides, implements the policy of ComposerConfig
223+ """
224+ PHASES = ['lint', 'read', 'call', 'sign', 'build']
225+
226+ def __init__(self):
227+ self.config = ComposerConfig()
228+ self.force = False
229+ self._name = None
230+ self._charm = None
231+
232+ @property
233+ def charm(self):
234+ return self._charm
235+
236+ @charm.setter
237+ def charm(self, value):
238+ self._charm = path(value)
239+
240+ @property
241+ def name(self):
242+ if self._name:
243+ return self._name
244+
245+ # optionally extract name from the top layer
246+ self._name = str(path(self.charm).abspath().name)
247+
248+ # however if the current layer has a metadata.yaml we can
249+ # use its name
250+ md = path(self.charm) / "metadata.yaml"
251+ if md.exists():
252+ data = yaml.load(md.open())
253+ name = data.get("name")
254+ if name:
255+ self._name = name
256+ return self._name
257+
258+ @name.setter
259+ def name(self, value):
260+ self._name = value
261+
262+ @property
263+ def charm_name(self):
264+ return "{}/{}".format(self.series, self.name)
265+
266+ def status(self):
267+ result = {}
268+ result.update(vars(self))
269+ for e in ["COMPOSER_PATH", "INTERFACE_PATH", "JUJU_REPOSITORY"]:
270+ result[e] = os.environ.get(e)
271+ return result
272+
273+ def create_repo(self):
274+ # Generated output will go into this directory
275+ base = path(self.output_dir)
276+ self.repo = (base / self.series)
277+ # And anything it includes from will be placed here
278+ # outside the series
279+ self.deps = (base / "deps" / self.series)
280+ self.target_dir = (self.repo / self.name)
281+
282+ def find_or_create_repo(self, allow_create=True):
283+ # see if output dir is already in a repo, we can use that directly
284+ if self.output_dir == path(self.charm).normpath():
285+ # we've indicated in the cmdline that we are doing an inplace
286+ # update
287+ if self.output_dir.parent.basename() == self.series:
288+ # we're already in a repo
289+ self.repo = self.output_dir.parent.parent
290+ self.deps = (self.repo / "deps" / self.series)
291+ self.target_dir = self.output_dir
292+ return
293+ if allow_create:
294+ self.create_repo()
295+ else:
296+ raise ValueError("%s doesn't seem valid", self.charm.directory)
297+
298+ def fetch(self):
299+ layer = Layer(self.charm, self.deps).fetch()
300+ if not layer.configured:
301+ log.info("The top level layer expects a "
302+ "valid composer.yaml file, "
303+ "using defaults.")
304+ # Manually create a layer object for the output
305+ self.target = Layer(self.name, self.repo)
306+ self.target.directory = self.target_dir
307+ return self.fetch_deps(layer)
308+
309+ def fetch_deps(self, layer):
310+ results = {"layers": [], "interfaces": []}
311+ self.fetch_dep(layer, results)
312+ # results should now be a bottom up list
313+ # of deps. Using the in order results traversal
314+ # we can build out our plan for each file in the
315+ # output layer
316+ results["layers"].append(layer)
317+ self._layers = results["layers"]
318+ self._interfaces = results["interfaces"]
319+ return results
320+
321+ @property
322+ def layers(self):
323+ layers = []
324+ for i in self._layers:
325+ layers.append(i.url)
326+ for i in self._interfaces:
327+ layers.append(i.url)
328+ layers.append("composer")
329+ return layers
330+
331+ def fetch_dep(self, layer, results):
332+ # Recursively fetch and scan layers
333+ # This returns a plan for each file in the result
334+ baselayers = layer.config.get('includes', [])
335+ if not baselayers:
336+ # no deps, this is possible for any base
337+ # but questionable for the target
338+ return
339+
340+ if isinstance(baselayers, str):
341+ baselayers = [baselayers]
342+
343+ for base in baselayers:
344+ if base.startswith("interface:"):
345+ iface = Interface(base, self.deps).fetch()
346+ results["interfaces"].append(iface)
347+ else:
348+ base_layer = Layer(base, self.deps).fetch()
349+ self.fetch_dep(base_layer, results)
350+ results["layers"].append(base_layer)
351+
352+ def build_tactics(self, entry, current, config, output_files):
+ # Delegate to the config object; its rules
+ # will produce a tactic
355+ relname = entry.relpath(current.directory)
356+ current = current.config.tactic(entry, current, self.target, config)
357+ existing = output_files.get(relname)
358+ if existing is not None:
359+ tactic = current.combine(existing)
360+ else:
361+ tactic = current
362+ output_files[relname] = tactic
363+
364+ def plan_layers(self, layers, output_files):
365+ config = ComposerConfig()
366+ config = config.add_config(
367+ layers["layers"][0] / ComposerConfig.DEFAULT_FILE, True)
368+
369+ layers["layers"][-1].url = self.name
370+
371+ for i, layer in enumerate(layers["layers"]):
372+ log.info("Processing layer: %s", layer.url)
373+ if i + 1 < len(layers["layers"]):
374+ next_layer = layers["layers"][i + 1]
375+ config = config.add_config(
376+ next_layer / ComposerConfig.DEFAULT_FILE, True)
377+ list(e for e in utils.walk(layer.directory,
378+ self.build_tactics,
379+ current=layer,
380+ config=config,
381+ output_files=output_files))
382+ plan = [t for t in output_files.values() if t]
383+ return plan
384+
385+ def plan_interfaces(self, layers, output_files, plan):
386+ # Interface includes don't directly map to output files
387+ # as they are computed in combination with the metadata.yaml
388+ charm_meta = output_files.get("metadata.yaml")
389+ if charm_meta:
390+ meta = charm_meta()
391+ if not meta:
392+ return
393+ target_config = layers["layers"][-1].config
394+ specs = []
395+ used_interfaces = set()
396+ for kind in ("provides", "requires", "peer"):
397+ for k, v in meta.get(kind, {}).items():
398+ # ex: ["provides", "db", "mysql"]
399+ specs.append([kind, k, v["interface"]])
400+ used_interfaces.add(v["interface"])
401+
402+ for iface in layers["interfaces"]:
403+ if iface.name not in used_interfaces:
404+ # we shouldn't include something the charm doesn't use
405+ log.warn("composer.yaml includes {} which isn't "
406+ "used in metadata.yaml".format(
407+ iface.name))
408+ continue
409+ for kind, relation_name, interface_name in specs:
410+ if interface_name != iface.name:
411+ continue
412+ # COPY phase
413+ plan.append(
414+ charmtools.compose.tactics.InterfaceCopy(
415+ iface, relation_name,
416+ self.target, target_config)
417+ )
418+ # Link Phase
419+ plan.append(
420+ charmtools.compose.tactics.InterfaceBind(
421+ iface, relation_name, kind,
422+ self.target, target_config))
423+ elif not charm_meta and layers["interfaces"]:
424+ raise ValueError(
425+ "Includes interfaces but no metadata.yaml to bind them")
426+
427+ def formulate_plan(self, layers):
428+ """Build out a plan for each file in the various composed
429+ layers, taking into account config at each layer"""
430+ output_files = OrderedDict()
431+ self.plan = self.plan_layers(layers, output_files)
432+ self.plan_interfaces(layers, output_files, self.plan)
433+ return self.plan
434+
435+ def exec_plan(self, plan=None, layers=None):
436+ signatures = {}
437+ cont = True
438+ for phase in self.PHASES:
439+ for tactic in plan:
440+ if phase == "lint":
441+ cont &= tactic.lint()
442+ if cont is False and self.force is not True:
443+ break
444+ elif phase == "read":
+ # We use a read-into-memory phase to make
+ # layer composition simpler
447+ tactic.read()
448+ elif phase == "call":
449+ tactic()
450+ elif phase == "sign":
451+ sig = tactic.sign()
452+ if sig:
453+ signatures.update(sig)
454+ elif phase == "build":
455+ tactic.build()
456+ # write out the sigs
457+ if "sign" in self.PHASES:
458+ self.write_signatures(signatures, layers)
459+
460+ def write_signatures(self, signatures, layers):
461+ sigs = self.target / ".composer.manifest"
462+ signatures['.composer.manifest'] = ["composer", 'dynamic', 'unchecked']
463+ sigs.write_text(json.dumps(dict(
464+ signatures=signatures,
465+ layers=layers,
466+ ), indent=2))
467+
468+ def generate(self):
469+ layers = self.fetch()
470+ self.formulate_plan(layers)
471+ self.exec_plan(self.plan, self.layers)
472+
473+ def validate(self):
474+ p = self.target_dir / ".composer.manifest"
475+ if not p.exists():
476+ return [], [], []
477+ ignorer = utils.ignore_matcher(DEFAULT_IGNORES)
478+ a, c, d = utils.delta_signatures(p, ignorer)
479+
480+ for f in a:
481+ log.warn(
482+ "Added unexpected file, should be in a base layer: %s", f)
483+ for f in c:
484+ log.warn(
485+ "Changed file owned by another layer: %s", f)
486+ for f in d:
487+ log.warn(
488+ "Deleted a file owned by another layer: %s", f)
489+ if a or c or d:
490+ if self.force is True:
491+ log.info(
492+ "Continuing with known changes to target layer. "
493+ "Changes will be overwritten")
494+ else:
495+ raise ValueError(
496+ "Unable to continue due to unexpected modifications")
497+ return a, c, d
498+
499+ def __call__(self):
500+ self.find_or_create_repo()
501+
502+ log.debug(json.dumps(
503+ self.status(), indent=2, sort_keys=True, default=str))
504+ self.validate()
505+ self.generate()
506+
507+ def inspect(self):
508+ self.charm = path(self.charm).abspath()
509+ inspector.inspect(self.charm)
510+
511+ def normalize_outputdir(self):
512+ od = path(self.charm).normpath()
513+ repo = os.environ.get('JUJU_REPOSITORY')
514+ if repo:
515+ repo = path(repo)
516+ if repo.exists():
517+ od = repo
518+ elif ":" in od:
519+ od = od.basename
520+ log.info("Composing into {}".format(od))
521+ self.output_dir = od
522+
523+
524+def configLogging(composer):
525+ global log
526+ clifmt = utils.ColoredFormatter(
527+ blessings.Terminal(),
528+ '%(name)s: %(message)s')
529+ root_logger = logging.getLogger()
530+ clihandler = logging.StreamHandler(sys.stdout)
531+ clihandler.setFormatter(clifmt)
532+ if isinstance(composer.log_level, str):
533+ composer.log_level = composer.log_level.upper()
534+ root_logger.setLevel(composer.log_level)
535+ log.setLevel(composer.log_level)
536+ root_logger.addHandler(clihandler)
537+ requests_logger = logging.getLogger("requests")
538+ requests_logger.setLevel(logging.WARNING)
539+
540+
541+def inspect(args=None):
542+ composer = Composer()
543+ parser = argparse.ArgumentParser()
544+ parser.add_argument('-l', '--log-level', default=logging.INFO)
545+ parser.add_argument('charm', nargs="?", default=".", type=path)
546+ # Namespace will set the options as attrs of composer
547+ parser.parse_args(args, namespace=composer)
548+ configLogging(composer)
549+ composer.inspect()
550+
551+
552+def main(args=None):
553+ composer = Composer()
554+ parser = argparse.ArgumentParser(
555+ description="Compose layers into a charm",
556+ formatter_class=argparse.RawDescriptionHelpFormatter,)
557+ parser.add_argument('-l', '--log-level', default=logging.INFO)
558+ parser.add_argument('-f', '--force', action="store_true")
559+ parser.add_argument('-o', '--output-dir', type=path)
560+ parser.add_argument('-s', '--series', default="trusty")
561+ parser.add_argument('--interface-service',
562+ default="http://interfaces.juju.solutions")
563+ parser.add_argument('-n', '--name',
564+ help="Generate a charm of 'name' from 'charm'")
565+ parser.add_argument('charm', nargs="?", default=".", type=path)
566+ # Namespace will set the options as attrs of composer
567+ parser.parse_args(args, namespace=composer)
568+ # Monkey patch in the domain for the interface webservice
569+ InterfaceFetcher.INTERFACE_DOMAIN = composer.interface_service
570+ LayerFetcher.INTERFACE_DOMAIN = composer.interface_service
571+ configLogging(composer)
572+
573+ if not composer.output_dir:
574+ composer.normalize_outputdir()
575+
576+ composer()
577+
578+
579+if __name__ == '__main__':
580+ main()
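Both `inspect()` and `main()` above rely on argparse's `namespace=` argument to write parsed options directly onto the `Composer` instance instead of a fresh `Namespace`. A minimal standalone sketch of the same pattern (the `Config` class and option names here are illustrative, not part of the branch):

```python
import argparse

class Config(object):
    """Plain object that argparse will populate with parsed options."""
    log_level = "INFO"  # default used when the flag is absent

def parse_into(args):
    config = Config()
    parser = argparse.ArgumentParser()
    parser.add_argument('-l', '--log-level', default="INFO")
    parser.add_argument('charm', nargs="?", default=".")
    # Passing namespace= makes argparse set attributes on config
    # rather than returning a new Namespace object.
    parser.parse_args(args, namespace=config)
    return config

cfg = parse_into(['-l', 'debug', 'mycharm'])
```

After parsing, the options are ordinary attributes (`cfg.log_level`, `cfg.charm`), which is what lets `configLogging(composer)` read `composer.log_level` directly.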
581
582=== added file 'charmtools/compose/config.py'
583--- charmtools/compose/config.py 1970-01-01 00:00:00 +0000
584+++ charmtools/compose/config.py 2015-08-31 19:32:56 +0000
585@@ -0,0 +1,113 @@
586+from .tactics import DEFAULT_TACTICS, load_tactic
587+
588+
589+import pathspec
590+from ruamel import yaml
591+import logging
592+from path import path
593+from otherstuf import chainstuf
594+
595+DEFAULT_IGNORES = [
596+ ".bzr/",
597+ ".git/",
598+ "**/.ropeproject/",
599+ "*.pyc",
600+ "*~",
601+ ".tox/",
602+ "build/",
603+]
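These patterns use gitignore-style syntax and are matched via `pathspec` elsewhere in the branch. As a rough stdlib-only sketch of how such an ignore matcher behaves (using `fnmatch`, which covers only the simple glob cases, not full gitignore semantics; the function name is illustrative):

```python
import fnmatch

def ignore_matcher(patterns):
    """Return a predicate that is True when a relative path should be ignored.

    Patterns ending in '/' match anything under that directory; everything
    else is globbed against the full path or its basename. This only
    approximates gitignore semantics.
    """
    def matches(relpath):
        for pat in patterns:
            if pat.endswith('/'):
                if relpath.startswith(pat) or ('/' + pat) in relpath:
                    return True
            elif fnmatch.fnmatch(relpath, pat) or \
                    fnmatch.fnmatch(relpath.split('/')[-1], pat):
                return True
        return False
    return matches

ignored = ignore_matcher([".git/", "*.pyc", "*~"])
```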
604+
605+
606+class ComposerConfig(chainstuf):
607+    """Defaults for controlling the generator. Each layer in
608+    the inclusion graph can provide values, including things
609+    like overrides, or warnings when something that shouldn't
610+    be overridden is.
611+ """
612+ DEFAULT_FILE = "composer.yaml"
613+
614+ def __init__(self, *args, **kwargs):
615+ super(ComposerConfig, self).__init__(*args, **kwargs)
616+ self['_tactics'] = []
617+ self.configured = False
618+
619+ def __getattr__(self, key):
620+ return self[key]
621+
622+ def rget(self, key):
623+ """Combine all the results from all the layers into a single iter"""
624+ result = []
625+ for m in self.maps:
626+ r = m.get(key)
627+ if r:
628+ if isinstance(r, (list, tuple)):
629+ result.extend(r)
630+ else:
631+ result.append(r)
632+ return result
633+
634+ def configure(self, config_file, allow_missing=False):
635+ config_file = path(config_file)
636+ data = None
637+ if not config_file.exists() and not allow_missing:
638+ raise OSError("Missing Config File {}".format(config_file))
639+ try:
640+ if config_file.exists():
641+ data = yaml.load(config_file.open())
642+ self.configured = True
643+ except yaml.parser.ParserError:
644+ logging.critical("Malformed Config file: {}".format(config_file))
645+ raise
646+ if data:
647+ self.update(data)
648+ # look at any possible imports and use them to build tactics
649+ tactics = self.get('tactics')
650+ basedir = config_file.dirname()
651+ if tactics:
652+ for name in tactics:
653+ tactic = load_tactic(name, basedir)
654+ self._tactics.append(tactic)
655+ return self
656+
657+ @classmethod
658+ def from_config(cls, config_file, allow_missing=False):
659+ c = cls()
660+ c.configure(config_file, allow_missing)
661+ return c
662+
663+ def add_config(self, config_file, allow_missing=False):
664+ c = self.new_child()
665+ c.configure(config_file, allow_missing)
666+ return c
667+
668+ @property
669+ def name(self):
670+ return self.get('name')
671+
672+ @property
673+ def ignores(self):
674+ return self.rget('ignore') + DEFAULT_IGNORES
675+
676+ def tactics(self):
677+ # XXX: combine from config layer
678+ return self.rget('_tactics') + DEFAULT_TACTICS
679+
680+ def tactic(self, entity, current, target, next_config):
681+ # Produce a tactic for the entity in question
682+        # These will be accumulated through the layers
683+ # and executed later
684+ bd = current.directory
685+ # Ignore handling
686+ if next_config:
687+ spec = pathspec.PathSpec.from_lines(pathspec.GitIgnorePattern,
688+ next_config.ignores)
689+ p = entity.relpath(bd)
690+ matches = spec.match_files((p,))
691+ if p in matches:
692+ return None
693+
694+ for tactic in self.tactics():
695+ if tactic.trigger(entity.relpath(bd)):
696+ return tactic(target=target, entity=entity,
697+ current=current, config=next_config)
698+ return None
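`ComposerConfig` builds on `chainstuf` (a ChainMap-style structure), and `rget` deliberately walks every layer's map and merges list values, instead of the usual first-match-wins lookup. A rough stdlib equivalent using `collections.ChainMap` (names here are illustrative):

```python
from collections import ChainMap

def rget(chain, key):
    """Collect `key` from every map in the chain, flattening lists,
    rather than stopping at the first map that defines it."""
    result = []
    for m in chain.maps:
        r = m.get(key)
        if r:
            if isinstance(r, (list, tuple)):
                result.extend(r)
            else:
                result.append(r)
    return result

# Child layer first, base layer second -- mirroring new_child() ordering.
layered = ChainMap({"ignore": ["build/"]}, {"ignore": ["*.pyc", "*~"]})
combined = rget(layered, "ignore")
```

This is why the `ignores` property can return every layer's `ignore` entries plus `DEFAULT_IGNORES` in one flat list.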
699
700=== added file 'charmtools/compose/diff_match_patch.py'
701--- charmtools/compose/diff_match_patch.py 1970-01-01 00:00:00 +0000
702+++ charmtools/compose/diff_match_patch.py 2015-08-31 19:32:56 +0000
703@@ -0,0 +1,1919 @@
704+#!/usr/bin/python2.4
705+
706+from __future__ import division
707+
708+"""Diff Match and Patch
709+
710+Copyright 2006 Google Inc.
711+http://code.google.com/p/google-diff-match-patch/
712+
713+Licensed under the Apache License, Version 2.0 (the "License");
714+you may not use this file except in compliance with the License.
715+You may obtain a copy of the License at
716+
717+ http://www.apache.org/licenses/LICENSE-2.0
718+
719+Unless required by applicable law or agreed to in writing, software
720+distributed under the License is distributed on an "AS IS" BASIS,
721+WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
722+See the License for the specific language governing permissions and
723+limitations under the License.
724+"""
725+
726+"""Functions for diff, match and patch.
727+
728+Computes the difference between two texts to create a patch.
729+Applies the patch onto another text, allowing for errors.
730+"""
731+
732+__author__ = 'fraser@google.com (Neil Fraser)'
733+
734+import math
735+import re
736+import sys
737+import time
738+import urllib
739+
740+class diff_match_patch:
741+ """Class containing the diff, match and patch methods.
742+
743+ Also contains the behaviour settings.
744+ """
745+
746+ def __init__(self):
747+ """Inits a diff_match_patch object with default settings.
748+ Redefine these in your program to override the defaults.
749+ """
750+
751+ # Number of seconds to map a diff before giving up (0 for infinity).
752+ self.Diff_Timeout = 1.0
753+ # Cost of an empty edit operation in terms of edit characters.
754+ self.Diff_EditCost = 4
755+ # At what point is no match declared (0.0 = perfection, 1.0 = very loose).
756+ self.Match_Threshold = 0.5
757+ # How far to search for a match (0 = exact location, 1000+ = broad match).
758+ # A match this many characters away from the expected location will add
759+ # 1.0 to the score (0.0 is a perfect match).
760+ self.Match_Distance = 1000
761+ # When deleting a large block of text (over ~64 characters), how close do
762+ # the contents have to be to match the expected contents. (0.0 = perfection,
763+ # 1.0 = very loose). Note that Match_Threshold controls how closely the
764+ # end points of a delete need to match.
765+ self.Patch_DeleteThreshold = 0.5
766+ # Chunk size for context length.
767+ self.Patch_Margin = 4
768+
769+ # The number of bits in an int.
770+ # Python has no maximum, thus to disable patch splitting set to 0.
771+ # However to avoid long patches in certain pathological cases, use 32.
772+ # Multiple short patches (using native ints) are much faster than long ones.
773+ self.Match_MaxBits = 32
774+
775+ # DIFF FUNCTIONS
776+
777+ # The data structure representing a diff is an array of tuples:
778+ # [(DIFF_DELETE, "Hello"), (DIFF_INSERT, "Goodbye"), (DIFF_EQUAL, " world.")]
779+ # which means: delete "Hello", add "Goodbye" and keep " world."
780+ DIFF_DELETE = -1
781+ DIFF_INSERT = 1
782+ DIFF_EQUAL = 0
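The comment above describes the core data structure: a flat list of (op, text) tuples. As an illustration of that shape using the stdlib's `difflib.SequenceMatcher` (this is not the Myers bisect algorithm implemented below, just another way to produce tuples of the same form):

```python
from difflib import SequenceMatcher

DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def simple_diff(text1, text2):
    """Produce diff_match_patch-style (op, text) tuples via SequenceMatcher."""
    diffs = []
    sm = SequenceMatcher(None, text1, text2, autojunk=False)
    for tag, i1, i2, j1, j2 in sm.get_opcodes():
        if tag == 'equal':
            diffs.append((DIFF_EQUAL, text1[i1:i2]))
        else:  # 'replace' becomes a delete followed by an insert
            if i2 > i1:
                diffs.append((DIFF_DELETE, text1[i1:i2]))
            if j2 > j1:
                diffs.append((DIFF_INSERT, text2[j1:j2]))
    return diffs

result = simple_diff("Hello world.", "Goodbye world.")
```

Whatever the algorithm, the invariant holds: concatenating the EQUAL and DELETE texts reconstructs `text1`, and the EQUAL and INSERT texts reconstruct `text2`.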
783+
784+ def diff_main(self, text1, text2, checklines=True, deadline=None):
785+ """Find the differences between two texts. Simplifies the problem by
786+ stripping any common prefix or suffix off the texts before diffing.
787+
788+ Args:
789+ text1: Old string to be diffed.
790+ text2: New string to be diffed.
791+ checklines: Optional speedup flag. If present and false, then don't run
792+ a line-level diff first to identify the changed areas.
793+ Defaults to true, which does a faster, slightly less optimal diff.
794+ deadline: Optional time when the diff should be complete by. Used
795+ internally for recursive calls. Users should set DiffTimeout instead.
796+
797+ Returns:
798+ Array of changes.
799+ """
800+ # Set a deadline by which time the diff must be complete.
801+ if deadline == None:
802+ # Unlike in most languages, Python counts time in seconds.
803+ if self.Diff_Timeout <= 0:
804+ deadline = sys.maxint
805+ else:
806+ deadline = time.time() + self.Diff_Timeout
807+
808+ # Check for null inputs.
809+ if text1 == None or text2 == None:
810+ raise ValueError("Null inputs. (diff_main)")
811+
812+ # Check for equality (speedup).
813+ if text1 == text2:
814+ if text1:
815+ return [(self.DIFF_EQUAL, text1)]
816+ return []
817+
818+ # Trim off common prefix (speedup).
819+ commonlength = self.diff_commonPrefix(text1, text2)
820+ commonprefix = text1[:commonlength]
821+ text1 = text1[commonlength:]
822+ text2 = text2[commonlength:]
823+
824+ # Trim off common suffix (speedup).
825+ commonlength = self.diff_commonSuffix(text1, text2)
826+ if commonlength == 0:
827+ commonsuffix = ''
828+ else:
829+ commonsuffix = text1[-commonlength:]
830+ text1 = text1[:-commonlength]
831+ text2 = text2[:-commonlength]
832+
833+ # Compute the diff on the middle block.
834+ diffs = self.diff_compute(text1, text2, checklines, deadline)
835+
836+ # Restore the prefix and suffix.
837+ if commonprefix:
838+ diffs[:0] = [(self.DIFF_EQUAL, commonprefix)]
839+ if commonsuffix:
840+ diffs.append((self.DIFF_EQUAL, commonsuffix))
841+ self.diff_cleanupMerge(diffs)
842+ return diffs
843+
844+ def diff_compute(self, text1, text2, checklines, deadline):
845+ """Find the differences between two texts. Assumes that the texts do not
846+ have any common prefix or suffix.
847+
848+ Args:
849+ text1: Old string to be diffed.
850+ text2: New string to be diffed.
851+ checklines: Speedup flag. If false, then don't run a line-level diff
852+ first to identify the changed areas.
853+ If true, then run a faster, slightly less optimal diff.
854+ deadline: Time when the diff should be complete by.
855+
856+ Returns:
857+ Array of changes.
858+ """
859+ if not text1:
860+ # Just add some text (speedup).
861+ return [(self.DIFF_INSERT, text2)]
862+
863+ if not text2:
864+ # Just delete some text (speedup).
865+ return [(self.DIFF_DELETE, text1)]
866+
867+ if len(text1) > len(text2):
868+ (longtext, shorttext) = (text1, text2)
869+ else:
870+ (shorttext, longtext) = (text1, text2)
871+ i = longtext.find(shorttext)
872+ if i != -1:
873+ # Shorter text is inside the longer text (speedup).
874+ diffs = [(self.DIFF_INSERT, longtext[:i]), (self.DIFF_EQUAL, shorttext),
875+ (self.DIFF_INSERT, longtext[i + len(shorttext):])]
876+ # Swap insertions for deletions if diff is reversed.
877+ if len(text1) > len(text2):
878+ diffs[0] = (self.DIFF_DELETE, diffs[0][1])
879+ diffs[2] = (self.DIFF_DELETE, diffs[2][1])
880+ return diffs
881+
882+ if len(shorttext) == 1:
883+ # Single character string.
884+ # After the previous speedup, the character can't be an equality.
885+ return [(self.DIFF_DELETE, text1), (self.DIFF_INSERT, text2)]
886+
887+ # Check to see if the problem can be split in two.
888+ hm = self.diff_halfMatch(text1, text2)
889+ if hm:
890+ # A half-match was found, sort out the return data.
891+ (text1_a, text1_b, text2_a, text2_b, mid_common) = hm
892+ # Send both pairs off for separate processing.
893+ diffs_a = self.diff_main(text1_a, text2_a, checklines, deadline)
894+ diffs_b = self.diff_main(text1_b, text2_b, checklines, deadline)
895+ # Merge the results.
896+ return diffs_a + [(self.DIFF_EQUAL, mid_common)] + diffs_b
897+
898+ if checklines and len(text1) > 100 and len(text2) > 100:
899+ return self.diff_lineMode(text1, text2, deadline)
900+
901+ return self.diff_bisect(text1, text2, deadline)
902+
903+ def diff_lineMode(self, text1, text2, deadline):
904+ """Do a quick line-level diff on both strings, then rediff the parts for
905+ greater accuracy.
906+ This speedup can produce non-minimal diffs.
907+
908+ Args:
909+ text1: Old string to be diffed.
910+ text2: New string to be diffed.
911+ deadline: Time when the diff should be complete by.
912+
913+ Returns:
914+ Array of changes.
915+ """
916+
917+ # Scan the text on a line-by-line basis first.
918+ (text1, text2, linearray) = self.diff_linesToChars(text1, text2)
919+
920+ diffs = self.diff_main(text1, text2, False, deadline)
921+
922+ # Convert the diff back to original text.
923+ self.diff_charsToLines(diffs, linearray)
924+ # Eliminate freak matches (e.g. blank lines)
925+ self.diff_cleanupSemantic(diffs)
926+
927+ # Rediff any replacement blocks, this time character-by-character.
928+ # Add a dummy entry at the end.
929+ diffs.append((self.DIFF_EQUAL, ''))
930+ pointer = 0
931+ count_delete = 0
932+ count_insert = 0
933+ text_delete = ''
934+ text_insert = ''
935+ while pointer < len(diffs):
936+ if diffs[pointer][0] == self.DIFF_INSERT:
937+ count_insert += 1
938+ text_insert += diffs[pointer][1]
939+ elif diffs[pointer][0] == self.DIFF_DELETE:
940+ count_delete += 1
941+ text_delete += diffs[pointer][1]
942+ elif diffs[pointer][0] == self.DIFF_EQUAL:
943+ # Upon reaching an equality, check for prior redundancies.
944+ if count_delete >= 1 and count_insert >= 1:
945+ # Delete the offending records and add the merged ones.
946+ a = self.diff_main(text_delete, text_insert, False, deadline)
947+ diffs[pointer - count_delete - count_insert : pointer] = a
948+ pointer = pointer - count_delete - count_insert + len(a)
949+ count_insert = 0
950+ count_delete = 0
951+ text_delete = ''
952+ text_insert = ''
953+
954+ pointer += 1
955+
956+ diffs.pop() # Remove the dummy entry at the end.
957+
958+ return diffs
959+
960+ def diff_bisect(self, text1, text2, deadline):
961+ """Find the 'middle snake' of a diff, split the problem in two
962+ and return the recursively constructed diff.
963+ See Myers 1986 paper: An O(ND) Difference Algorithm and Its Variations.
964+
965+ Args:
966+ text1: Old string to be diffed.
967+ text2: New string to be diffed.
968+ deadline: Time at which to bail if not yet complete.
969+
970+ Returns:
971+ Array of diff tuples.
972+ """
973+
974+ # Cache the text lengths to prevent multiple calls.
975+ text1_length = len(text1)
976+ text2_length = len(text2)
977+ max_d = (text1_length + text2_length + 1) // 2
978+ v_offset = max_d
979+ v_length = 2 * max_d
980+ v1 = [-1] * v_length
981+ v1[v_offset + 1] = 0
982+ v2 = v1[:]
983+ delta = text1_length - text2_length
984+ # If the total number of characters is odd, then the front path will
985+ # collide with the reverse path.
986+ front = (delta % 2 != 0)
987+ # Offsets for start and end of k loop.
988+ # Prevents mapping of space beyond the grid.
989+ k1start = 0
990+ k1end = 0
991+ k2start = 0
992+ k2end = 0
993+ for d in xrange(max_d):
994+ # Bail out if deadline is reached.
995+ if time.time() > deadline:
996+ break
997+
998+ # Walk the front path one step.
999+ for k1 in xrange(-d + k1start, d + 1 - k1end, 2):
1000+ k1_offset = v_offset + k1
1001+ if k1 == -d or (k1 != d and
1002+ v1[k1_offset - 1] < v1[k1_offset + 1]):
1003+ x1 = v1[k1_offset + 1]
1004+ else:
1005+ x1 = v1[k1_offset - 1] + 1
1006+ y1 = x1 - k1
1007+ while (x1 < text1_length and y1 < text2_length and
1008+ text1[x1] == text2[y1]):
1009+ x1 += 1
1010+ y1 += 1
1011+ v1[k1_offset] = x1
1012+ if x1 > text1_length:
1013+ # Ran off the right of the graph.
1014+ k1end += 2
1015+ elif y1 > text2_length:
1016+ # Ran off the bottom of the graph.
1017+ k1start += 2
1018+ elif front:
1019+ k2_offset = v_offset + delta - k1
1020+ if k2_offset >= 0 and k2_offset < v_length and v2[k2_offset] != -1:
1021+ # Mirror x2 onto top-left coordinate system.
1022+ x2 = text1_length - v2[k2_offset]
1023+ if x1 >= x2:
1024+ # Overlap detected.
1025+ return self.diff_bisectSplit(text1, text2, x1, y1, deadline)
1026+
1027+ # Walk the reverse path one step.
1028+ for k2 in xrange(-d + k2start, d + 1 - k2end, 2):
1029+ k2_offset = v_offset + k2
1030+ if k2 == -d or (k2 != d and
1031+ v2[k2_offset - 1] < v2[k2_offset + 1]):
1032+ x2 = v2[k2_offset + 1]
1033+ else:
1034+ x2 = v2[k2_offset - 1] + 1
1035+ y2 = x2 - k2
1036+ while (x2 < text1_length and y2 < text2_length and
1037+ text1[-x2 - 1] == text2[-y2 - 1]):
1038+ x2 += 1
1039+ y2 += 1
1040+ v2[k2_offset] = x2
1041+ if x2 > text1_length:
1042+ # Ran off the left of the graph.
1043+ k2end += 2
1044+ elif y2 > text2_length:
1045+ # Ran off the top of the graph.
1046+ k2start += 2
1047+ elif not front:
1048+ k1_offset = v_offset + delta - k2
1049+ if k1_offset >= 0 and k1_offset < v_length and v1[k1_offset] != -1:
1050+ x1 = v1[k1_offset]
1051+ y1 = v_offset + x1 - k1_offset
1052+ # Mirror x2 onto top-left coordinate system.
1053+ x2 = text1_length - x2
1054+ if x1 >= x2:
1055+ # Overlap detected.
1056+ return self.diff_bisectSplit(text1, text2, x1, y1, deadline)
1057+
1058+ # Diff took too long and hit the deadline or
1059+ # number of diffs equals number of characters, no commonality at all.
1060+ return [(self.DIFF_DELETE, text1), (self.DIFF_INSERT, text2)]
1061+
1062+ def diff_bisectSplit(self, text1, text2, x, y, deadline):
1063+ """Given the location of the 'middle snake', split the diff in two parts
1064+ and recurse.
1065+
1066+ Args:
1067+ text1: Old string to be diffed.
1068+ text2: New string to be diffed.
1069+ x: Index of split point in text1.
1070+ y: Index of split point in text2.
1071+ deadline: Time at which to bail if not yet complete.
1072+
1073+ Returns:
1074+ Array of diff tuples.
1075+ """
1076+ text1a = text1[:x]
1077+ text2a = text2[:y]
1078+ text1b = text1[x:]
1079+ text2b = text2[y:]
1080+
1081+ # Compute both diffs serially.
1082+ diffs = self.diff_main(text1a, text2a, False, deadline)
1083+ diffsb = self.diff_main(text1b, text2b, False, deadline)
1084+
1085+ return diffs + diffsb
1086+
1087+ def diff_linesToChars(self, text1, text2):
1088+ """Split two texts into an array of strings. Reduce the texts to a string
1089+ of hashes where each Unicode character represents one line.
1090+
1091+ Args:
1092+ text1: First string.
1093+ text2: Second string.
1094+
1095+ Returns:
1096+ Three element tuple, containing the encoded text1, the encoded text2 and
1097+ the array of unique strings. The zeroth element of the array of unique
1098+ strings is intentionally blank.
1099+ """
1100+ lineArray = [] # e.g. lineArray[4] == "Hello\n"
1101+ lineHash = {} # e.g. lineHash["Hello\n"] == 4
1102+
1103+ # "\x00" is a valid character, but various debuggers don't like it.
1104+ # So we'll insert a junk entry to avoid generating a null character.
1105+ lineArray.append('')
1106+
1107+ def diff_linesToCharsMunge(text):
1108+ """Split a text into an array of strings. Reduce the texts to a string
1109+ of hashes where each Unicode character represents one line.
1110+ Modifies linearray and linehash through being a closure.
1111+
1112+ Args:
1113+ text: String to encode.
1114+
1115+ Returns:
1116+ Encoded string.
1117+ """
1118+ chars = []
1119+ # Walk the text, pulling out a substring for each line.
1120+      # text.split('\n') would temporarily double our memory footprint.
1121+ # Modifying text would create many large strings to garbage collect.
1122+ lineStart = 0
1123+ lineEnd = -1
1124+ while lineEnd < len(text) - 1:
1125+ lineEnd = text.find('\n', lineStart)
1126+ if lineEnd == -1:
1127+ lineEnd = len(text) - 1
1128+ line = text[lineStart:lineEnd + 1]
1129+ lineStart = lineEnd + 1
1130+
1131+ if line in lineHash:
1132+ chars.append(unichr(lineHash[line]))
1133+ else:
1134+ lineArray.append(line)
1135+ lineHash[line] = len(lineArray) - 1
1136+ chars.append(unichr(len(lineArray) - 1))
1137+ return "".join(chars)
1138+
1139+ chars1 = diff_linesToCharsMunge(text1)
1140+ chars2 = diff_linesToCharsMunge(text2)
1141+ return (chars1, chars2, lineArray)
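The encoding above is what makes line-mode diffing fast: each unique line is interned and replaced by a single character whose code point is its index, so a character-oriented diff effectively operates on whole lines. A minimal standalone sketch of the same trick (using `splitlines(True)` rather than the original's manual `find('\n')` loop, so edge behaviour on a missing final newline may differ slightly):

```python
def lines_to_chars(text1, text2):
    """Encode each unique line as one character; index 0 is reserved
    so no encoded character is the NUL byte."""
    line_array = ['']  # line_array[n] is the line encoded as chr(n)
    line_hash = {}     # line -> index into line_array

    def munge(text):
        chars = []
        for line in text.splitlines(True):  # keepends
            if line not in line_hash:
                line_array.append(line)
                line_hash[line] = len(line_array) - 1
            chars.append(chr(line_hash[line]))
        return "".join(chars)

    return munge(text1), munge(text2), line_array

c1, c2, arr = lines_to_chars("a\nb\n", "a\nc\n")
```

Decoding is the reverse mapping, which is exactly what `diff_charsToLines` below does with `lineArray[ord(char)]`.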
1142+
1143+ def diff_charsToLines(self, diffs, lineArray):
1144+ """Rehydrate the text in a diff from a string of line hashes to real lines
1145+ of text.
1146+
1147+ Args:
1148+ diffs: Array of diff tuples.
1149+ lineArray: Array of unique strings.
1150+ """
1151+ for x in xrange(len(diffs)):
1152+ text = []
1153+ for char in diffs[x][1]:
1154+ text.append(lineArray[ord(char)])
1155+ diffs[x] = (diffs[x][0], "".join(text))
1156+
1157+ def diff_commonPrefix(self, text1, text2):
1158+ """Determine the common prefix of two strings.
1159+
1160+ Args:
1161+ text1: First string.
1162+ text2: Second string.
1163+
1164+ Returns:
1165+ The number of characters common to the start of each string.
1166+ """
1167+ # Quick check for common null cases.
1168+ if not text1 or not text2 or text1[0] != text2[0]:
1169+ return 0
1170+ # Binary search.
1171+ # Performance analysis: http://neil.fraser.name/news/2007/10/09/
1172+ pointermin = 0
1173+ pointermax = min(len(text1), len(text2))
1174+ pointermid = pointermax
1175+ pointerstart = 0
1176+ while pointermin < pointermid:
1177+ if text1[pointerstart:pointermid] == text2[pointerstart:pointermid]:
1178+ pointermin = pointermid
1179+ pointerstart = pointermin
1180+ else:
1181+ pointermax = pointermid
1182+ pointermid = (pointermax - pointermin) // 2 + pointermin
1183+ return pointermid
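The binary search above avoids a per-character Python loop: each probe compares one whole slice, which runs in C. A short standalone version of the same search (function name is illustrative):

```python
def common_prefix(text1, text2):
    """Length of the common prefix, found by binary search over slice
    comparisons rather than comparing characters one at a time."""
    if not text1 or not text2 or text1[0] != text2[0]:
        return 0
    lo, hi = 0, min(len(text1), len(text2))
    mid, start = hi, 0
    while lo < mid:
        if text1[start:mid] == text2[start:mid]:
            lo = mid
            start = lo  # everything up to here is known equal
        else:
            hi = mid
        mid = (hi - lo) // 2 + lo
    return mid

n = common_prefix("1234abcdef", "1234xyz")
```

The `start` pointer narrows each comparison to the still-undecided region, so later probes compare progressively shorter slices.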
1184+
1185+ def diff_commonSuffix(self, text1, text2):
1186+ """Determine the common suffix of two strings.
1187+
1188+ Args:
1189+ text1: First string.
1190+ text2: Second string.
1191+
1192+ Returns:
1193+ The number of characters common to the end of each string.
1194+ """
1195+ # Quick check for common null cases.
1196+ if not text1 or not text2 or text1[-1] != text2[-1]:
1197+ return 0
1198+ # Binary search.
1199+ # Performance analysis: http://neil.fraser.name/news/2007/10/09/
1200+ pointermin = 0
1201+ pointermax = min(len(text1), len(text2))
1202+ pointermid = pointermax
1203+ pointerend = 0
1204+ while pointermin < pointermid:
1205+ if (text1[-pointermid:len(text1) - pointerend] ==
1206+ text2[-pointermid:len(text2) - pointerend]):
1207+ pointermin = pointermid
1208+ pointerend = pointermin
1209+ else:
1210+ pointermax = pointermid
1211+ pointermid = (pointermax - pointermin) // 2 + pointermin
1212+ return pointermid
1213+
1214+ def diff_commonOverlap(self, text1, text2):
1215+ """Determine if the suffix of one string is the prefix of another.
1216+
1217+ Args:
1218+ text1 First string.
1219+ text2 Second string.
1220+
1221+ Returns:
1222+ The number of characters common to the end of the first
1223+ string and the start of the second string.
1224+ """
1225+ # Cache the text lengths to prevent multiple calls.
1226+ text1_length = len(text1)
1227+ text2_length = len(text2)
1228+ # Eliminate the null case.
1229+ if text1_length == 0 or text2_length == 0:
1230+ return 0
1231+ # Truncate the longer string.
1232+ if text1_length > text2_length:
1233+ text1 = text1[-text2_length:]
1234+ elif text1_length < text2_length:
1235+ text2 = text2[:text1_length]
1236+ text_length = min(text1_length, text2_length)
1237+ # Quick check for the worst case.
1238+ if text1 == text2:
1239+ return text_length
1240+
1241+ # Start by looking for a single character match
1242+ # and increase length until no match is found.
1243+ # Performance analysis: http://neil.fraser.name/news/2010/11/04/
1244+ best = 0
1245+ length = 1
1246+ while True:
1247+ pattern = text1[-length:]
1248+ found = text2.find(pattern)
1249+ if found == -1:
1250+ return best
1251+ length += found
1252+ if found == 0 or text1[-length:] == text2[:length]:
1253+ best = length
1254+ length += 1
1255+
1256+ def diff_halfMatch(self, text1, text2):
1257+ """Do the two texts share a substring which is at least half the length of
1258+ the longer text?
1259+ This speedup can produce non-minimal diffs.
1260+
1261+ Args:
1262+ text1: First string.
1263+ text2: Second string.
1264+
1265+ Returns:
1266+ Five element Array, containing the prefix of text1, the suffix of text1,
1267+ the prefix of text2, the suffix of text2 and the common middle. Or None
1268+ if there was no match.
1269+ """
1270+ if self.Diff_Timeout <= 0:
1271+ # Don't risk returning a non-optimal diff if we have unlimited time.
1272+ return None
1273+ if len(text1) > len(text2):
1274+ (longtext, shorttext) = (text1, text2)
1275+ else:
1276+ (shorttext, longtext) = (text1, text2)
1277+ if len(longtext) < 4 or len(shorttext) * 2 < len(longtext):
1278+ return None # Pointless.
1279+
1280+ def diff_halfMatchI(longtext, shorttext, i):
1281+ """Does a substring of shorttext exist within longtext such that the
1282+ substring is at least half the length of longtext?
1283+ Closure, but does not reference any external variables.
1284+
1285+ Args:
1286+ longtext: Longer string.
1287+ shorttext: Shorter string.
1288+ i: Start index of quarter length substring within longtext.
1289+
1290+ Returns:
1291+ Five element Array, containing the prefix of longtext, the suffix of
1292+ longtext, the prefix of shorttext, the suffix of shorttext and the
1293+ common middle. Or None if there was no match.
1294+ """
1295+ seed = longtext[i:i + len(longtext) // 4]
1296+ best_common = ''
1297+ j = shorttext.find(seed)
1298+ while j != -1:
1299+ prefixLength = self.diff_commonPrefix(longtext[i:], shorttext[j:])
1300+ suffixLength = self.diff_commonSuffix(longtext[:i], shorttext[:j])
1301+ if len(best_common) < suffixLength + prefixLength:
1302+ best_common = (shorttext[j - suffixLength:j] +
1303+ shorttext[j:j + prefixLength])
1304+ best_longtext_a = longtext[:i - suffixLength]
1305+ best_longtext_b = longtext[i + prefixLength:]
1306+ best_shorttext_a = shorttext[:j - suffixLength]
1307+ best_shorttext_b = shorttext[j + prefixLength:]
1308+ j = shorttext.find(seed, j + 1)
1309+
1310+ if len(best_common) * 2 >= len(longtext):
1311+ return (best_longtext_a, best_longtext_b,
1312+ best_shorttext_a, best_shorttext_b, best_common)
1313+ else:
1314+ return None
1315+
1316+ # First check if the second quarter is the seed for a half-match.
1317+ hm1 = diff_halfMatchI(longtext, shorttext, (len(longtext) + 3) // 4)
1318+ # Check again based on the third quarter.
1319+ hm2 = diff_halfMatchI(longtext, shorttext, (len(longtext) + 1) // 2)
1320+ if not hm1 and not hm2:
1321+ return None
1322+ elif not hm2:
1323+ hm = hm1
1324+ elif not hm1:
1325+ hm = hm2
1326+ else:
1327+ # Both matched. Select the longest.
1328+ if len(hm1[4]) > len(hm2[4]):
1329+ hm = hm1
1330+ else:
1331+ hm = hm2
1332+
1333+ # A half-match was found, sort out the return data.
1334+ if len(text1) > len(text2):
1335+ (text1_a, text1_b, text2_a, text2_b, mid_common) = hm
1336+ else:
1337+ (text2_a, text2_b, text1_a, text1_b, mid_common) = hm
1338+ return (text1_a, text1_b, text2_a, text2_b, mid_common)
1339+
1340+ def diff_cleanupSemantic(self, diffs):
1341+ """Reduce the number of edits by eliminating semantically trivial
1342+ equalities.
1343+
1344+ Args:
1345+ diffs: Array of diff tuples.
1346+ """
1347+ changes = False
1348+ equalities = [] # Stack of indices where equalities are found.
1349+ lastequality = None # Always equal to diffs[equalities[-1]][1]
1350+ pointer = 0 # Index of current position.
1351+ # Number of chars that changed prior to the equality.
1352+ length_insertions1, length_deletions1 = 0, 0
1353+ # Number of chars that changed after the equality.
1354+ length_insertions2, length_deletions2 = 0, 0
1355+ while pointer < len(diffs):
1356+ if diffs[pointer][0] == self.DIFF_EQUAL: # Equality found.
1357+ equalities.append(pointer)
1358+ length_insertions1, length_insertions2 = length_insertions2, 0
1359+ length_deletions1, length_deletions2 = length_deletions2, 0
1360+ lastequality = diffs[pointer][1]
1361+ else: # An insertion or deletion.
1362+ if diffs[pointer][0] == self.DIFF_INSERT:
1363+ length_insertions2 += len(diffs[pointer][1])
1364+ else:
1365+ length_deletions2 += len(diffs[pointer][1])
1366+ # Eliminate an equality that is smaller or equal to the edits on both
1367+ # sides of it.
1368+ if (lastequality and (len(lastequality) <=
1369+ max(length_insertions1, length_deletions1)) and
1370+ (len(lastequality) <= max(length_insertions2, length_deletions2))):
1371+ # Duplicate record.
1372+ diffs.insert(equalities[-1], (self.DIFF_DELETE, lastequality))
1373+ # Change second copy to insert.
1374+ diffs[equalities[-1] + 1] = (self.DIFF_INSERT,
1375+ diffs[equalities[-1] + 1][1])
1376+ # Throw away the equality we just deleted.
1377+ equalities.pop()
1378+ # Throw away the previous equality (it needs to be reevaluated).
1379+ if len(equalities):
1380+ equalities.pop()
1381+ if len(equalities):
1382+ pointer = equalities[-1]
1383+ else:
1384+ pointer = -1
1385+ # Reset the counters.
1386+ length_insertions1, length_deletions1 = 0, 0
1387+ length_insertions2, length_deletions2 = 0, 0
1388+ lastequality = None
1389+ changes = True
1390+ pointer += 1
1391+
1392+ # Normalize the diff.
1393+ if changes:
1394+ self.diff_cleanupMerge(diffs)
1395+ self.diff_cleanupSemanticLossless(diffs)
1396+
1397+ # Find any overlaps between deletions and insertions.
1398+ # e.g: <del>abcxxx</del><ins>xxxdef</ins>
1399+ # -> <del>abc</del>xxx<ins>def</ins>
1400+ # e.g: <del>xxxabc</del><ins>defxxx</ins>
1401+ # -> <ins>def</ins>xxx<del>abc</del>
1402+ # Only extract an overlap if it is as big as the edit ahead or behind it.
1403+ pointer = 1
1404+ while pointer < len(diffs):
1405+ if (diffs[pointer - 1][0] == self.DIFF_DELETE and
1406+ diffs[pointer][0] == self.DIFF_INSERT):
1407+ deletion = diffs[pointer - 1][1]
1408+ insertion = diffs[pointer][1]
1409+ overlap_length1 = self.diff_commonOverlap(deletion, insertion)
1410+ overlap_length2 = self.diff_commonOverlap(insertion, deletion)
1411+ if overlap_length1 >= overlap_length2:
1412+ if (overlap_length1 >= len(deletion) / 2.0 or
1413+ overlap_length1 >= len(insertion) / 2.0):
1414+ # Overlap found. Insert an equality and trim the surrounding edits.
1415+ diffs.insert(pointer, (self.DIFF_EQUAL,
1416+ insertion[:overlap_length1]))
1417+ diffs[pointer - 1] = (self.DIFF_DELETE,
1418+ deletion[:len(deletion) - overlap_length1])
1419+ diffs[pointer + 1] = (self.DIFF_INSERT,
1420+ insertion[overlap_length1:])
1421+ pointer += 1
1422+ else:
1423+ if (overlap_length2 >= len(deletion) / 2.0 or
1424+ overlap_length2 >= len(insertion) / 2.0):
1425+ # Reverse overlap found.
1426+ # Insert an equality and swap and trim the surrounding edits.
1427+ diffs.insert(pointer, (self.DIFF_EQUAL, deletion[:overlap_length2]))
1428+ diffs[pointer - 1] = (self.DIFF_INSERT,
1429+ insertion[:len(insertion) - overlap_length2])
1430+ diffs[pointer + 1] = (self.DIFF_DELETE, deletion[overlap_length2:])
1431+ pointer += 1
1432+ pointer += 1
1433+ pointer += 1
1434+
1435+ def diff_cleanupSemanticLossless(self, diffs):
1436+ """Look for single edits surrounded on both sides by equalities
1437+ which can be shifted sideways to align the edit to a word boundary.
1438+ e.g: The c<ins>at c</ins>ame. -> The <ins>cat </ins>came.
1439+
1440+ Args:
1441+ diffs: Array of diff tuples.
1442+ """
1443+
1444+ def diff_cleanupSemanticScore(one, two):
1445+ """Given two strings, compute a score representing whether the
1446+ internal boundary falls on logical boundaries.
1447+ Scores range from 6 (best) to 0 (worst).
1448+ Closure, but does not reference any external variables.
1449+
1450+ Args:
1451+ one: First string.
1452+ two: Second string.
1453+
1454+ Returns:
1455+ The score.
1456+ """
1457+ if not one or not two:
1458+ # Edges are the best.
1459+ return 6
1460+
1461+ # Each port of this function behaves slightly differently due to
1462+ # subtle differences in each language's definition of things like
1463+ # 'whitespace'. Since this function's purpose is largely cosmetic,
1464+ # the choice has been made to use each language's native features
1465+ # rather than force total conformity.
1466+ char1 = one[-1]
1467+ char2 = two[0]
1468+ nonAlphaNumeric1 = not char1.isalnum()
1469+ nonAlphaNumeric2 = not char2.isalnum()
1470+ whitespace1 = nonAlphaNumeric1 and char1.isspace()
1471+ whitespace2 = nonAlphaNumeric2 and char2.isspace()
1472+ lineBreak1 = whitespace1 and (char1 == "\r" or char1 == "\n")
1473+ lineBreak2 = whitespace2 and (char2 == "\r" or char2 == "\n")
1474+ blankLine1 = lineBreak1 and self.BLANKLINEEND.search(one)
1475+ blankLine2 = lineBreak2 and self.BLANKLINESTART.match(two)
1476+
1477+ if blankLine1 or blankLine2:
1478+ # Five points for blank lines.
1479+ return 5
1480+ elif lineBreak1 or lineBreak2:
1481+ # Four points for line breaks.
1482+ return 4
1483+ elif nonAlphaNumeric1 and not whitespace1 and whitespace2:
1484+ # Three points for end of sentences.
1485+ return 3
1486+ elif whitespace1 or whitespace2:
1487+ # Two points for whitespace.
1488+ return 2
1489+ elif nonAlphaNumeric1 or nonAlphaNumeric2:
1490+ # One point for non-alphanumeric.
1491+ return 1
1492+ return 0
1493+
1494+ pointer = 1
1495+ # Intentionally ignore the first and last element (don't need checking).
1496+ while pointer < len(diffs) - 1:
1497+ if (diffs[pointer - 1][0] == self.DIFF_EQUAL and
1498+ diffs[pointer + 1][0] == self.DIFF_EQUAL):
1499+ # This is a single edit surrounded by equalities.
1500+ equality1 = diffs[pointer - 1][1]
1501+ edit = diffs[pointer][1]
1502+ equality2 = diffs[pointer + 1][1]
1503+
1504+ # First, shift the edit as far left as possible.
1505+ commonOffset = self.diff_commonSuffix(equality1, edit)
1506+ if commonOffset:
1507+ commonString = edit[-commonOffset:]
1508+ equality1 = equality1[:-commonOffset]
1509+ edit = commonString + edit[:-commonOffset]
1510+ equality2 = commonString + equality2
1511+
1512+ # Second, step character by character right, looking for the best fit.
1513+ bestEquality1 = equality1
1514+ bestEdit = edit
1515+ bestEquality2 = equality2
1516+ bestScore = (diff_cleanupSemanticScore(equality1, edit) +
1517+ diff_cleanupSemanticScore(edit, equality2))
1518+ while edit and equality2 and edit[0] == equality2[0]:
1519+ equality1 += edit[0]
1520+ edit = edit[1:] + equality2[0]
1521+ equality2 = equality2[1:]
1522+ score = (diff_cleanupSemanticScore(equality1, edit) +
1523+ diff_cleanupSemanticScore(edit, equality2))
1524+ # The >= encourages trailing rather than leading whitespace on edits.
1525+ if score >= bestScore:
1526+ bestScore = score
1527+ bestEquality1 = equality1
1528+ bestEdit = edit
1529+ bestEquality2 = equality2
1530+
1531+ if diffs[pointer - 1][1] != bestEquality1:
1532+ # We have an improvement, save it back to the diff.
1533+ if bestEquality1:
1534+ diffs[pointer - 1] = (diffs[pointer - 1][0], bestEquality1)
1535+ else:
1536+ del diffs[pointer - 1]
1537+ pointer -= 1
1538+ diffs[pointer] = (diffs[pointer][0], bestEdit)
1539+ if bestEquality2:
1540+ diffs[pointer + 1] = (diffs[pointer + 1][0], bestEquality2)
1541+ else:
1542+ del diffs[pointer + 1]
1543+ pointer -= 1
1544+ pointer += 1
1545+
1546+ # Define some regex patterns for matching boundaries.
1547+  BLANKLINEEND = re.compile(r"\n\r?\n$")
1548+  BLANKLINESTART = re.compile(r"^\r?\n\r?\n")
1549+
1550+ def diff_cleanupEfficiency(self, diffs):
1551+ """Reduce the number of edits by eliminating operationally trivial
1552+ equalities.
1553+
1554+ Args:
1555+ diffs: Array of diff tuples.
1556+ """
1557+ changes = False
1558+ equalities = [] # Stack of indices where equalities are found.
1559+ lastequality = None # Always equal to diffs[equalities[-1]][1]
1560+ pointer = 0 # Index of current position.
1561+ pre_ins = False # Is there an insertion operation before the last equality.
1562+ pre_del = False # Is there a deletion operation before the last equality.
1563+ post_ins = False # Is there an insertion operation after the last equality.
1564+ post_del = False # Is there a deletion operation after the last equality.
1565+ while pointer < len(diffs):
1566+ if diffs[pointer][0] == self.DIFF_EQUAL: # Equality found.
1567+ if (len(diffs[pointer][1]) < self.Diff_EditCost and
1568+ (post_ins or post_del)):
1569+ # Candidate found.
1570+ equalities.append(pointer)
1571+ pre_ins = post_ins
1572+ pre_del = post_del
1573+ lastequality = diffs[pointer][1]
1574+ else:
1575+ # Not a candidate, and can never become one.
1576+ equalities = []
1577+ lastequality = None
1578+
1579+ post_ins = post_del = False
1580+ else: # An insertion or deletion.
1581+ if diffs[pointer][0] == self.DIFF_DELETE:
1582+ post_del = True
1583+ else:
1584+ post_ins = True
1585+
1586+ # Five types to be split:
1587+ # <ins>A</ins><del>B</del>XY<ins>C</ins><del>D</del>
1588+ # <ins>A</ins>X<ins>C</ins><del>D</del>
1589+ # <ins>A</ins><del>B</del>X<ins>C</ins>
1590+      # <ins>A</ins>X<ins>C</ins><del>D</del>
1591+ # <ins>A</ins><del>B</del>X<del>C</del>
1592+
1593+ if lastequality and ((pre_ins and pre_del and post_ins and post_del) or
1594+ ((len(lastequality) < self.Diff_EditCost / 2) and
1595+ (pre_ins + pre_del + post_ins + post_del) == 3)):
1596+ # Duplicate record.
1597+ diffs.insert(equalities[-1], (self.DIFF_DELETE, lastequality))
1598+ # Change second copy to insert.
1599+ diffs[equalities[-1] + 1] = (self.DIFF_INSERT,
1600+ diffs[equalities[-1] + 1][1])
1601+ equalities.pop() # Throw away the equality we just deleted.
1602+ lastequality = None
1603+ if pre_ins and pre_del:
1604+ # No changes made which could affect previous entry, keep going.
1605+ post_ins = post_del = True
1606+ equalities = []
1607+ else:
1608+ if len(equalities):
1609+ equalities.pop() # Throw away the previous equality.
1610+ if len(equalities):
1611+ pointer = equalities[-1]
1612+ else:
1613+ pointer = -1
1614+ post_ins = post_del = False
1615+ changes = True
1616+ pointer += 1
1617+
1618+ if changes:
1619+ self.diff_cleanupMerge(diffs)
1620+
1621+ def diff_cleanupMerge(self, diffs):
1622+ """Reorder and merge like edit sections. Merge equalities.
1623+ Any edit section can move as long as it doesn't cross an equality.
1624+
1625+ Args:
1626+ diffs: Array of diff tuples.
1627+ """
1628+ diffs.append((self.DIFF_EQUAL, '')) # Add a dummy entry at the end.
1629+ pointer = 0
1630+ count_delete = 0
1631+ count_insert = 0
1632+ text_delete = ''
1633+ text_insert = ''
1634+ while pointer < len(diffs):
1635+ if diffs[pointer][0] == self.DIFF_INSERT:
1636+ count_insert += 1
1637+ text_insert += diffs[pointer][1]
1638+ pointer += 1
1639+ elif diffs[pointer][0] == self.DIFF_DELETE:
1640+ count_delete += 1
1641+ text_delete += diffs[pointer][1]
1642+ pointer += 1
1643+ elif diffs[pointer][0] == self.DIFF_EQUAL:
1644+ # Upon reaching an equality, check for prior redundancies.
1645+ if count_delete + count_insert > 1:
1646+ if count_delete != 0 and count_insert != 0:
1647+            # Factor out any common prefixes.
1648+ commonlength = self.diff_commonPrefix(text_insert, text_delete)
1649+ if commonlength != 0:
1650+ x = pointer - count_delete - count_insert - 1
1651+ if x >= 0 and diffs[x][0] == self.DIFF_EQUAL:
1652+ diffs[x] = (diffs[x][0], diffs[x][1] +
1653+ text_insert[:commonlength])
1654+ else:
1655+ diffs.insert(0, (self.DIFF_EQUAL, text_insert[:commonlength]))
1656+ pointer += 1
1657+ text_insert = text_insert[commonlength:]
1658+ text_delete = text_delete[commonlength:]
1659+            # Factor out any common suffixes.
1660+ commonlength = self.diff_commonSuffix(text_insert, text_delete)
1661+ if commonlength != 0:
1662+ diffs[pointer] = (diffs[pointer][0], text_insert[-commonlength:] +
1663+ diffs[pointer][1])
1664+ text_insert = text_insert[:-commonlength]
1665+ text_delete = text_delete[:-commonlength]
1666+ # Delete the offending records and add the merged ones.
1667+ if count_delete == 0:
1668+ diffs[pointer - count_insert : pointer] = [
1669+ (self.DIFF_INSERT, text_insert)]
1670+ elif count_insert == 0:
1671+ diffs[pointer - count_delete : pointer] = [
1672+ (self.DIFF_DELETE, text_delete)]
1673+ else:
1674+ diffs[pointer - count_delete - count_insert : pointer] = [
1675+ (self.DIFF_DELETE, text_delete),
1676+ (self.DIFF_INSERT, text_insert)]
1677+ pointer = pointer - count_delete - count_insert + 1
1678+ if count_delete != 0:
1679+ pointer += 1
1680+ if count_insert != 0:
1681+ pointer += 1
1682+ elif pointer != 0 and diffs[pointer - 1][0] == self.DIFF_EQUAL:
1683+ # Merge this equality with the previous one.
1684+ diffs[pointer - 1] = (diffs[pointer - 1][0],
1685+ diffs[pointer - 1][1] + diffs[pointer][1])
1686+ del diffs[pointer]
1687+ else:
1688+ pointer += 1
1689+
1690+ count_insert = 0
1691+ count_delete = 0
1692+ text_delete = ''
1693+ text_insert = ''
1694+
1695+ if diffs[-1][1] == '':
1696+ diffs.pop() # Remove the dummy entry at the end.
1697+
1698+ # Second pass: look for single edits surrounded on both sides by equalities
1699+ # which can be shifted sideways to eliminate an equality.
1700+ # e.g: A<ins>BA</ins>C -> <ins>AB</ins>AC
1701+ changes = False
1702+ pointer = 1
1703+ # Intentionally ignore the first and last element (don't need checking).
1704+ while pointer < len(diffs) - 1:
1705+ if (diffs[pointer - 1][0] == self.DIFF_EQUAL and
1706+ diffs[pointer + 1][0] == self.DIFF_EQUAL):
1707+ # This is a single edit surrounded by equalities.
1708+ if diffs[pointer][1].endswith(diffs[pointer - 1][1]):
1709+ # Shift the edit over the previous equality.
1710+ diffs[pointer] = (diffs[pointer][0],
1711+ diffs[pointer - 1][1] +
1712+ diffs[pointer][1][:-len(diffs[pointer - 1][1])])
1713+ diffs[pointer + 1] = (diffs[pointer + 1][0],
1714+ diffs[pointer - 1][1] + diffs[pointer + 1][1])
1715+ del diffs[pointer - 1]
1716+ changes = True
1717+ elif diffs[pointer][1].startswith(diffs[pointer + 1][1]):
1718+ # Shift the edit over the next equality.
1719+ diffs[pointer - 1] = (diffs[pointer - 1][0],
1720+ diffs[pointer - 1][1] + diffs[pointer + 1][1])
1721+ diffs[pointer] = (diffs[pointer][0],
1722+ diffs[pointer][1][len(diffs[pointer + 1][1]):] +
1723+ diffs[pointer + 1][1])
1724+ del diffs[pointer + 1]
1725+ changes = True
1726+ pointer += 1
1727+
1728+ # If shifts were made, the diff needs reordering and another shift sweep.
1729+ if changes:
1730+ self.diff_cleanupMerge(diffs)
1731+
1732+ def diff_xIndex(self, diffs, loc):
1733+ """loc is a location in text1, compute and return the equivalent location
1734+ in text2. e.g. "The cat" vs "The big cat", 1->1, 5->8
1735+
1736+ Args:
1737+ diffs: Array of diff tuples.
1738+ loc: Location within text1.
1739+
1740+ Returns:
1741+ Location within text2.
1742+ """
1743+ chars1 = 0
1744+ chars2 = 0
1745+ last_chars1 = 0
1746+ last_chars2 = 0
1747+ for x in xrange(len(diffs)):
1748+ (op, text) = diffs[x]
1749+ if op != self.DIFF_INSERT: # Equality or deletion.
1750+ chars1 += len(text)
1751+ if op != self.DIFF_DELETE: # Equality or insertion.
1752+ chars2 += len(text)
1753+ if chars1 > loc: # Overshot the location.
1754+ break
1755+ last_chars1 = chars1
1756+ last_chars2 = chars2
1757+
1758+ if len(diffs) != x and diffs[x][0] == self.DIFF_DELETE:
1759+ # The location was deleted.
1760+ return last_chars2
1761+    # Add the remaining character length.
1762+ return last_chars2 + (loc - last_chars1)
1763+
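The index translation above can be sketched as standalone Python 3 (the names `x_index` and the `DIFF_*` constants are illustrative, not the vendored module's API):

```python
# Standalone Python 3 sketch of the diff_xIndex idea: walk the diff
# tuples, tracking how many characters of text1 and text2 have been
# consumed, to map an offset in text1 to its equivalent in text2.
DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def x_index(diffs, loc):
    chars1 = chars2 = last1 = last2 = 0
    op = DIFF_EQUAL
    for op, text in diffs:
        if op != DIFF_INSERT:   # equality or deletion consumes text1
            chars1 += len(text)
        if op != DIFF_DELETE:   # equality or insertion consumes text2
            chars2 += len(text)
        if chars1 > loc:        # overshot the requested location
            break
        last1, last2 = chars1, chars2
    if chars1 > loc and op == DIFF_DELETE:
        return last2            # the location fell inside deleted text
    return last2 + (loc - last1)

# "The cat" -> "The big cat": 'c' sits at index 4 in text1, index 8 in text2.
diffs = [(DIFF_EQUAL, "The "), (DIFF_INSERT, "big "), (DIFF_EQUAL, "cat")]
print(x_index(diffs, 4))  # -> 8
```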
1764+ def diff_prettyHtml(self, diffs):
1765+ """Convert a diff array into a pretty HTML report.
1766+
1767+ Args:
1768+ diffs: Array of diff tuples.
1769+
1770+ Returns:
1771+ HTML representation.
1772+ """
1773+ html = []
1774+ for (op, data) in diffs:
1775+ text = (data.replace("&", "&amp;").replace("<", "&lt;")
1776+ .replace(">", "&gt;").replace("\n", "&para;<br>"))
1777+ if op == self.DIFF_INSERT:
1778+ html.append("<ins style=\"background:#e6ffe6;\">%s</ins>" % text)
1779+ elif op == self.DIFF_DELETE:
1780+ html.append("<del style=\"background:#ffe6e6;\">%s</del>" % text)
1781+ elif op == self.DIFF_EQUAL:
1782+ html.append("<span>%s</span>" % text)
1783+ return "".join(html)
1784+
1785+ def diff_text1(self, diffs):
1786+ """Compute and return the source text (all equalities and deletions).
1787+
1788+ Args:
1789+ diffs: Array of diff tuples.
1790+
1791+ Returns:
1792+ Source text.
1793+ """
1794+ text = []
1795+ for (op, data) in diffs:
1796+ if op != self.DIFF_INSERT:
1797+ text.append(data)
1798+ return "".join(text)
1799+
1800+ def diff_text2(self, diffs):
1801+ """Compute and return the destination text (all equalities and insertions).
1802+
1803+ Args:
1804+ diffs: Array of diff tuples.
1805+
1806+ Returns:
1807+ Destination text.
1808+ """
1809+ text = []
1810+ for (op, data) in diffs:
1811+ if op != self.DIFF_DELETE:
1812+ text.append(data)
1813+ return "".join(text)
1814+
1815+ def diff_levenshtein(self, diffs):
1816+ """Compute the Levenshtein distance; the number of inserted, deleted or
1817+ substituted characters.
1818+
1819+ Args:
1820+ diffs: Array of diff tuples.
1821+
1822+ Returns:
1823+ Number of changes.
1824+ """
1825+ levenshtein = 0
1826+ insertions = 0
1827+ deletions = 0
1828+ for (op, data) in diffs:
1829+ if op == self.DIFF_INSERT:
1830+ insertions += len(data)
1831+ elif op == self.DIFF_DELETE:
1832+ deletions += len(data)
1833+ elif op == self.DIFF_EQUAL:
1834+ # A deletion and an insertion is one substitution.
1835+ levenshtein += max(insertions, deletions)
1836+ insertions = 0
1837+ deletions = 0
1838+ levenshtein += max(insertions, deletions)
1839+ return levenshtein
1840+
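The Levenshtein computation above can be sketched standalone: within each run of edits between equalities, a paired delete and insert counts as a single substitution, hence `max(insertions, deletions)` per run (names here are illustrative, not the vendored API):

```python
# Standalone Python 3 sketch of diff_levenshtein over diff tuples.
DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def levenshtein(diffs):
    total = insertions = deletions = 0
    for op, data in diffs:
        if op == DIFF_INSERT:
            insertions += len(data)
        elif op == DIFF_DELETE:
            deletions += len(data)
        else:  # an equality closes off the pending run of edits
            total += max(insertions, deletions)
            insertions = deletions = 0
    return total + max(insertions, deletions)

# Classic "kitten" -> "sitting" diff: two substitutions plus one insert.
diffs = [(DIFF_DELETE, "k"), (DIFF_INSERT, "s"), (DIFF_EQUAL, "itt"),
         (DIFF_DELETE, "e"), (DIFF_INSERT, "i"), (DIFF_EQUAL, "n"),
         (DIFF_INSERT, "g")]
print(levenshtein(diffs))  # -> 3
```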
1841+ def diff_toDelta(self, diffs):
1842+ """Crush the diff into an encoded string which describes the operations
1843+ required to transform text1 into text2.
1844+ E.g. =3\t-2\t+ing -> Keep 3 chars, delete 2 chars, insert 'ing'.
1845+ Operations are tab-separated. Inserted text is escaped using %xx notation.
1846+
1847+ Args:
1848+ diffs: Array of diff tuples.
1849+
1850+ Returns:
1851+ Delta text.
1852+ """
1853+ text = []
1854+ for (op, data) in diffs:
1855+ if op == self.DIFF_INSERT:
1856+ # High ascii will raise UnicodeDecodeError. Use Unicode instead.
1857+ data = data.encode("utf-8")
1858+ text.append("+" + urllib.quote(data, "!~*'();/?:@&=+$,# "))
1859+ elif op == self.DIFF_DELETE:
1860+ text.append("-%d" % len(data))
1861+ elif op == self.DIFF_EQUAL:
1862+ text.append("=%d" % len(data))
1863+ return "\t".join(text)
1864+
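The delta encoding described above can be sketched in Python 3: `=N` keeps N characters, `-N` deletes N, and `+text` inserts %xx-escaped text, tab-separated (the `to_delta` name is illustrative; the safe-character set mirrors the one in the vendored code):

```python
# Standalone Python 3 sketch of the diff_toDelta encoding.
import urllib.parse

DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def to_delta(diffs):
    out = []
    for op, data in diffs:
        if op == DIFF_INSERT:
            # Inserted text is escaped, keeping the same characters literal
            # as the vendored implementation does.
            out.append("+" + urllib.parse.quote(data, safe="!~*'();/?:@&=+$,# "))
        elif op == DIFF_DELETE:
            out.append("-%d" % len(data))
        else:
            out.append("=%d" % len(data))
    return "\t".join(out)

# "jumps" -> "jumped": keep 4, delete 1, insert "ed".
diffs = [(DIFF_EQUAL, "jump"), (DIFF_DELETE, "s"), (DIFF_INSERT, "ed")]
print(to_delta(diffs))  # -> "=4\t-1\t+ed"
```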
1865+ def diff_fromDelta(self, text1, delta):
1866+ """Given the original text1, and an encoded string which describes the
1867+ operations required to transform text1 into text2, compute the full diff.
1868+
1869+ Args:
1870+ text1: Source string for the diff.
1871+ delta: Delta text.
1872+
1873+ Returns:
1874+ Array of diff tuples.
1875+
1876+ Raises:
1877+ ValueError: If invalid input.
1878+ """
1879+ if type(delta) == unicode:
1880+ # Deltas should be composed of a subset of ascii chars, Unicode not
1881+ # required. If this encode raises UnicodeEncodeError, delta is invalid.
1882+ delta = delta.encode("ascii")
1883+ diffs = []
1884+ pointer = 0 # Cursor in text1
1885+ tokens = delta.split("\t")
1886+ for token in tokens:
1887+ if token == "":
1888+ # Blank tokens are ok (from a trailing \t).
1889+ continue
1890+ # Each token begins with a one character parameter which specifies the
1891+ # operation of this token (delete, insert, equality).
1892+ param = token[1:]
1893+ if token[0] == "+":
1894+ param = urllib.unquote(param).decode("utf-8")
1895+ diffs.append((self.DIFF_INSERT, param))
1896+ elif token[0] == "-" or token[0] == "=":
1897+ try:
1898+ n = int(param)
1899+ except ValueError:
1900+ raise ValueError("Invalid number in diff_fromDelta: " + param)
1901+ if n < 0:
1902+ raise ValueError("Negative number in diff_fromDelta: " + param)
1903+ text = text1[pointer : pointer + n]
1904+ pointer += n
1905+ if token[0] == "=":
1906+ diffs.append((self.DIFF_EQUAL, text))
1907+ else:
1908+ diffs.append((self.DIFF_DELETE, text))
1909+ else:
1910+ # Anything else is an error.
1911+ raise ValueError("Invalid diff operation in diff_fromDelta: " +
1912+ token[0])
1913+ if pointer != len(text1):
1914+ raise ValueError(
1915+ "Delta length (%d) does not equal source text length (%d)." %
1916+ (pointer, len(text1)))
1917+ return diffs
1918+
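The inverse operation can be sketched the same way: replay the delta string against the source text to reconstruct the full diff tuples, validating lengths as the vendored code does (`from_delta` is an illustrative name):

```python
# Standalone Python 3 sketch of diff_fromDelta.
import urllib.parse

DIFF_DELETE, DIFF_INSERT, DIFF_EQUAL = -1, 1, 0

def from_delta(text1, delta):
    diffs, pointer = [], 0
    for token in delta.split("\t"):
        if not token:
            continue  # blank tokens from a trailing tab are fine
        op, param = token[0], token[1:]
        if op == "+":
            diffs.append((DIFF_INSERT, urllib.parse.unquote(param)))
        elif op in ("-", "="):
            n = int(param)
            if n < 0:
                raise ValueError("Negative number in delta: " + param)
            chunk = text1[pointer:pointer + n]
            pointer += n
            diffs.append((DIFF_EQUAL if op == "=" else DIFF_DELETE, chunk))
        else:
            raise ValueError("Invalid delta operation: " + token)
    if pointer != len(text1):
        raise ValueError("Delta does not span the source text")
    return diffs

print(from_delta("jumps", "=4\t-1\t+ed"))
# -> [(0, 'jump'), (-1, 's'), (1, 'ed')]
```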
1919+ # MATCH FUNCTIONS
1920+
1921+ def match_main(self, text, pattern, loc):
1922+ """Locate the best instance of 'pattern' in 'text' near 'loc'.
1923+
1924+ Args:
1925+ text: The text to search.
1926+ pattern: The pattern to search for.
1927+ loc: The location to search around.
1928+
1929+ Returns:
1930+ Best match index or -1.
1931+ """
1932+ # Check for null inputs.
1933+    if text is None or pattern is None:
1934+ raise ValueError("Null inputs. (match_main)")
1935+
1936+ loc = max(0, min(loc, len(text)))
1937+ if text == pattern:
1938+ # Shortcut (potentially not guaranteed by the algorithm)
1939+ return 0
1940+ elif not text:
1941+ # Nothing to match.
1942+ return -1
1943+ elif text[loc:loc + len(pattern)] == pattern:
1944+ # Perfect match at the perfect spot! (Includes case of null pattern)
1945+ return loc
1946+ else:
1947+ # Do a fuzzy compare.
1948+ match = self.match_bitap(text, pattern, loc)
1949+ return match
1950+
1951+ def match_bitap(self, text, pattern, loc):
1952+ """Locate the best instance of 'pattern' in 'text' near 'loc' using the
1953+ Bitap algorithm.
1954+
1955+ Args:
1956+ text: The text to search.
1957+ pattern: The pattern to search for.
1958+ loc: The location to search around.
1959+
1960+ Returns:
1961+ Best match index or -1.
1962+ """
1963+ # Python doesn't have a maxint limit, so ignore this check.
1964+ #if self.Match_MaxBits != 0 and len(pattern) > self.Match_MaxBits:
1965+ # raise ValueError("Pattern too long for this application.")
1966+
1967+ # Initialise the alphabet.
1968+ s = self.match_alphabet(pattern)
1969+
1970+ def match_bitapScore(e, x):
1971+ """Compute and return the score for a match with e errors and x location.
1972+ Accesses loc and pattern through being a closure.
1973+
1974+ Args:
1975+ e: Number of errors in match.
1976+ x: Location of match.
1977+
1978+ Returns:
1979+ Overall score for match (0.0 = good, 1.0 = bad).
1980+ """
1981+ accuracy = float(e) / len(pattern)
1982+ proximity = abs(loc - x)
1983+ if not self.Match_Distance:
1984+ # Dodge divide by zero error.
1985+ return proximity and 1.0 or accuracy
1986+ return accuracy + (proximity / float(self.Match_Distance))
1987+
1988+ # Highest score beyond which we give up.
1989+ score_threshold = self.Match_Threshold
1990+ # Is there a nearby exact match? (speedup)
1991+ best_loc = text.find(pattern, loc)
1992+ if best_loc != -1:
1993+ score_threshold = min(match_bitapScore(0, best_loc), score_threshold)
1994+ # What about in the other direction? (speedup)
1995+ best_loc = text.rfind(pattern, loc + len(pattern))
1996+ if best_loc != -1:
1997+ score_threshold = min(match_bitapScore(0, best_loc), score_threshold)
1998+
1999+ # Initialise the bit arrays.
2000+ matchmask = 1 << (len(pattern) - 1)
2001+ best_loc = -1
2002+
2003+ bin_max = len(pattern) + len(text)
2004+ # Empty initialization added to appease pychecker.
2005+ last_rd = None
2006+ for d in xrange(len(pattern)):
2007+ # Scan for the best match each iteration allows for one more error.
2008+ # Run a binary search to determine how far from 'loc' we can stray at
2009+ # this error level.
2010+ bin_min = 0
2011+ bin_mid = bin_max
2012+ while bin_min < bin_mid:
2013+ if match_bitapScore(d, loc + bin_mid) <= score_threshold:
2014+ bin_min = bin_mid
2015+ else:
2016+ bin_max = bin_mid
2017+ bin_mid = (bin_max - bin_min) // 2 + bin_min
2018+
2019+ # Use the result from this iteration as the maximum for the next.
2020+ bin_max = bin_mid
2021+ start = max(1, loc - bin_mid + 1)
2022+ finish = min(loc + bin_mid, len(text)) + len(pattern)
2023+
2024+ rd = [0] * (finish + 2)
2025+ rd[finish + 1] = (1 << d) - 1
2026+ for j in xrange(finish, start - 1, -1):
2027+ if len(text) <= j - 1:
2028+ # Out of range.
2029+ charMatch = 0
2030+ else:
2031+ charMatch = s.get(text[j - 1], 0)
2032+ if d == 0: # First pass: exact match.
2033+ rd[j] = ((rd[j + 1] << 1) | 1) & charMatch
2034+ else: # Subsequent passes: fuzzy match.
2035+ rd[j] = (((rd[j + 1] << 1) | 1) & charMatch) | (
2036+ ((last_rd[j + 1] | last_rd[j]) << 1) | 1) | last_rd[j + 1]
2037+ if rd[j] & matchmask:
2038+ score = match_bitapScore(d, j - 1)
2039+ # This match will almost certainly be better than any existing match.
2040+ # But check anyway.
2041+ if score <= score_threshold:
2042+ # Told you so.
2043+ score_threshold = score
2044+ best_loc = j - 1
2045+ if best_loc > loc:
2046+ # When passing loc, don't exceed our current distance from loc.
2047+ start = max(1, 2 * loc - best_loc)
2048+ else:
2049+ # Already passed loc, downhill from here on in.
2050+ break
2051+ # No hope for a (better) match at greater error levels.
2052+ if match_bitapScore(d + 1, loc) > score_threshold:
2053+ break
2054+ last_rd = rd
2055+ return best_loc
2056+
2057+ def match_alphabet(self, pattern):
2058+ """Initialise the alphabet for the Bitap algorithm.
2059+
2060+ Args:
2061+ pattern: The text to encode.
2062+
2063+ Returns:
2064+ Hash of character locations.
2065+ """
2066+ s = {}
2067+ for char in pattern:
2068+ s[char] = 0
2069+ for i in xrange(len(pattern)):
2070+ s[pattern[i]] |= 1 << (len(pattern) - i - 1)
2071+ return s
2072+
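The Bitap alphabet construction above can be sketched standalone: each character maps to a bitmask with bit i (counting from the least significant end) set wherever the character occurs at position `len(pattern) - 1 - i`:

```python
# Standalone Python 3 sketch of match_alphabet for the Bitap algorithm.
def match_alphabet(pattern):
    s = dict.fromkeys(pattern, 0)
    for i, char in enumerate(pattern):
        # Highest bit corresponds to the first character of the pattern.
        s[char] |= 1 << (len(pattern) - i - 1)
    return s

print(match_alphabet("abc"))  # -> {'a': 4, 'b': 2, 'c': 1}
```

Repeated characters OR their position bits together, e.g. `"aba"` gives `'a'` the mask `0b101`.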
2073+ # PATCH FUNCTIONS
2074+
2075+ def patch_addContext(self, patch, text):
2076+ """Increase the context until it is unique,
2077+ but don't let the pattern expand beyond Match_MaxBits.
2078+
2079+ Args:
2080+ patch: The patch to grow.
2081+ text: Source text.
2082+ """
2083+ if len(text) == 0:
2084+ return
2085+ pattern = text[patch.start2 : patch.start2 + patch.length1]
2086+ padding = 0
2087+
2088+ # Look for the first and last matches of pattern in text. If two different
2089+ # matches are found, increase the pattern length.
2090+ while (text.find(pattern) != text.rfind(pattern) and (self.Match_MaxBits ==
2091+ 0 or len(pattern) < self.Match_MaxBits - self.Patch_Margin -
2092+ self.Patch_Margin)):
2093+ padding += self.Patch_Margin
2094+ pattern = text[max(0, patch.start2 - padding) :
2095+ patch.start2 + patch.length1 + padding]
2096+ # Add one chunk for good luck.
2097+ padding += self.Patch_Margin
2098+
2099+ # Add the prefix.
2100+ prefix = text[max(0, patch.start2 - padding) : patch.start2]
2101+ if prefix:
2102+ patch.diffs[:0] = [(self.DIFF_EQUAL, prefix)]
2103+ # Add the suffix.
2104+ suffix = text[patch.start2 + patch.length1 :
2105+ patch.start2 + patch.length1 + padding]
2106+ if suffix:
2107+ patch.diffs.append((self.DIFF_EQUAL, suffix))
2108+
2109+ # Roll back the start points.
2110+ patch.start1 -= len(prefix)
2111+ patch.start2 -= len(prefix)
2112+ # Extend lengths.
2113+ patch.length1 += len(prefix) + len(suffix)
2114+ patch.length2 += len(prefix) + len(suffix)
2115+
2116+ def patch_make(self, a, b=None, c=None):
2117+ """Compute a list of patches to turn text1 into text2.
2118+ Use diffs if provided, otherwise compute it ourselves.
2119+ There are four ways to call this function, depending on what data is
2120+ available to the caller:
2121+ Method 1:
2122+ a = text1, b = text2
2123+ Method 2:
2124+ a = diffs
2125+ Method 3 (optimal):
2126+ a = text1, b = diffs
2127+ Method 4 (deprecated, use method 3):
2128+ a = text1, b = text2, c = diffs
2129+
2130+ Args:
2131+ a: text1 (methods 1,3,4) or Array of diff tuples for text1 to
2132+ text2 (method 2).
2133+ b: text2 (methods 1,4) or Array of diff tuples for text1 to
2134+ text2 (method 3) or undefined (method 2).
2135+ c: Array of diff tuples for text1 to text2 (method 4) or
2136+ undefined (methods 1,2,3).
2137+
2138+ Returns:
2139+ Array of Patch objects.
2140+ """
2141+ text1 = None
2142+ diffs = None
2143+ # Note that texts may arrive as 'str' or 'unicode'.
2144+ if isinstance(a, basestring) and isinstance(b, basestring) and c is None:
2145+ # Method 1: text1, text2
2146+ # Compute diffs from text1 and text2.
2147+ text1 = a
2148+ diffs = self.diff_main(text1, b, True)
2149+ if len(diffs) > 2:
2150+ self.diff_cleanupSemantic(diffs)
2151+ self.diff_cleanupEfficiency(diffs)
2152+ elif isinstance(a, list) and b is None and c is None:
2153+ # Method 2: diffs
2154+ # Compute text1 from diffs.
2155+ diffs = a
2156+ text1 = self.diff_text1(diffs)
2157+ elif isinstance(a, basestring) and isinstance(b, list) and c is None:
2158+ # Method 3: text1, diffs
2159+ text1 = a
2160+ diffs = b
2161+ elif (isinstance(a, basestring) and isinstance(b, basestring) and
2162+ isinstance(c, list)):
2163+ # Method 4: text1, text2, diffs
2164+ # text2 is not used.
2165+ text1 = a
2166+ diffs = c
2167+ else:
2168+ raise ValueError("Unknown call format to patch_make.")
2169+
2170+ if not diffs:
2171+ return [] # Get rid of the None case.
2172+ patches = []
2173+ patch = patch_obj()
2174+ char_count1 = 0 # Number of characters into the text1 string.
2175+ char_count2 = 0 # Number of characters into the text2 string.
2176+ prepatch_text = text1 # Recreate the patches to determine context info.
2177+ postpatch_text = text1
2178+ for x in xrange(len(diffs)):
2179+ (diff_type, diff_text) = diffs[x]
2180+ if len(patch.diffs) == 0 and diff_type != self.DIFF_EQUAL:
2181+ # A new patch starts here.
2182+ patch.start1 = char_count1
2183+ patch.start2 = char_count2
2184+ if diff_type == self.DIFF_INSERT:
2185+ # Insertion
2186+ patch.diffs.append(diffs[x])
2187+ patch.length2 += len(diff_text)
2188+ postpatch_text = (postpatch_text[:char_count2] + diff_text +
2189+ postpatch_text[char_count2:])
2190+ elif diff_type == self.DIFF_DELETE:
2191+ # Deletion.
2192+ patch.length1 += len(diff_text)
2193+ patch.diffs.append(diffs[x])
2194+ postpatch_text = (postpatch_text[:char_count2] +
2195+ postpatch_text[char_count2 + len(diff_text):])
2196+ elif (diff_type == self.DIFF_EQUAL and
2197+ len(diff_text) <= 2 * self.Patch_Margin and
2198+ len(patch.diffs) != 0 and len(diffs) != x + 1):
2199+ # Small equality inside a patch.
2200+ patch.diffs.append(diffs[x])
2201+ patch.length1 += len(diff_text)
2202+ patch.length2 += len(diff_text)
2203+
2204+ if (diff_type == self.DIFF_EQUAL and
2205+ len(diff_text) >= 2 * self.Patch_Margin):
2206+ # Time for a new patch.
2207+ if len(patch.diffs) != 0:
2208+ self.patch_addContext(patch, prepatch_text)
2209+ patches.append(patch)
2210+ patch = patch_obj()
2211+ # Unlike Unidiff, our patch lists have a rolling context.
2212+ # http://code.google.com/p/google-diff-match-patch/wiki/Unidiff
2213+ # Update prepatch text & pos to reflect the application of the
2214+ # just completed patch.
2215+ prepatch_text = postpatch_text
2216+ char_count1 = char_count2
2217+
2218+ # Update the current character count.
2219+ if diff_type != self.DIFF_INSERT:
2220+ char_count1 += len(diff_text)
2221+ if diff_type != self.DIFF_DELETE:
2222+ char_count2 += len(diff_text)
2223+
2224+ # Pick up the leftover patch if not empty.
2225+ if len(patch.diffs) != 0:
2226+ self.patch_addContext(patch, prepatch_text)
2227+ patches.append(patch)
2228+ return patches
2229+
2230+ def patch_deepCopy(self, patches):
2231+ """Given an array of patches, return another array that is identical.
2232+
2233+ Args:
2234+ patches: Array of Patch objects.
2235+
2236+ Returns:
2237+ Array of Patch objects.
2238+ """
2239+ patchesCopy = []
2240+ for patch in patches:
2241+ patchCopy = patch_obj()
2242+ # No need to deep copy the tuples since they are immutable.
2243+ patchCopy.diffs = patch.diffs[:]
2244+ patchCopy.start1 = patch.start1
2245+ patchCopy.start2 = patch.start2
2246+ patchCopy.length1 = patch.length1
2247+ patchCopy.length2 = patch.length2
2248+ patchesCopy.append(patchCopy)
2249+ return patchesCopy
2250+
2251+ def patch_apply(self, patches, text):
2252+ """Merge a set of patches onto the text. Return a patched text, as well
2253+ as a list of true/false values indicating which patches were applied.
2254+
2255+ Args:
2256+ patches: Array of Patch objects.
2257+ text: Old text.
2258+
2259+ Returns:
2260+ Two element Array, containing the new text and an array of boolean values.
2261+ """
2262+ if not patches:
2263+ return (text, [])
2264+
2265+ # Deep copy the patches so that no changes are made to originals.
2266+ patches = self.patch_deepCopy(patches)
2267+
2268+ nullPadding = self.patch_addPadding(patches)
2269+ text = nullPadding + text + nullPadding
2270+ self.patch_splitMax(patches)
2271+
2272+ # delta keeps track of the offset between the expected and actual location
2273+ # of the previous patch. If there are patches expected at positions 10 and
2274+ # 20, but the first patch was found at 12, delta is 2 and the second patch
2275+ # has an effective expected position of 22.
2276+ delta = 0
2277+ results = []
2278+ for patch in patches:
2279+ expected_loc = patch.start2 + delta
2280+ text1 = self.diff_text1(patch.diffs)
2281+ end_loc = -1
2282+ if len(text1) > self.Match_MaxBits:
2283+ # patch_splitMax will only provide an oversized pattern in the case of
2284+ # a monster delete.
2285+ start_loc = self.match_main(text, text1[:self.Match_MaxBits],
2286+ expected_loc)
2287+ if start_loc != -1:
2288+ end_loc = self.match_main(text, text1[-self.Match_MaxBits:],
2289+ expected_loc + len(text1) - self.Match_MaxBits)
2290+ if end_loc == -1 or start_loc >= end_loc:
2291+ # Can't find valid trailing context. Drop this patch.
2292+ start_loc = -1
2293+ else:
2294+ start_loc = self.match_main(text, text1, expected_loc)
2295+ if start_loc == -1:
2296+ # No match found. :(
2297+ results.append(False)
2298+ # Subtract the delta for this failed patch from subsequent patches.
2299+ delta -= patch.length2 - patch.length1
2300+ else:
2301+ # Found a match. :)
2302+ results.append(True)
2303+ delta = start_loc - expected_loc
2304+ if end_loc == -1:
2305+ text2 = text[start_loc : start_loc + len(text1)]
2306+ else:
2307+ text2 = text[start_loc : end_loc + self.Match_MaxBits]
2308+ if text1 == text2:
2309+ # Perfect match, just shove the replacement text in.
2310+ text = (text[:start_loc] + self.diff_text2(patch.diffs) +
2311+ text[start_loc + len(text1):])
2312+ else:
2313+ # Imperfect match.
2314+ # Run a diff to get a framework of equivalent indices.
2315+ diffs = self.diff_main(text1, text2, False)
2316+ if (len(text1) > self.Match_MaxBits and
2317+ self.diff_levenshtein(diffs) / float(len(text1)) >
2318+ self.Patch_DeleteThreshold):
2319+ # The end points match, but the content is unacceptably bad.
2320+ results[-1] = False
2321+ else:
2322+ self.diff_cleanupSemanticLossless(diffs)
2323+ index1 = 0
2324+ for (op, data) in patch.diffs:
2325+ if op != self.DIFF_EQUAL:
2326+ index2 = self.diff_xIndex(diffs, index1)
2327+ if op == self.DIFF_INSERT: # Insertion
2328+ text = text[:start_loc + index2] + data + text[start_loc +
2329+ index2:]
2330+ elif op == self.DIFF_DELETE: # Deletion
2331+ text = text[:start_loc + index2] + text[start_loc +
2332+ self.diff_xIndex(diffs, index1 + len(data)):]
2333+ if op != self.DIFF_DELETE:
2334+ index1 += len(data)
2335+ # Strip the padding off.
2336+ text = text[len(nullPadding):-len(nullPadding)]
2337+ return (text, results)
2338+
2339+ def patch_addPadding(self, patches):
2340+ """Add some padding on text start and end so that edges can match
2341+ something. Intended to be called only from within patch_apply.
2342+
2343+ Args:
2344+ patches: Array of Patch objects.
2345+
2346+ Returns:
2347+ The padding string added to each side.
2348+ """
2349+ paddingLength = self.Patch_Margin
2350+ nullPadding = ""
2351+ for x in xrange(1, paddingLength + 1):
2352+ nullPadding += chr(x)
2353+
2354+ # Bump all the patches forward.
2355+ for patch in patches:
2356+ patch.start1 += paddingLength
2357+ patch.start2 += paddingLength
2358+
2359+ # Add some padding on start of first diff.
2360+ patch = patches[0]
2361+ diffs = patch.diffs
2362+ if not diffs or diffs[0][0] != self.DIFF_EQUAL:
2363+ # Add nullPadding equality.
2364+ diffs.insert(0, (self.DIFF_EQUAL, nullPadding))
2365+ patch.start1 -= paddingLength # Should be 0.
2366+ patch.start2 -= paddingLength # Should be 0.
2367+ patch.length1 += paddingLength
2368+ patch.length2 += paddingLength
2369+ elif paddingLength > len(diffs[0][1]):
2370+ # Grow first equality.
2371+ extraLength = paddingLength - len(diffs[0][1])
2372+ newText = nullPadding[len(diffs[0][1]):] + diffs[0][1]
2373+ diffs[0] = (diffs[0][0], newText)
2374+ patch.start1 -= extraLength
2375+ patch.start2 -= extraLength
2376+ patch.length1 += extraLength
2377+ patch.length2 += extraLength
2378+
2379+ # Add some padding on end of last diff.
2380+ patch = patches[-1]
2381+ diffs = patch.diffs
2382+ if not diffs or diffs[-1][0] != self.DIFF_EQUAL:
2383+ # Add nullPadding equality.
2384+ diffs.append((self.DIFF_EQUAL, nullPadding))
2385+ patch.length1 += paddingLength
2386+ patch.length2 += paddingLength
2387+ elif paddingLength > len(diffs[-1][1]):
2388+ # Grow last equality.
2389+ extraLength = paddingLength - len(diffs[-1][1])
2390+ newText = diffs[-1][1] + nullPadding[:extraLength]
2391+ diffs[-1] = (diffs[-1][0], newText)
2392+ patch.length1 += extraLength
2393+ patch.length2 += extraLength
2394+
2395+ return nullPadding
2396+
2397+ def patch_splitMax(self, patches):
2398+ """Look through the patches and break up any which are longer than the
2399+ maximum limit of the match algorithm.
2400+ Intended to be called only from within patch_apply.
2401+
2402+ Args:
2403+ patches: Array of Patch objects.
2404+ """
2405+ patch_size = self.Match_MaxBits
2406+ if patch_size == 0:
2407+ # Python has the option of not splitting strings due to its ability
2408+ # to handle integers of arbitrary precision.
2409+ return
2410+ for x in xrange(len(patches)):
2411+ if patches[x].length1 <= patch_size:
2412+ continue
2413+ bigpatch = patches[x]
2414+ # Remove the big old patch.
2415+ del patches[x]
2416+ x -= 1
2417+ start1 = bigpatch.start1
2418+ start2 = bigpatch.start2
2419+ precontext = ''
2420+ while len(bigpatch.diffs) != 0:
2421+ # Create one of several smaller patches.
2422+ patch = patch_obj()
2423+ empty = True
2424+ patch.start1 = start1 - len(precontext)
2425+ patch.start2 = start2 - len(precontext)
2426+ if precontext:
2427+ patch.length1 = patch.length2 = len(precontext)
2428+ patch.diffs.append((self.DIFF_EQUAL, precontext))
2429+
2430+ while (len(bigpatch.diffs) != 0 and
2431+ patch.length1 < patch_size - self.Patch_Margin):
2432+ (diff_type, diff_text) = bigpatch.diffs[0]
2433+ if diff_type == self.DIFF_INSERT:
2434+ # Insertions are harmless.
2435+ patch.length2 += len(diff_text)
2436+ start2 += len(diff_text)
2437+ patch.diffs.append(bigpatch.diffs.pop(0))
2438+ empty = False
2439+ elif (diff_type == self.DIFF_DELETE and len(patch.diffs) == 1 and
2440+ patch.diffs[0][0] == self.DIFF_EQUAL and
2441+ len(diff_text) > 2 * patch_size):
2442+ # This is a large deletion. Let it pass in one chunk.
2443+ patch.length1 += len(diff_text)
2444+ start1 += len(diff_text)
2445+ empty = False
2446+ patch.diffs.append((diff_type, diff_text))
2447+ del bigpatch.diffs[0]
2448+ else:
2449+ # Deletion or equality. Only take as much as we can stomach.
2450+ diff_text = diff_text[:patch_size - patch.length1 -
2451+ self.Patch_Margin]
2452+ patch.length1 += len(diff_text)
2453+ start1 += len(diff_text)
2454+ if diff_type == self.DIFF_EQUAL:
2455+ patch.length2 += len(diff_text)
2456+ start2 += len(diff_text)
2457+ else:
2458+ empty = False
2459+
2460+ patch.diffs.append((diff_type, diff_text))
2461+ if diff_text == bigpatch.diffs[0][1]:
2462+ del bigpatch.diffs[0]
2463+ else:
2464+ bigpatch.diffs[0] = (bigpatch.diffs[0][0],
2465+ bigpatch.diffs[0][1][len(diff_text):])
2466+
2467+ # Compute the head context for the next patch.
2468+ precontext = self.diff_text2(patch.diffs)
2469+ precontext = precontext[-self.Patch_Margin:]
2470+ # Append the end context for this patch.
2471+ postcontext = self.diff_text1(bigpatch.diffs)[:self.Patch_Margin]
2472+ if postcontext:
2473+ patch.length1 += len(postcontext)
2474+ patch.length2 += len(postcontext)
2475+ if len(patch.diffs) != 0 and patch.diffs[-1][0] == self.DIFF_EQUAL:
2476+ patch.diffs[-1] = (self.DIFF_EQUAL, patch.diffs[-1][1] +
2477+ postcontext)
2478+ else:
2479+ patch.diffs.append((self.DIFF_EQUAL, postcontext))
2480+
2481+ if not empty:
2482+ x += 1
2483+ patches.insert(x, patch)
2484+
2485+ def patch_toText(self, patches):
2486+ """Take a list of patches and return a textual representation.
2487+
2488+ Args:
2489+ patches: Array of Patch objects.
2490+
2491+ Returns:
2492+ Text representation of patches.
2493+ """
2494+ text = []
2495+ for patch in patches:
2496+ text.append(str(patch))
2497+ return "".join(text)
2498+
2499+ def patch_fromText(self, textline):
2500+ """Parse a textual representation of patches and return a list of patch
2501+ objects.
2502+
2503+ Args:
2504+ textline: Text representation of patches.
2505+
2506+ Returns:
2507+ Array of Patch objects.
2508+
2509+ Raises:
2510+ ValueError: If invalid input.
2511+ """
2512+ if type(textline) == unicode:
2513+ # Patches should be composed of a subset of ascii chars, Unicode not
2514+ # required. If this encode raises UnicodeEncodeError, patch is invalid.
2515+ textline = textline.encode("ascii")
2516+ patches = []
2517+ if not textline:
2518+ return patches
2519+ text = textline.split('\n')
2520+ while len(text) != 0:
2521+ m = re.match("^@@ -(\d+),?(\d*) \+(\d+),?(\d*) @@$", text[0])
2522+ if not m:
2523+ raise ValueError("Invalid patch string: " + text[0])
2524+ patch = patch_obj()
2525+ patches.append(patch)
2526+ patch.start1 = int(m.group(1))
2527+ if m.group(2) == '':
2528+ patch.start1 -= 1
2529+ patch.length1 = 1
2530+ elif m.group(2) == '0':
2531+ patch.length1 = 0
2532+ else:
2533+ patch.start1 -= 1
2534+ patch.length1 = int(m.group(2))
2535+
2536+ patch.start2 = int(m.group(3))
2537+ if m.group(4) == '':
2538+ patch.start2 -= 1
2539+ patch.length2 = 1
2540+ elif m.group(4) == '0':
2541+ patch.length2 = 0
2542+ else:
2543+ patch.start2 -= 1
2544+ patch.length2 = int(m.group(4))
2545+
2546+ del text[0]
2547+
2548+ while len(text) != 0:
2549+ if text[0]:
2550+ sign = text[0][0]
2551+ else:
2552+ sign = ''
2553+ line = urllib.unquote(text[0][1:])
2554+ line = line.decode("utf-8")
2555+ if sign == '+':
2556+ # Insertion.
2557+ patch.diffs.append((self.DIFF_INSERT, line))
2558+ elif sign == '-':
2559+ # Deletion.
2560+ patch.diffs.append((self.DIFF_DELETE, line))
2561+ elif sign == ' ':
2562+ # Minor equality.
2563+ patch.diffs.append((self.DIFF_EQUAL, line))
2564+ elif sign == '@':
2565+ # Start of next patch.
2566+ break
2567+ elif sign == '':
2568+ # Blank line? Whatever.
2569+ pass
2570+ else:
2571+ # WTF?
2572+ raise ValueError("Invalid patch mode: '%s'\n%s" % (sign, line))
2573+ del text[0]
2574+ return patches
2575+
2576+
2577+class patch_obj:
2578+ """Class representing one patch operation.
2579+ """
2580+
2581+ def __init__(self):
2582+ """Initializes with an empty list of diffs.
2583+ """
2584+ self.diffs = []
2585+ self.start1 = None
2586+ self.start2 = None
2587+ self.length1 = 0
2588+ self.length2 = 0
2589+
2590+ def __str__(self):
2591+    """Emulate GNU diff's format.
2592+    Header: @@ -382,8 +481,9 @@
2593+    Indices are printed as 1-based, not 0-based.
2594+
2595+ Returns:
2596+ The GNU diff string.
2597+ """
2598+ if self.length1 == 0:
2599+ coords1 = str(self.start1) + ",0"
2600+ elif self.length1 == 1:
2601+ coords1 = str(self.start1 + 1)
2602+ else:
2603+ coords1 = str(self.start1 + 1) + "," + str(self.length1)
2604+ if self.length2 == 0:
2605+ coords2 = str(self.start2) + ",0"
2606+ elif self.length2 == 1:
2607+ coords2 = str(self.start2 + 1)
2608+ else:
2609+ coords2 = str(self.start2 + 1) + "," + str(self.length2)
2610+ text = ["@@ -", coords1, " +", coords2, " @@\n"]
2611+ # Escape the body of the patch with %xx notation.
2612+ for (op, data) in self.diffs:
2613+ if op == diff_match_patch.DIFF_INSERT:
2614+ text.append("+")
2615+ elif op == diff_match_patch.DIFF_DELETE:
2616+ text.append("-")
2617+ elif op == diff_match_patch.DIFF_EQUAL:
2618+ text.append(" ")
2619+ # High ascii will raise UnicodeDecodeError. Use Unicode instead.
2620+ data = data.encode("utf-8")
2621+ text.append(urllib.quote(data, "!~*'();/?:@&=+$,# ") + "\n")
2622+ return "".join(text)
2623
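A note on the vendored code above: the `@@ -a,b +c,d @@` header emitted by `patch_obj.__str__` and parsed by `patch_fromText` follows GNU diff conventions. The parsing rules can be sketched in isolation; `parse_header` here is illustrative only and not part of the diff_match_patch API:

```python
import re

# Standalone sketch of the "@@ -start1,len1 +start2,len2 @@" header
# parsing done by patch_fromText above. parse_header is illustrative
# only; it is not part of the diff_match_patch API.
HEADER = re.compile(r"^@@ -(\d+),?(\d*) \+(\d+),?(\d*) @@$")


def parse_header(line):
    m = HEADER.match(line)
    if not m:
        raise ValueError("Invalid patch string: " + line)

    def coords(start, length):
        start = int(start)
        if length == '':
            # Omitted length means a single line; starts are 1-based.
            return start - 1, 1
        if length == '0':
            # Zero-length range (pure insertion point); start stays as-is.
            return start, 0
        return start - 1, int(length)

    start1, length1 = coords(m.group(1), m.group(2))
    start2, length2 = coords(m.group(3), m.group(4))
    return start1, length1, start2, length2
```

As in GNU diff, an omitted length means 1 and starts are converted from the printed 1-based form back to 0-based, except for zero-length ranges where the start is left as-is.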
2624=== added file 'charmtools/compose/fetchers.py'
2625--- charmtools/compose/fetchers.py 1970-01-01 00:00:00 +0000
2626+++ charmtools/compose/fetchers.py 2015-08-31 19:32:56 +0000
2627@@ -0,0 +1,117 @@
2628+import re
2629+import tempfile
2630+import os
2631+
2632+import requests
2633+from bundletester import fetchers
2634+from bundletester.fetchers import (git, # noqa
2635+ Fetcher,
2636+ get_fetcher,
2637+ FetchError)
2638+
2639+from path import path
2640+
2641+
2642+class RepoFetcher(fetchers.LocalFetcher):
2643+ @classmethod
2644+ def can_fetch(cls, url):
2645+ search_path = [os.getcwd(), os.environ.get("JUJU_REPOSITORY", ".")]
2646+ cp = os.environ.get("COMPOSER_PATH")
2647+ if cp:
2648+ search_path.extend(cp.split(":"))
2649+ for part in search_path:
2650+ p = (path(part) / url).normpath()
2651+ if p.exists():
2652+ return dict(path=p)
2653+ return {}
2654+
2655+fetchers.FETCHERS.insert(0, RepoFetcher)
2656+
2657+
2658+class InterfaceFetcher(fetchers.LocalFetcher):
2659+ # XXX: When hosted somewhere, fix this
2660+ INTERFACE_DOMAIN = "http://interfaces.juju.solutions"
2661+ NAMESPACE = "interface"
2662+ ENVIRON = "INTERFACE_PATH"
2663+ OPTIONAL_PREFIX = "juju-relation-"
2664+ ENDPOINT = "/api/v1/interface"
2665+
2666+ @classmethod
2667+ def can_fetch(cls, url):
2668+ # Search local path first, then
2669+ # the interface webservice
2670+ if url.startswith("{}:".format(cls.NAMESPACE)):
2671+ url = url[len(cls.NAMESPACE) + 1:]
2672+ search_path = [os.environ.get("JUJU_REPOSITORY", ".")]
2673+ cp = os.environ.get(cls.ENVIRON)
2674+ if cp:
2675+ search_path.extend(cp.split(os.pathsep))
2676+ for part in search_path:
2677+ p = (path(part) / url).normpath()
2678+ if p.exists():
2679+ return dict(path=p)
2680+
2681+ choices = [url]
2682+ if url.startswith(cls.OPTIONAL_PREFIX):
2683+ choices.append(url[len(cls.OPTIONAL_PREFIX):])
2684+ for choice in choices:
2685+ uri = "%s%s/%s/" % (
2686+ cls.INTERFACE_DOMAIN, cls.ENDPOINT, choice)
2687+ try:
2688+ result = requests.get(uri)
2689+            except requests.RequestException:
2690+ result = None
2691+ if result and result.ok:
2692+ result = result.json()
2693+ if "repo" in result:
2694+ return result
2695+ return {}
2696+
2697+ def fetch(self, dir_):
2698+ if hasattr(self, "path"):
2699+ return super(InterfaceFetcher, self).fetch(dir_)
2700+ elif hasattr(self, "repo"):
2701+ # use the github fetcher for now
2702+ u = self.url[10:]
2703+ f = get_fetcher(self.repo)
2704+ if hasattr(f, "repo"):
2705+ basename = path(f.repo).name.splitext()[0]
2706+ else:
2707+ basename = u
2708+ res = f.fetch(dir_)
2709+ target = dir_ / basename
2710+ if res != target:
2711+ target.rmtree_p()
2712+ path(res).rename(target)
2713+ return target
2714+
2715+
2716+fetchers.FETCHERS.insert(0, InterfaceFetcher)
2717+
2718+
2719+class LayerFetcher(InterfaceFetcher):
2720+ INTERFACE_DOMAIN = "http://interfaces.juju.solutions"
2721+ NAMESPACE = "layer"
2722+ ENVIRON = "COMPOSER_PATH"
2723+ OPTIONAL_PREFIX = "juju-layer-"
2724+ ENDPOINT = "/api/v1/layer"
2725+
2726+fetchers.FETCHERS.insert(0, LayerFetcher)
2727+
2728+
2729+class LaunchpadGitFetcher(Fetcher):
2730+ # XXX: this should be upstreamed
2731+ MATCH = re.compile(r"""
2732+ ^(git:|https)?://git.launchpad.net/
2733+ (?P<repo>[^@]*)(@(?P<revision>.*))?$
2734+ """, re.VERBOSE)
2735+
2736+ def fetch(self, dir_):
2737+ dir_ = tempfile.mkdtemp(dir=dir_)
2738+ url = 'https://git.launchpad.net/' + self.repo
2739+ git('clone {} {}'.format(url, dir_))
2740+ if self.revision:
2741+ git('checkout {}'.format(self.revision), cwd=dir_)
2742+ return dir_
2743+
2744+fetchers.FETCHERS.append(LaunchpadGitFetcher)
2745
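The local-path resolution used by `RepoFetcher.can_fetch` above probes the cwd, then `JUJU_REPOSITORY`, then each `COMPOSER_PATH` entry, returning the first existing match. It can be sketched without the `path.py` and `bundletester` dependencies; `resolve_local` is an illustrative stand-in, not part of the real fetchers module:

```python
import os
import os.path


# Sketch of the search-path resolution used by RepoFetcher.can_fetch
# above, without the path.py dependency. resolve_local is illustrative
# only: the cwd, JUJU_REPOSITORY, and the COMPOSER_PATH entries are
# probed in order and the first existing match wins.
def resolve_local(url, environ=None, cwd="."):
    environ = os.environ if environ is None else environ
    search_path = [cwd, environ.get("JUJU_REPOSITORY", ".")]
    cp = environ.get("COMPOSER_PATH")
    if cp:
        search_path.extend(cp.split(os.pathsep))
    for part in search_path:
        candidate = os.path.normpath(os.path.join(part, url))
        if os.path.exists(candidate):
            return {"path": candidate}
    return {}
```

Note that the diff itself splits `COMPOSER_PATH` on a literal `":"` in `RepoFetcher` but on `os.pathsep` in `InterfaceFetcher`; the sketch uses `os.pathsep`, which is identical on POSIX.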
2746=== added file 'charmtools/compose/inspector.py'
2747--- charmtools/compose/inspector.py 1970-01-01 00:00:00 +0000
2748+++ charmtools/compose/inspector.py 2015-08-31 19:32:56 +0000
2749@@ -0,0 +1,101 @@
2750+# coding=utf-8
2751+import json
2752+from ruamel import yaml
2753+from charmtools.compose import config
2754+from charmtools import utils
2755+
2756+theme = {
2757+ 0: "normal",
2758+ 1: "green",
2759+ 2: "cyan",
2760+ 3: "magenta",
2761+ 4: "yellow",
2762+ 5: "red",
2763+}
2764+
2765+
2766+def scan_for(col, cur, depth):
2767+ for e, (rel, d) in col[cur:]:
2768+ if d and d == depth:
2769+ return True
2770+ return False
2771+
2772+
2773+def get_prefix(walk, cur, depth, next_depth):
2774+ guide = []
2775+ for i in range(depth):
2776+ # scan forward in walk from i seeing if a subsequent
2777+ # entry happens at each depth
2778+ if scan_for(walk, cur, i):
2779+ guide.append(" │ ")
2780+ else:
2781+ guide.append(" ")
2782+ if depth == next_depth:
2783+ prefix = " ├─── "
2784+ else:
2785+ prefix = " └─── "
2786+ return "{}{}".format("".join(guide), prefix)
2787+
2788+
2789+def inspect(charm):
2790+ tw = utils.TermWriter()
2791+ manp = charm / ".composer.manifest"
2792+ comp = charm / "composer.yaml"
2793+ if not manp.exists() or not comp.exists():
2794+ return
2795+ manifest = json.loads(manp.text())
2796+ composer = yaml.load(comp.open())
2797+ a, c, d = utils.delta_signatures(manp)
2798+
2799+ # ordered list of layers used for legend
2800+ layers = list(manifest['layers'])
2801+
2802+ def get_depth(e):
2803+ rel = e.relpath(charm)
2804+ depth = len(rel.splitall()) - 2
2805+ return rel, depth
2806+
2807+ def get_suffix(rel):
2808+ suffix = ""
2809+ if rel in a:
2810+ suffix = "+"
2811+ elif rel in c:
2812+ suffix = "*"
2813+ return suffix
2814+
2815+ def get_color(rel):
2816+ # name of layer this belongs to
2817+ color = tw.term.normal
2818+ if rel in manifest['signatures']:
2819+ layer = manifest['signatures'][rel][0]
2820+ layer_key = layers.index(layer)
2821+ color = getattr(tw, theme.get(layer_key, "normal"))
2822+ else:
2823+ if entry.isdir():
2824+ color = tw.blue
2825+ return color
2826+
2827+ tw.write("Inspect %s\n" % composer["is"])
2828+ for layer in layers:
2829+ tw.write("# {color}{layer}{t.normal}\n",
2830+ color=getattr(tw, theme.get(
2831+ layers.index(layer), "normal")),
2832+ layer=layer)
2833+ tw.write("\n")
2834+ tw.write("{t.blue}{target}{t.normal}\n", target=charm)
2835+
2836+ ignorer = utils.ignore_matcher(config.DEFAULT_IGNORES)
2837+ walk = sorted(utils.walk(charm, get_depth),
2838+ key=lambda x: x[1][0])
2839+ for i in range(len(walk) - 1):
2840+ entry, (rel, depth) = walk[i]
2841+ nEnt, (nrel, ndepth) = walk[i + 1]
2842+ if not ignorer(rel):
2843+ continue
2844+
2845+ tw.write("{prefix}{layerColor}{entry} "
2846+ "{t.bold}{suffix}{t.normal}\n",
2847+ prefix=get_prefix(walk, i, depth, ndepth),
2848+ layerColor=get_color(rel),
2849+ suffix=get_suffix(rel),
2850+ entry=rel.name)
2851
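The guide-drawing logic in `get_prefix`/`scan_for` above can be restated as a small standalone sketch: a vertical rail is drawn at each ancestor depth only while a later entry still exists at that depth, and the branch glyph depends on whether the next entry is a sibling. The `prefix` helper is illustrative, with walk entries simplified to `(name, depth)` pairs:

```python
# Simplified sketch of the tree-guide drawing in inspector.get_prefix
# above. Entries are (name, depth) pairs instead of path objects; the
# prefix helper is illustrative, not the real inspector API.
def prefix(walk, cur, depth, next_depth):
    def scan_for(col, start, d):
        # Is there a later entry at depth d? If so, keep the guide rail.
        return any(entry_depth == d for _, entry_depth in col[start:])

    guide = []
    for i in range(depth):
        guide.append(" │ " if scan_for(walk, cur, i) else "   ")
    # A following sibling at the same depth gets a tee, otherwise an elbow.
    branch = " ├─── " if depth == next_depth else " └─── "
    return "".join(guide) + branch
```

(The original `scan_for` tests `d and d == depth`, so depth 0 never matches; the sketch drops that quirk.)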
2852=== added file 'charmtools/compose/tactics.py'
2853--- charmtools/compose/tactics.py 1970-01-01 00:00:00 +0000
2854+++ charmtools/compose/tactics.py 2015-08-31 19:32:56 +0000
2855@@ -0,0 +1,449 @@
2856+import logging
2857+import json
2858+from ruamel import yaml
2859+
2860+from path import path
2861+from charmtools import utils
2862+
2863+log = logging.getLogger(__name__)
2864+
2865+
2866+class Tactic(object):
2867+ """
2868+    Tactics are first considered in the context of the config layer.
2869+    Using its author-provided info, the config layer will attempt to
2870+    create a tactic for a given file. That tactic will later be
2871+    intersected with any later layers to create a final single plan for
2872+    each element of the output charm.
2873+
2874+    A Tactic is a callable that implements some portion of the charm
2875+    composition. Subclasses should implement __str__ and __call__, which
2876+    should take whatever actions are needed.
2877+ """
2878+ kind = "static" # used in signatures
2879+
2880+ def __init__(self, entity, current, target, config):
2881+ self.entity = entity
2882+ self._current = current
2883+ self._target = target
2884+ self._raw_data = None
2885+ self._config = config
2886+
2887+ def __call__(self):
2888+ raise NotImplementedError
2889+
2890+ def __str__(self):
2891+ return "{}: {} -> {}".format(
2892+ self.__class__.__name__, self.entity, self.target_file)
2893+
2894+ @property
2895+ def current(self):
2896+ """The file in the current layer under consideration"""
2897+ return self._current
2898+
2899+ @property
2900+ def target(self):
2901+ """The target (final) layer."""
2902+ return self._target
2903+
2904+ @property
2905+ def relpath(self):
2906+ return self.entity.relpath(self.current.directory)
2907+
2908+ @property
2909+ def target_file(self):
2910+ target = self.target.directory / self.relpath
2911+ return target
2912+
2913+ @property
2914+ def layer_name(self):
2915+ return self.current.directory.name
2916+
2917+ @property
2918+ def repo_path(self):
2919+ return path("/".join(self.current.directory.splitall()[-2:]))
2920+
2921+ @property
2922+ def config(self):
2923+ # Return the config of the layer *above* you
2924+ # as that is the one that controls your compositing
2925+ return self._config
2926+
2927+ def combine(self, existing):
2928+        """Produce a tactic informed by the last tactic for an entry.
2929+        This is used when, for example, a rule in a higher-level charm
2930+        overrides something in one of its bases."""
2931+ return self
2932+
2933+ @classmethod
2934+ def trigger(cls, relpath):
2935+ """Should the rule trigger for a given path object"""
2936+ return False
2937+
2938+ def sign(self):
2939+ """return sign in the form {relpath: (origin layer, SHA256)}
2940+ """
2941+ target = self.target_file
2942+ sig = {}
2943+ if target.exists() and target.isfile():
2944+ sig[self.relpath] = (self.current.url,
2945+ self.kind,
2946+ utils.sign(self.target_file))
2947+ return sig
2948+
2949+ def lint(self):
2950+ return True
2951+
2952+ def read(self):
2953+ return None
2954+
2955+ def build(self):
2956+ pass
2957+
2958+
2959+class ExactMatch(object):
2960+ FILENAME = None
2961+
2962+ @classmethod
2963+ def trigger(cls, relpath):
2964+ return cls.FILENAME == relpath
2965+
2966+
2967+class CopyTactic(Tactic):
2968+ def __call__(self):
2969+ if self.entity.isdir():
2970+ return
2971+ should_ignore = utils.ignore_matcher(self.target.config.ignores)
2972+ if not should_ignore(self.relpath):
2973+ return
2974+ target = self.target_file
2975+ log.debug("Copying %s: %s", self.layer_name, target)
2976+ # Ensure the path exists
2977+ target.dirname().makedirs_p()
2978+ if (self.entity != target) and not target.exists() \
2979+ or not self.entity.samefile(target):
2980+ data = self.read()
2981+ if data:
2982+ target.write_bytes(data)
2983+ self.entity.copymode(target)
2984+ else:
2985+ self.entity.copy2(target)
2986+
2987+ def __str__(self):
2988+ return "Copy {}".format(self.entity)
2989+
2990+ @classmethod
2991+ def trigger(cls, relpath):
2992+ return True
2993+
2994+
2995+class InterfaceCopy(Tactic):
2996+ def __init__(self, interface, relation_name, target, config):
2997+ self.interface = interface
2998+ self.relation_name = relation_name
2999+ self._target = target
3000+ self._config = config
3001+
3002+ @property
3003+ def target(self):
3004+ return self._target / "hooks/relations" / self.interface.name
3005+
3006+ def __call__(self):
3007+ # copy the entire tree into the
3008+ # hooks/relations/<interface>
3009+ # directory
3010+ log.debug("Copying Interface %s: %s",
3011+ self.interface.name, self.target)
3012+ # Ensure the path exists
3013+ if self.target.exists():
3014+ # XXX: fix this to do actual updates
3015+ return
3016+ ignorer = utils.ignore_matcher(self.config.ignores)
3017+ for entity, _ in utils.walk(self.interface.directory,
3018+ lambda x: True,
3019+ matcher=ignorer,
3020+ kind="files"):
3021+ target = entity.relpath(self.interface.directory)
3022+ target = (self.target / target).normpath()
3023+ target.parent.makedirs_p()
3024+ entity.copy2(target)
3025+ init = self.target / "__init__.py"
3026+ if not init.exists():
3027+ # ensure we can import from here directly
3028+ init.touch()
3029+
3030+ def __str__(self):
3031+ return "Copy Interface {}".format(self.interface.name)
3032+
3033+ def sign(self):
3034+ """return sign in the form {relpath: (origin layer, SHA256)}
3035+ """
3036+ sigs = {}
3037+ for entry, sig in utils.walk(self.target,
3038+ utils.sign, kind="files"):
3039+ relpath = entry.relpath(self._target.directory)
3040+ sigs[relpath] = (self.interface.url, "static", sig)
3041+ return sigs
3042+
3043+ def lint(self):
3044+ for entry in self.interface.directory.walkfiles():
3045+ if entry.splitext()[1] != ".py":
3046+ continue
3047+ relpath = entry.relpath(self._target.directory)
3048+ target = self._target.directory / relpath
3049+ if not target.exists():
3050+ continue
3051+ return utils.delta_python_dump(entry, target,
3052+ from_name=relpath)
3053+
3054+
3055+class InterfaceBind(InterfaceCopy):
3056+ def __init__(self, interface, relation_name, kind, target, config):
3057+ self.interface = interface
3058+ self.relation_name = relation_name
3059+ self.kind = kind
3060+ self._target = target
3061+ self._config = config
3062+
3063+ DEFAULT_BINDING = """#!/usr/bin/env python
3064+
3065+# Load modules from $CHARM_DIR/lib
3066+import sys
3067+sys.path.append('lib')
3068+
3069+# This will load and run the appropriate @hook and other decorated
3070+# handlers from $CHARM_DIR/reactive, $CHARM_DIR/hooks/reactive,
3071+# and $CHARM_DIR/hooks/relations.
3072+#
3073+# See https://jujucharms.com/docs/stable/getting-started-with-charms-reactive
3074+# for more information on this pattern.
3075+from charms.reactive import main
3076+main('{}')
3077+"""
3078+
3079+ def __call__(self):
3080+ for hook in ['joined', 'changed', 'broken', 'departed']:
3081+ target = self._target / "hooks" / "{}-relation-{}".format(
3082+ self.relation_name, hook)
3083+ if target.exists():
3084+ # XXX: warn
3085+ continue
3086+ target.parent.makedirs_p()
3087+ target.write_text(self.DEFAULT_BINDING.format(self.relation_name))
3088+ target.chmod(0755)
3089+
3090+ def sign(self):
3091+ """return sign in the form {relpath: (origin layer, SHA256)}
3092+ """
3093+ sigs = {}
3094+ for hook in ['joined', 'changed', 'broken', 'departed']:
3095+ target = self._target / "hooks" / "{}-relation-{}".format(
3096+ self.relation_name, hook)
3097+ rel = target.relpath(self._target.directory)
3098+ sigs[rel] = (self.interface.url,
3099+ "dynamic",
3100+ utils.sign(target))
3101+ return sigs
3102+
3103+ def __str__(self):
3104+ return "Bind Interface {}".format(self.interface.name)
3105+
3106+
3107+class ManifestTactic(ExactMatch, Tactic):
3108+ FILENAME = ".composer.manifest"
3109+
3110+ def __call__(self):
3111+ # Don't copy manifests, they are regenerated
3112+ pass
3113+
3114+
3115+class SerializedTactic(ExactMatch, Tactic):
3116+ kind = "dynamic"
3117+
3118+ def __init__(self, *args, **kwargs):
3119+ super(SerializedTactic, self).__init__(*args, **kwargs)
3120+ self.data = None
3121+
3122+ def combine(self, existing):
3123+ # Invoke the previous tactic
3124+ existing()
3125+ if existing.data is not None:
3126+ self.data = existing.data
3127+ return self
3128+
3129+ def __call__(self):
3130+ data = self.load(self.entity.open())
3131+ # self.data represents the product of previous layers
3132+ if self.data:
3133+ data = utils.deepmerge(self.data, data)
3134+
3135+ # Now apply any rules from config
3136+ config = self.config
3137+ if config:
3138+ section = config.get(self.section)
3139+ if section:
3140+ dels = section.get('deletes', [])
3141+ if self.prefix:
3142+ namespace = data[self.prefix]
3143+ else:
3144+ namespace = data
3145+ for key in dels:
3146+ utils.delete_path(key, namespace)
3147+ self.data = data
3148+ if not self.target_file.parent.exists():
3149+ self.target_file.parent.makedirs_p()
3150+ self.dump(data)
3151+ return data
3152+
3153+
3154+class YAMLTactic(SerializedTactic):
3155+ """Rule Driven YAML generation"""
3156+ prefix = None
3157+
3158+ def load(self, fn):
3159+ return yaml.load(fn, Loader=yaml.RoundTripLoader)
3160+
3161+ def dump(self, data):
3162+ yaml.dump(data, self.target_file.open('w'),
3163+ Dumper=yaml.RoundTripDumper,
3164+ default_flow_style=False)
3165+
3166+
3167+class JSONTactic(SerializedTactic):
3168+ """Rule Driven JSON generation"""
3169+ prefix = None
3170+
3171+ def load(self, fn):
3172+ return json.load(fn)
3173+
3174+ def dump(self, data):
3175+ json.dump(data, self.target_file.open('w'), indent=2)
3176+
3177+
3178+class ComposerYAML(YAMLTactic, ExactMatch):
3179+ FILENAME = "composer.yaml"
3180+
3181+ def read(self):
3182+ self._raw_data = self.load(self.entity.open())
3183+
3184+ def __call__(self):
3185+ # rewrite includes to be the current source
3186+ data = self._raw_data
3187+ if data is None:
3188+ return
3189+ # The split should result in the series/charm path only
3190+ # XXX: there will be strange interactions with cs: vs local:
3191+ if 'is' not in data:
3192+ data['is'] = str(self.current.url)
3193+ inc = data.get('includes', [])
3194+ norm = []
3195+ for i in inc:
3196+ if ":" in i:
3197+ norm.append(i)
3198+ else:
3199+ # Attempt to normalize to a repository base
3200+ norm.append("/".join(path(i).splitall()[-2:]))
3201+ if norm:
3202+ data['includes'] = norm
3203+ if not self.target_file.parent.exists():
3204+ self.target_file.parent.makedirs_p()
3205+ self.dump(data)
3206+ return data
3207+
3208+
3209+class MetadataYAML(YAMLTactic):
3210+ """Rule Driven metadata.yaml generation"""
3211+ section = "metadata"
3212+ FILENAME = "metadata.yaml"
3213+ KEY_ORDER = ["name", "summary", "maintainer",
3214+ "description", "tags",
3215+ "requires", "provides", "peers"]
3216+
3217+ def dump(self, data):
3218+ if not data:
3219+ return
3220+ final = yaml.comments.CommentedMap()
3221+        # assemble keys in known order
3222+ for k in self.KEY_ORDER:
3223+ if k in data:
3224+ final[k] = data[k]
3225+ missing = set(data.keys()) - set(self.KEY_ORDER)
3226+ for k in sorted(missing):
3227+ final[k] = data[k]
3228+ super(MetadataYAML, self).dump(final)
3229+
3230+
3231+class ConfigYAML(MetadataYAML):
3232+ """Rule driven config.yaml generation"""
3233+ section = "config"
3234+ prefix = "options"
3235+ FILENAME = "config.yaml"
3236+
3237+
3238+class InstallerTactic(Tactic):
3239+ def __str__(self):
3240+ return "Installing software to {}".format(self.relpath)
3241+
3242+ @classmethod
3243+ def trigger(cls, relpath):
3244+ ext = relpath.splitext()[1]
3245+ return ext in [".pypi", ]
3246+
3247+ def __call__(self):
3248+ # install package reference in trigger file
3249+ # in place directory of target
3250+ # XXX: Should this map multiline to "-r", self.entity
3251+ spec = self.entity.text().strip()
3252+ target = self.target_file.dirname()
3253+ target_dir = target / path(spec.split(" ", 1)[0]).normpath().namebase
3254+ log.debug("pip installing {} as {}".format(
3255+ spec, target_dir))
3256+ with utils.tempdir() as temp_dir:
3257+ # We do this dance so we don't have
3258+ # to guess package and .egg file names
3259+ # we move everything in the tempdir to the target
3260+ # and track it for later use in sign()
3261+ utils.Process(("pip",
3262+ "install",
3263+ "-U",
3264+ "--exists-action",
3265+ "i",
3266+ "-t",
3267+ temp_dir,
3268+ spec)).throw_on_error()()
3269+ dirs = temp_dir.listdir()
3270+ self._tracked = []
3271+ for d in dirs:
3272+ d.move(target)
3273+ self._tracked.append(target / d)
3274+
3275+ def sign(self):
3276+ """return sign in the form {relpath: (origin layer, SHA256)}
3277+ """
3278+ sigs = {}
3279+ for d in self._tracked:
3280+ for entry, sig in utils.walk(d,
3281+ utils.sign, kind="files"):
3282+ relpath = entry.relpath(self._target.directory)
3283+ sigs[relpath] = (self.current.url, "dynamic", sig)
3284+ return sigs
3285+
3286+
3287+def load_tactic(dpath, basedir):
3288+ """Load a tactic from the current layer using a dotted path. The last
3289+ element in the path should be a Tactic subclass
3290+ """
3291+ obj = utils.load_class(dpath, basedir)
3292+ if not issubclass(obj, Tactic):
3293+ raise ValueError("Expected to load a tactic for %s" % dpath)
3294+ return obj
3295+
3296+
3297+DEFAULT_TACTICS = [
3298+ ManifestTactic,
3299+ InstallerTactic,
3300+ MetadataYAML,
3301+ ConfigYAML,
3302+ ComposerYAML,
3303+ CopyTactic
3304+]
3305
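`load_tactic` above lets a layer ship its own `Tactic` subclass via a dotted path. A hypothetical minimal tactic, showing just the `trigger`/`__call__` contract that the default tactics follow (the class and its simplified constructor are illustrative, not part of the real charmtools API):

```python
# Hypothetical custom tactic in the style of the classes above. The
# constructor is simplified to plain strings so the sketch stands alone;
# a real Tactic receives (entity, current, target, config).
class UpperCaseReadmeTactic(object):
    kind = "dynamic"

    def __init__(self, entity_text, relpath):
        self.entity_text = entity_text
        self.relpath = relpath

    @classmethod
    def trigger(cls, relpath):
        # Claim only README files; anything else falls through to later
        # tactics in DEFAULT_TACTICS, ending with CopyTactic.
        return relpath == "README.md"

    def __call__(self):
        # Produce the transformed content for the output charm.
        return self.entity_text.upper()
```

As with `DEFAULT_TACTICS`, ordering matters: the first tactic whose `trigger` returns True for a relative path claims that file.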
3306=== added file 'charmtools/utils.py'
3307--- charmtools/utils.py 1970-01-01 00:00:00 +0000
3308+++ charmtools/utils.py 2015-08-31 19:32:56 +0000
3309@@ -0,0 +1,518 @@
3310+import copy
3311+import collections
3312+import hashlib
3313+import json
3314+import logging
3315+import os
3316+import re
3317+import subprocess
3318+import sys
3319+import tempfile
3320+import time
3321+from contextlib import contextmanager
3322+
3323+from .compose.diff_match_patch import diff_match_patch
3324+import blessings
3325+import pathspec
3326+from path import path
3327+
3328+log = logging.getLogger('utils')
3329+
3330+
3331+@contextmanager
3332+def cd(directory, make=False):
3333+ cwd = os.getcwd()
3334+ if not os.path.exists(directory) and make:
3335+ os.makedirs(directory)
3336+ os.chdir(directory)
3337+ try:
3338+ yield
3339+ finally:
3340+ os.chdir(cwd)
3341+
3342+
3343+@contextmanager
3344+def tempdir():
3345+ dirname = path(tempfile.mkdtemp())
3346+ with cd(dirname):
3347+ yield dirname
3348+ dirname.rmtree_p()
3349+
3350+
3351+def deepmerge(dest, src):
3352+ """
3353+ Deep merge of two dicts.
3354+
3355+ This is destructive (`dest` is modified), but values
3356+ from `src` are passed through `copy.deepcopy`.
3357+ """
3358+ for k, v in src.iteritems():
3359+ if dest.get(k) and isinstance(v, dict):
3360+ deepmerge(dest[k], v)
3361+ else:
3362+ dest[k] = copy.deepcopy(v)
3363+ return dest
3364+
3365+
3366+def delete_path(path, obj):
3367+ """Delete a dotted path from object, assuming each level is a dict"""
3368+ parts = path.split('.')
3369+ for p in parts[:-1]:
3370+ obj = obj[p]
3371+ del obj[parts[-1]]
3372+
3373+
3374+class NestedDict(dict):
3375+ def __init__(self, dict_or_iterable=None, **kwargs):
3376+ if dict_or_iterable:
3377+ if isinstance(dict_or_iterable, dict):
3378+ self.update(dict_or_iterable)
3379+ elif isinstance(dict_or_iterable, collections.Iterable):
3380+ for k, v in dict_or_iterable:
3381+ self[k] = v
3382+ if kwargs:
3383+ self.update(kwargs)
3384+
3385+ def __setitem__(self, key, value):
3386+ key = key.split('.')
3387+ o = self
3388+ for part in key[:-1]:
3389+ o = o.setdefault(part, self.__class__())
3390+ dict.__setitem__(o, key[-1], value)
3391+
3392+ def __getitem__(self, key):
3393+ o = self
3394+ if '.' in key:
3395+ parts = key.split('.')
3396+ key = parts[-1]
3397+ for part in parts[:-1]:
3398+ o = o[part]
3399+
3400+ return dict.__getitem__(o, key)
3401+
3402+ def __getattr__(self, key):
3403+ try:
3404+ return self[key]
3405+ except KeyError:
3406+ raise AttributeError(key)
3407+
3408+ def get(self, key, default=None):
3409+ try:
3410+ return self[key]
3411+ except KeyError:
3412+ return default
3413+
3414+ def update(self, other):
3415+ deepmerge(self, other)
3416+
3417+
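The `deepmerge` and `NestedDict` dotted-key behaviour above can be illustrated standalone. `deep_merge`/`set_dotted` are Python 3 re-statements for the example only; the module itself targets Python 2, hence `iteritems`:

```python
import copy


# Python 3 sketch of utils.deepmerge above: recursively fold src into
# dest, deep-copying leaf values so dest never aliases src.
def deep_merge(dest, src):
    for k, v in src.items():
        if dest.get(k) and isinstance(v, dict):
            deep_merge(dest[k], v)
        else:
            dest[k] = copy.deepcopy(v)
    return dest


# Sketch of NestedDict.__setitem__: dotted keys expand into nested dicts.
def set_dotted(obj, key, value):
    parts = key.split('.')
    for part in parts[:-1]:
        obj = obj.setdefault(part, {})
    obj[parts[-1]] = value
```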
3418+class ProcessResult(object):
3419+ def __init__(self, command, exit_code, stdout, stderr):
3420+ self.command = command
3421+ self.exit_code = exit_code
3422+ self.stdout = stdout
3423+ self.stderr = stderr
3424+
3425+ def __repr__(self):
3426+ return '<ProcessResult "%s" result %s>' % (self.cmd, self.exit_code)
3427+
3428+ @property
3429+ def cmd(self):
3430+ return ' '.join(self.command)
3431+
3432+ @property
3433+ def output(self):
3434+ result = ''
3435+ if self.stdout:
3436+ result += self.stdout
3437+ if self.stderr:
3438+ result += self.stderr
3439+ return result.strip()
3440+
3441+ @property
3442+ def json(self):
3443+ if self.stdout:
3444+ return json.loads(self.stdout)
3445+ return None
3446+
3447+ def __eq__(self, other):
3448+ return self.exit_code == other
3449+
3450+ def __bool__(self):
3451+ return self.exit_code == 0
3452+
3453+ __nonzero__ = __bool__
3454+
3455+ def throw_on_error(self):
3456+ if not bool(self):
3457+ raise subprocess.CalledProcessError(
3458+ self.exit_code, self.command, output=self.output)
3459+
3460+
3461+class Process(object):
3462+ def __init__(self, command=None, throw=False, log=log, **kwargs):
3463+ if isinstance(command, str):
3464+ command = (command, )
3465+ self.command = command
3466+        self._throw_on_error = throw
3467+ self.log = log
3468+ self._kw = kwargs
3469+
3470+ def __repr__(self):
3471+ return "<Command %s>" % (self.command, )
3472+
3473+ def throw_on_error(self, throw=True):
3474+ self._throw_on_error = throw
3475+ return self
3476+
3477+ def __call__(self, *args, **kw):
3478+ kwargs = dict(stdout=subprocess.PIPE,
3479+ stderr=subprocess.STDOUT)
3480+ if self._kw:
3481+ kwargs.update(self._kw)
3482+ kwargs.update(kw)
3483+ if self.command:
3484+ all_args = self.command + args
3485+ else:
3486+ all_args = args
3487+ if 'env' not in kwargs:
3488+ kwargs['env'] = os.environ
3489+
3490+ p = subprocess.Popen(all_args, **kwargs)
3491+ stdout, stderr = p.communicate()
3492+ self.log.debug(stdout)
3493+ stdout = stdout.strip()
3494+ if stderr is not None:
3495+ stderr = stderr.strip()
3496+ self.log.debug(stderr)
3497+ exit_code = p.poll()
3498+ result = ProcessResult(all_args, exit_code, stdout, stderr)
3499+ self.log.debug("process: %s (%d)", result.cmd, result.exit_code)
3500+ if self._throw_on_error:
3501+ result.throw_on_error()
3502+ return result
3503+
3504+command = Process
3505+
3506+
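The `Process`/`ProcessResult` pair wraps `subprocess` so commands capture combined output and exit code 0 reads as truthy. A stripped-down sketch of that pattern (assumes a POSIX environment with `echo` on PATH):

```python
import subprocess

def run(*cmd):
    """Run cmd, returning (exit_code, combined stripped output)."""
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                         stderr=subprocess.STDOUT)
    stdout, _ = p.communicate()
    return p.returncode, stdout.decode().strip()

rc, out = run('echo', 'hello')
assert rc == 0 and out == 'hello'
```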
3507+class Commander(object):
3508+ def __init__(self, log=log):
3509+ self.log = log
3510+
3511+ def set_log(self, logger):
3512+ self.log = logger
3513+
3514+ def __getattr__(self, key):
3515+ return command((key,), log=self.log)
3516+
3517+ def check(self, *args, **kwargs):
3518+ kwargs.update({'log': self.log})
3519+ return command(command=args, **kwargs).throw_on_error()
3520+
3521+ def __call__(self, *args, **kwargs):
3522+ kwargs.update({'log': self.log})
3523+ return command(command=args, shell=True, **kwargs)
3524+
3525+
3526+sh = Commander()
3527+dig = Process(('dig', '+short'))
3528+api_endpoints = Process(('juju', 'api-endpoints'))
3529+
3530+
3531+def wait_for(timeout, interval, *callbacks, **kwargs):
3532+ """
3533+ Repeatedly try callbacks until all return True
3534+
3535+ This will wait interval seconds between attempts and will error out
3536+ after timeout has been exceeded.
3537+
3538+    Callbacks are called with no arguments.
3539+
3540+    Setting timeout to zero will loop until cancelled, power runs out,
3541+    hardware fails, or the heat death of the universe.
3542+ """
3543+ start = time.time()
3544+ if timeout:
3545+ end = start + timeout
3546+ else:
3547+ end = 0
3548+
3549+ bar = kwargs.get('bar', None)
3550+ message = kwargs.get('message', None)
3551+ once = 1
3552+ while True:
3553+ passes = True
3554+ if end > 0 and time.time() > end:
3555+ raise OSError("Timeout exceeded in wait_for")
3556+ if bar:
3557+ bar.next(once, message=message)
3558+ if once == 1:
3559+ once = 0
3560+ if int(time.time()) % interval == 0:
3561+ for callback in callbacks:
3562+ result = callback()
3563+ passes = passes & bool(result)
3564+ if passes is False:
3565+ break
3566+ if passes is True:
3567+ break
3568+ time.sleep(1)
3569+
3570+
3571+def until(*callbacks, **kwargs):
3572+ return wait_for(0, 20, *callbacks, **kwargs)
3573+
3574+
3575+def retry(attempts, *callbacks, **kwargs):
3576+ """
3577+ Repeatedly try callbacks a fixed number of times or until all return True
3578+ """
3579+ for attempt in xrange(attempts):
3580+ if 'bar' in kwargs:
3581+ kwargs['bar'].next(attempt == 0, message=kwargs.get('message'))
3582+ for callback in callbacks:
3583+ if not callback():
3584+ break
3585+ else:
3586+ break
3587+ else:
3588+ raise OSError("Retry attempts exceeded")
3589+ return True
3590+
3591+
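`retry` relies on Python's for/else twice: the inner `else` fires only when every callback passed (success, break out), and the outer `else` fires only when all attempts were exhausted (raise). A Python 3 sketch of that control flow:

```python
def retry(attempts, *callbacks):
    """Try all callbacks up to `attempts` times; True once all pass."""
    for attempt in range(attempts):
        for callback in callbacks:
            if not callback():
                break          # this attempt failed; try again
        else:
            break              # every callback passed
    else:
        raise OSError("Retry attempts exceeded")
    return True

calls = {'n': 0}
def flaky():
    calls['n'] += 1
    return calls['n'] >= 3     # passes on the third attempt

assert retry(5, flaky) is True
assert calls['n'] == 3
```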
3592+def which(program):
3593+ def is_exe(fpath):
3594+ return os.path.isfile(fpath) and os.access(fpath, os.X_OK)
3595+
3596+ fpath, fname = os.path.split(program)
3597+ if fpath:
3598+ if is_exe(program):
3599+ return program
3600+ else:
3601+ for fpath in os.environ["PATH"].split(os.pathsep):
3602+ fpath = fpath.strip('"')
3603+ exe_file = os.path.join(fpath, program)
3604+ if is_exe(exe_file):
3605+ return exe_file
3606+ return None
3607+
3608+
3609+def load_class(dpath, workingdir=None):
3610+ # we expect the last element of the path
3611+ if not workingdir:
3612+ workingdir = os.getcwd()
3613+ with cd(workingdir):
3614+ modpath, classname = dpath.rsplit('.', 1)
3615+ modpath = path(modpath.replace(".", "/"))
3616+ if not modpath.exists():
3617+ modpath += ".py"
3618+ if not modpath.exists():
3619+ raise OSError("Unable to load {} from {}".format(
3620+ dpath, workingdir))
3621+ namespace = {}
3622+ execfile(modpath, globals(), namespace)
3623+ klass = namespace.get(classname)
3624+ if klass is None:
3625+ raise ImportError("Unable to load class {} at {}".format(
3626+ classname, dpath))
3627+ return klass
3628+
3629+
3630+def walk(pathobj, fn, matcher=None, kind=None, **kwargs):
3631+ """walk pathobj calling fn on each matched entry yielding each
3632+ result. If kind is 'file' or 'dir' only that type ofd entry will
3633+ be walked. matcher is an optional function returning bool indicating
3634+ if the entry should be processed.
3635+ """
3636+ p = path(pathobj)
3637+ walker = p.walk
3638+ if kind == "files":
3639+ walker = p.walkfiles
3640+ elif kind == "dir":
3641+        walker = p.walkdirs
3642+
3643+ for entry in walker():
3644+ relpath = entry.relpath(pathobj)
3645+ if matcher and not matcher(relpath):
3646+ continue
3647+ yield (entry, fn(entry, **kwargs))
3648+
3649+
3650+def ignore_matcher(ignores=[]):
3651+ spec = pathspec.PathSpec.from_lines(pathspec.GitIgnorePattern, ignores)
3652+
3653+ def matcher(entity):
3654+ return entity not in spec.match_files((entity,))
3655+ return matcher
3656+
3657+
3658+def sign(pathobj):
3659+ p = path(pathobj)
3660+ if not p.isfile():
3661+ return None
3662+ return hashlib.sha256(p.bytes()).hexdigest()
3663+
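`sign` reduces to a SHA-256 digest of a file's bytes; `delta_signatures` below compares these digests against the recorded manifest to classify files as added, changed, or deleted. The hashing step over raw bytes:

```python
import hashlib

def sign_bytes(data):
    """Content-address a blob by its SHA-256 hex digest (64 hex chars)."""
    return hashlib.sha256(data).hexdigest()

digest = sign_bytes(b"hello\n")
assert len(digest) == 64
```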
3664+
3665+def delta_signatures(manifest_filename, ignorer=None):
3666+ md = path(manifest_filename)
3667+ repo = md.normpath().dirname()
3668+
3669+ expected = json.load(md.open())
3670+ current = {}
3671+ for rel, sig in walk(repo, sign):
3672+ rel = rel.relpath(repo)
3673+ current[rel] = sig
3674+ add, change, delete = set(), set(), set()
3675+
3676+ for p, s in current.items():
3677+ fp = repo / p
3678+ if not fp.isfile():
3679+ continue
3680+ if ignorer and not ignorer(p):
3681+ continue
3682+
3683+ if p not in expected["signatures"]:
3684+ add.add(p)
3685+ continue
3686+ # layer, kind, sig
3687+ # don't include items generated only for the last layer
3688+ if expected["signatures"][p][0] == "composer":
3689+ continue
3690+ if expected["signatures"][p][2] != s:
3691+ change.add(p)
3692+
3693+ for p, d in expected["signatures"].items():
3694+ if p not in current:
3695+ delete.add(path(p))
3696+ return add, change, delete
3697+
3698+
3699+class ColoredFormatter(logging.Formatter):
3700+
3701+ def __init__(self, terminal, *args, **kwargs):
3702+ super(ColoredFormatter, self).__init__(*args, **kwargs)
3703+ self._terminal = terminal
3704+
3705+ def format(self, record):
3706+ output = super(ColoredFormatter, self).format(record)
3707+ if record.levelno >= logging.CRITICAL:
3708+ line_color = self._terminal.bold_yellow_on_red
3709+ elif record.levelno >= logging.ERROR:
3710+ line_color = self._terminal.red
3711+ elif record.levelno >= logging.WARNING:
3712+ line_color = self._terminal.yellow
3713+ elif record.levelno >= logging.INFO:
3714+ line_color = self._terminal.green
3715+ else:
3716+ line_color = self._terminal.cyan
3717+ return line_color(output)
3718+
3719+
3720+class TermWriter(object):
3721+ def __init__(self, fp=None, term=None):
3722+ if fp is None:
3723+ fp = sys.stdout
3724+ self.fp = fp
3725+ if term is None:
3726+ term = blessings.Terminal()
3727+ self.term = term
3728+
3729+ def __getattr__(self, key):
3730+ return getattr(self.term, key)
3731+
3732+ def write(self, msg, *args, **kwargs):
3733+ if 't' in kwargs:
3734+ raise ValueError("Using reserved token 't' in TermWriter.write")
3735+ kwargs['t'] = self.term
3736+ self.fp.write(msg.format(*args, **kwargs))
3737+
3738+
3739+class _O(dict):
3740+ def __getattr__(self, k):
3741+ return self[k]
3742+
3743+REACTIVE_PATTERNS = [
3744+ re.compile("\s*@when"),
3745+ re.compile(".set_state\(")
3746+]
3747+
3748+
3749+def delta_python(orig, dest, patterns=REACTIVE_PATTERNS, context=2):
3750+ """Delta two python files looking for certain patterns"""
3751+ if isinstance(orig, path):
3752+ od = orig.text()
3753+ elif hasattr(orig, 'read'):
3754+ od = orig.read()
3755+ else:
3756+ raise TypeError("Expected path() or file(), got %s" % type(orig))
3757+ if isinstance(dest, path):
3758+ dd = dest.text()
3759+    elif hasattr(dest, 'read'):
3760+ dd = dest.read()
3761+ else:
3762+ raise TypeError("Expected path() or file(), got %s" % type(dest))
3763+
3764+ differ = diff_match_patch()
3765+ linect = 0
3766+ lastMatch = None
3767+ for res in differ.diff_main(od, dd):
3768+ if res[0] == diff_match_patch.DIFF_EQUAL:
3769+ linect += res[1].count('\n')
3770+ lastMatch = res[:]
3771+ continue
3772+ elif res[0] == diff_match_patch.DIFF_INSERT:
3773+ linect += res[1].count('\n')
3774+ else:
3775+ linect -= res[1].count('\n')
3776+
3777+ for p in patterns:
3778+            if lastMatch and p.search(lastMatch[1]):
3779+ yield [linect, lastMatch, res]
3780+ break
3781+
3782+
3783+def delta_python_dump(orig, dest, patterns=REACTIVE_PATTERNS,
3784+ context=2, term=None,
3785+ from_name=None, to_name=None):
3786+ if term is None:
3787+ term = TermWriter()
3788+
3789+ def norm_sources(orig, dest):
3790+ if from_name:
3791+ oname = from_name
3792+ else:
3793+ oname = orig
3794+ if to_name:
3795+ dname = to_name
3796+ else:
3797+ dname = dest
3798+ return _O({'orig_name': oname, 'dest_name': dname})
3799+
3800+ def prefix_lines(lines, lineno):
3801+ if isinstance(lines, str):
3802+ lines = lines.splitlines()
3803+ for i, l in enumerate(lines):
3804+ lines[i] = "%-5d| %s" % (lineno + i, l)
3805+ return "\n".join(lines)
3806+
3807+ i = 0
3808+ for lineno, last, current in delta_python(orig, dest, patterns, context):
3809+ # pull enough context
3810+ if last:
3811+ context_lines = last[1].splitlines()[-context:]
3812+ message = norm_sources(orig, dest)
3813+ message['context'] = prefix_lines(context_lines, lineno - context)
3814+ message['lineno'] = lineno
3815+ message['delta'] = current[1]
3816+ s = {diff_match_patch.DIFF_EQUAL: term.normal,
3817+ diff_match_patch.DIFF_INSERT: term.green,
3818+ diff_match_patch.DIFF_DELETE: term.red}[current[0]]
3819+ message['status_color'] = s
3820+ # output message
3821+ term.write("{t.bold}{m.orig_name}{t.normal} --> "
3822+ "{t.bold}{m.dest_name}{t.normal}:\n",
3823+ m=message)
3824+ term.write("{m.context}{m.status_color}{m.delta}{t.normal}\n",
3825+ m=message)
3826+ i += 1
3827+ return i == 0
3828
3829=== added file 'doc/source/compose-intro.md'
3830--- doc/source/compose-intro.md 1970-01-01 00:00:00 +0000
3831+++ doc/source/compose-intro.md 2015-08-31 19:32:56 +0000
3832@@ -0,0 +1,18 @@
3833+charm compose/refresh combines various included layers to produce an output
3834+charm. These layers can be maintained and updated separately and then the
3835+refresh process can be used to regenerate the charm.
3836+
3837+COMPOSER_PATH is a ':' delimited path list used to resolve local include matches.
3838+INTERFACE_PATH is the directory from which interfaces will be resolved.
3839+
3840+Examples:
3841+charm compose -o /tmp/out trusty/mycharm
3842+
3843+This will generate /tmp/out/trusty/mycharm with all the includes specified.
3844+
3845+WORKFLOW
3846+========
3847+
3848+Typically you'll make changes in the layer that owns the file(s) in question,
3849+then recompose the charm and deploy (or upgrade-charm) the result. You should
3850+not edit the generated charm directly.
3851
3852=== added file 'doc/source/composer.md'
3853--- doc/source/composer.md 1970-01-01 00:00:00 +0000
3854+++ doc/source/composer.md 2015-08-31 19:32:56 +0000
3855@@ -0,0 +1,123 @@
3856+Juju Charm Composition
3857+======================
3858+
3859+Status | *Alpha*
3860+-------|--------
3861+
3862+This is a prototype designed to flesh out requirements around charm
3863+composition. Today it is very common to fork charms for minor changes, or to
3864+have to use subordinate charms to take advantage of frameworks where you need
3865+to deploy a custom workload on an existing runtime. With charm composition you
3866+should be able to include a charm that provides the runtime (or just some
3867+well-contained feature set) and maintain your delta as a 'layer' that gets
3868+composed with its base to produce a new charm.
3869+
3870+This process should be runnable repeatedly allowing charms to be regenerated.
3871+
3872+
3873+This work is currently feature incomplete but does allow the generation of
3874+simple charms and useful basic composition. It is my hope that this will
3875+encourage discussion of the feature set needed to one day have charm
3876+composition supported natively in juju-core.
3877+
3878+
3879+Today the system can be run as follows:
3880+
3881+ ./juju_compose.py -o <output_repo> <charm to build from>
3882+
3883+So you might use the included (very unrealistic) test case like:
3884+
3885+ ./juju_compose -o out -n foo tests/trusty/tester
3886+
3887+Running this should produce a charm in out/trusty/foo which is composed
3888+according to the composer.yaml file in tests/trusty/tester. While this isn't
3889+documented yet it shows some of the basic features of diverting hooks (for
3890+pre/post hooks support), replacing files, merging metadata.yaml changes, etc.
3891+
3892+It should be enough to give you an idea how it works. In order for this example
3893+to run you'll need to pip install bundletester as it shares some code with that
3894+project.
3895+
3896+Theory
3897+======
3898+
3899+A generated charm is composed of layers. The generator acts almost like a
3900+compiler taking the input from each layer and producing an output file in the
3901+resultant charm.
3902+
3903+The generator keeps track of which layer owns each file and allows layers to
3904+update files they own should the charm be refreshed later.
3905+
3906+The generated charm itself should be treated as immutable. The top layer that
3907+was used to generate it is where user level modifications should live.
3908+
3909+
3910+Setting Up your Repo
3911+====================
3912+Composition currently recognizes two new environment variables:
3913+    COMPOSER_PATH: a ':'-separated list of JUJU_REPOSITORY directories that should be searched for includes
3914+    INTERFACE_PATH: a ':'-separated list of paths from which interface:_name_ includes are resolved.
3915+
3916+JUJU_REPOSITORY entries take the usual *series*/*charm* format.
3917+INTERFACE repos take the format *interface_name*, where interface_name is
3918+the name as it appears in metadata.yaml.
3919+
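The path resolution described above can be sketched as a first-match search across the `':'`-delimited repository list. The `resolve_include` helper below is purely illustrative (it is not part of charm-tools):

```python
import os

def resolve_include(name, composer_path):
    """Return the first repo dir on composer_path containing name, or None.

    name is a 'series/charm' include; composer_path is ':'-delimited.
    (Hypothetical helper for illustration only.)
    """
    for repo in composer_path.split(':'):
        candidate = os.path.join(repo, name)
        if os.path.isdir(candidate):
            return candidate
    return None
```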
3920+Composition Types
3921+=================
3922+
3923+Each file in each layer gets matched by a single Tactic. Tactics implement how
3924+the data in a file moves from one layer to the next (and finally to the target
3925+charm). By default this will be a simple copy but in the cases of certain files
3926+(mostly known YAML files like metadata.yaml and config.yaml) each layer is
3927+combined with the previous layers before being written.
3928+
3929+Normally the default tactics are fine but you have the ability in the
3930+composer.yaml to list a set of Tactics objects that will be checked before the
3931+default and control how data moves from one layer to the next.
3932+
3933+
3934+composer.yaml
3935+=============
3936+Each layer used to build a charm can have a composer.yaml file; the top layer
3937+(the one actually invoked from the command line) must have one. These files
3938+tell the generator what to do, from which base layers and interfaces to
3939+include, to specialized directives for processing some types of files.
3940+
3941+ Keys:
3942+ includes: ["trusty/mysql", "interface:mysql"]
3943+ tactics: [ dottedpath.toTacticClass, ]
3944+ config:
3945+ deletes:
3946+ - key names
3947+ metadata:
3948+ deletes:
3949+ - key names
3950+
3951+
3952+Includes is a list of one or more layers and interfaces that should be
3953+composited. Those layers may themselves have other includes and/or
3954+interfaces.
3955+
3956+Tactics is a list of Tactics to be loaded. See juju_compose.tactics.Tactics for
3957+the default interface. You'll typically need to implement at least a trigger() method
3958+and a __call__() method.
3959+
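The shape of a custom Tactic can be sketched as below. This is a hypothetical illustration of the two methods named above; the real base class lives in `charmtools.compose.tactics` and its exact signatures may differ:

```python
class RewriteReadmeTactic(object):
    """Illustrative tactic: claim README.md and transform it on the way
    from one layer to the next (names here are hypothetical)."""

    @staticmethod
    def trigger(relpath):
        # decide whether this tactic owns the given file
        return relpath == 'README.md'

    def __init__(self, contents):
        self.contents = contents

    def __call__(self):
        # produce the data that should land in the generated charm
        return self.contents.replace('{{charm}}', 'mycharm')
```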
3960+config and metadata take optional lists of keys to remove from config.yaml and
3961+metadata.yaml when generating their data. This allows for charms to, for
3962+example, narrow what they expose to clients.
3963+
3964+
3965+Inspect
3966+=======
3967+
3968+If you've already generated a charm you can see which layers own which files by
3969+using the included **juju inspect [charmdir]** command. This renders a tree of
3970+the files, colored by owning layer. Each layer's assigned color is presented
3971+in a legend at the top of the output.
3972+
3973+TODO:
3974+- lint about methods in base layer not provided/extended in lower
3975+layers
3976+
3977+
3978+
3979
3980=== removed file 'ez_setup.py'
3981--- ez_setup.py 2013-02-25 21:54:26 +0000
3982+++ ez_setup.py 1970-01-01 00:00:00 +0000
3983@@ -1,272 +0,0 @@
3984-#!python
3985-
3986-# NOTE TO LAUNCHPAD DEVELOPERS: This is a bootstrapping file from the
3987-# setuptools project. It is imported by our setup.py.
3988-
3989-"""Bootstrap setuptools installation
3990-
3991-If you want to use setuptools in your package's setup.py, just include this
3992-file in the same directory with it, and add this to the top of your setup.py::
3993-
3994- from ez_setup import use_setuptools
3995- use_setuptools()
3996-
3997-If you want to require a specific version of setuptools, set a download
3998-mirror, or use an alternate download directory, you can do so by supplying
3999-the appropriate options to ``use_setuptools()``.
4000-
4001-This file can also be run as a script to install or upgrade setuptools.
4002-"""
4003-import sys
4004-DEFAULT_VERSION = "0.6c11"
4005-DEFAULT_URL = "http://pypi.python.org/packages/%s/s/setuptools/" \
4006- % sys.version[:3]
4007-
4008-md5_data = {
4009- 'setuptools-0.6b1-py2.3.egg': '8822caf901250d848b996b7f25c6e6ca',
4010- 'setuptools-0.6b1-py2.4.egg': 'b79a8a403e4502fbb85ee3f1941735cb',
4011- 'setuptools-0.6b2-py2.3.egg': '5657759d8a6d8fc44070a9d07272d99b',
4012- 'setuptools-0.6b2-py2.4.egg': '4996a8d169d2be661fa32a6e52e4f82a',
4013- 'setuptools-0.6b3-py2.3.egg': 'bb31c0fc7399a63579975cad9f5a0618',
4014- 'setuptools-0.6b3-py2.4.egg': '38a8c6b3d6ecd22247f179f7da669fac',
4015- 'setuptools-0.6b4-py2.3.egg': '62045a24ed4e1ebc77fe039aa4e6f7e5',
4016- 'setuptools-0.6b4-py2.4.egg': '4cb2a185d228dacffb2d17f103b3b1c4',
4017- 'setuptools-0.6c1-py2.3.egg': 'b3f2b5539d65cb7f74ad79127f1a908c',
4018- 'setuptools-0.6c1-py2.4.egg': 'b45adeda0667d2d2ffe14009364f2a4b',
4019- 'setuptools-0.6c10-py2.3.egg': 'ce1e2ab5d3a0256456d9fc13800a7090',
4020- 'setuptools-0.6c10-py2.4.egg': '57d6d9d6e9b80772c59a53a8433a5dd4',
4021- 'setuptools-0.6c10-py2.5.egg': 'de46ac8b1c97c895572e5e8596aeb8c7',
4022- 'setuptools-0.6c10-py2.6.egg': '58ea40aef06da02ce641495523a0b7f5',
4023- 'setuptools-0.6c11-py2.3.egg': '2baeac6e13d414a9d28e7ba5b5a596de',
4024- 'setuptools-0.6c11-py2.4.egg': 'bd639f9b0eac4c42497034dec2ec0c2b',
4025- 'setuptools-0.6c11-py2.5.egg': '64c94f3bf7a72a13ec83e0b24f2749b2',
4026- 'setuptools-0.6c11-py2.6.egg': 'bfa92100bd772d5a213eedd356d64086',
4027- 'setuptools-0.6c2-py2.3.egg': 'f0064bf6aa2b7d0f3ba0b43f20817c27',
4028- 'setuptools-0.6c2-py2.4.egg': '616192eec35f47e8ea16cd6a122b7277',
4029- 'setuptools-0.6c3-py2.3.egg': 'f181fa125dfe85a259c9cd6f1d7b78fa',
4030- 'setuptools-0.6c3-py2.4.egg': 'e0ed74682c998bfb73bf803a50e7b71e',
4031- 'setuptools-0.6c3-py2.5.egg': 'abef16fdd61955514841c7c6bd98965e',
4032- 'setuptools-0.6c4-py2.3.egg': 'b0b9131acab32022bfac7f44c5d7971f',
4033- 'setuptools-0.6c4-py2.4.egg': '2a1f9656d4fbf3c97bf946c0a124e6e2',
4034- 'setuptools-0.6c4-py2.5.egg': '8f5a052e32cdb9c72bcf4b5526f28afc',
4035- 'setuptools-0.6c5-py2.3.egg': 'ee9fd80965da04f2f3e6b3576e9d8167',
4036- 'setuptools-0.6c5-py2.4.egg': 'afe2adf1c01701ee841761f5bcd8aa64',
4037- 'setuptools-0.6c5-py2.5.egg': 'a8d3f61494ccaa8714dfed37bccd3d5d',
4038- 'setuptools-0.6c6-py2.3.egg': '35686b78116a668847237b69d549ec20',
4039- 'setuptools-0.6c6-py2.4.egg': '3c56af57be3225019260a644430065ab',
4040- 'setuptools-0.6c6-py2.5.egg': 'b2f8a7520709a5b34f80946de5f02f53',
4041- 'setuptools-0.6c7-py2.3.egg': '209fdf9adc3a615e5115b725658e13e2',
4042- 'setuptools-0.6c7-py2.4.egg': '5a8f954807d46a0fb67cf1f26c55a82e',
4043- 'setuptools-0.6c7-py2.5.egg': '45d2ad28f9750e7434111fde831e8372',
4044- 'setuptools-0.6c8-py2.3.egg': '50759d29b349db8cfd807ba8303f1902',
4045- 'setuptools-0.6c8-py2.4.egg': 'cba38d74f7d483c06e9daa6070cce6de',
4046- 'setuptools-0.6c8-py2.5.egg': '1721747ee329dc150590a58b3e1ac95b',
4047- 'setuptools-0.6c9-py2.3.egg': 'a83c4020414807b496e4cfbe08507c03',
4048- 'setuptools-0.6c9-py2.4.egg': '260a2be2e5388d66bdaee06abec6342a',
4049- 'setuptools-0.6c9-py2.5.egg': 'fe67c3e5a17b12c0e7c541b7ea43a8e6',
4050- 'setuptools-0.6c9-py2.6.egg': 'ca37b1ff16fa2ede6e19383e7b59245a',
4051-}
4052-
4053-import os
4054-import sys
4055-
4056-try:
4057- from hashlib import md5
4058-except ImportError:
4059- from md5 import md5
4060-
4061-
4062-def _validate_md5(egg_name, data):
4063- if egg_name in md5_data:
4064- digest = md5(data).hexdigest()
4065- if digest != md5_data[egg_name]:
4066- print >>sys.stderr, (
4067- "md5 validation of %s failed! (Possible download problem?)"
4068- % egg_name
4069- )
4070- sys.exit(2)
4071- return data
4072-
4073-
4074-def use_setuptools(
4075- version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir,
4076- download_delay=15
4077-):
4078- """Automatically find/download setuptools and make it available on sys.path
4079-
4080- `version` should be a valid setuptools version number that is available
4081- as an egg for download under the `download_base` URL (which should end with
4082- a '/'). `to_dir` is the directory where setuptools will be downloaded, if
4083- it is not already available. If `download_delay` is specified, it should
4084- be the number of seconds that will be paused before initiating a download,
4085- should one be required. If an older version of setuptools is installed,
4086- this routine will print a message to ``sys.stderr`` and raise SystemExit in
4087- an attempt to abort the calling script.
4088- """
4089- was_imported = 'pkg_resources' in sys.modules \
4090- or 'setuptools' in sys.modules
4091-
4092- def do_download():
4093- egg = download_setuptools(version, download_base, to_dir,
4094- download_delay)
4095- sys.path.insert(0, egg)
4096- import setuptools
4097- setuptools.bootstrap_install_from = egg
4098- try:
4099- import pkg_resources
4100- except ImportError:
4101- return do_download()
4102- try:
4103- pkg_resources.require("setuptools>=" + version)
4104- return
4105- except pkg_resources.VersionConflict, e:
4106- if was_imported:
4107- print >>sys.stderr, (
4108- "The required version of setuptools (>=%s) is not available, and\n"
4109- "can't be installed while this script is running. Please install\n"
4110- " a more recent version first, using 'easy_install -U setuptools'."
4111- "\n\n(Currently using %r)"
4112- ) % (version, e.args[0])
4113- sys.exit(2)
4114- else:
4115- del pkg_resources, sys.modules['pkg_resources'] # reload ok
4116- return do_download()
4117- except pkg_resources.DistributionNotFound:
4118- return do_download()
4119-
4120-
4121-def download_setuptools(
4122- version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir,
4123- delay=15
4124-):
4125- """Download setuptools from a specified location and return its filename
4126-
4127- `version` should be a valid setuptools version number that is available
4128- as an egg for download under the `download_base` URL (which should end
4129- with a '/'). `to_dir` is the directory where the egg will be downloaded.
4130- `delay` is the number of seconds to pause before an actual download
4131- attempt.
4132- """
4133- import urllib2
4134- import shutil
4135- egg_name = "setuptools-%s-py%s.egg" % (version, sys.version[:3])
4136- url = download_base + egg_name
4137- saveto = os.path.join(to_dir, egg_name)
4138- src = dst = None
4139- if not os.path.exists(saveto): # Avoid repeated downloads
4140- try:
4141- from distutils import log
4142- if delay:
4143- log.warn("""
4144----------------------------------------------------------------------------
4145-This script requires setuptools version %s to run (even to display
4146-help). I will attempt to download it for you (from
4147-%s), but
4148-you may need to enable firewall access for this script first.
4149-I will start the download in %d seconds.
4150-
4151-(Note: if this machine does not have network access, please obtain the file
4152-
4153- %s
4154-
4155-and place it in this directory before rerunning this script.)
4156----------------------------------------------------------------------------""",
4157- version, download_base, delay, url
4158- )
4159- from time import sleep
4160- sleep(delay)
4161- log.warn("Downloading %s", url)
4162- src = urllib2.urlopen(url)
4163- # Read/write all in one block, so we don't create a corrupt file
4164- # if the download is interrupted.
4165- data = _validate_md5(egg_name, src.read())
4166- dst = open(saveto, "wb")
4167- dst.write(data)
4168- finally:
4169- if src:
4170- src.close()
4171- if dst:
4172- dst.close()
4173- return os.path.realpath(saveto)
4174-
4175-
4176-def main(argv, version=DEFAULT_VERSION):
4177- """Install or upgrade setuptools and EasyInstall"""
4178- try:
4179- import setuptools
4180- except ImportError:
4181- egg = None
4182- try:
4183- egg = download_setuptools(version, delay=0)
4184- sys.path.insert(0, egg)
4185- from setuptools.command.easy_install import main
4186- return main(list(argv) + [egg]) # we're done here
4187- finally:
4188- if egg and os.path.exists(egg):
4189- os.unlink(egg)
4190- else:
4191- if setuptools.__version__ == '0.0.1':
4192- print >>sys.stderr, (
4193- "You have an obsolete version of setuptools installed. Please\n"
4194- "remove it from your system entirely before rerunning this script."
4195- )
4196- sys.exit(2)
4197-
4198- req = "setuptools>=" + version
4199- import pkg_resources
4200- try:
4201- pkg_resources.require(req)
4202- except pkg_resources.VersionConflict:
4203- try:
4204- from setuptools.command.easy_install import main
4205- except ImportError:
4206- from easy_install import main
4207- main(list(argv) + [download_setuptools(delay=0)])
4208- sys.exit(0) # try to force an exit
4209- else:
4210- if argv:
4211- from setuptools.command.easy_install import main
4212- main(argv)
4213- else:
4214- print "Setuptools version", version, "or greater has been " \
4215- + "installed."
4216- print '(Run "ez_setup.py -U setuptools" to reinstall or upgrade.)'
4217-
4218-
4219-def update_md5(filenames):
4220- """Update our built-in md5 registry"""
4221-
4222- import re
4223-
4224- for name in filenames:
4225- base = os.path.basename(name)
4226- f = open(name, 'rb')
4227- md5_data[base] = md5(f.read()).hexdigest()
4228- f.close()
4229-
4230- data = [" %r: %r,\n" % it for it in md5_data.items()]
4231- data.sort()
4232- repl = "".join(data)
4233-
4234- import inspect
4235- srcfile = inspect.getsourcefile(sys.modules[__name__])
4236- f = open(srcfile, 'rb')
4237- src = f.read()
4238- f.close()
4239-
4240- match = re.search("\nmd5_data = {\n([^}]+)}", src)
4241- if not match:
4242- print >>sys.stderr, "Internal error!"
4243- sys.exit(2)
4244-
4245- src = src[:match.start(1)] + repl + src[match.end(1):]
4246- f = open(srcfile, 'w')
4247- f.write(src)
4248- f.close()
4249-
4250-
4251-if __name__ == '__main__':
4252- if len(sys.argv) > 2 and sys.argv[1] == '--md5update':
4253- update_md5(sys.argv[2:])
4254- else:
4255- main(sys.argv[1:])
4256
4257=== modified file 'helpers/python/charmhelpers/tests/test_charmhelpers.py'
4258--- helpers/python/charmhelpers/tests/test_charmhelpers.py 2013-08-21 04:04:39 +0000
4259+++ helpers/python/charmhelpers/tests/test_charmhelpers.py 2015-08-31 19:32:56 +0000
4260@@ -3,7 +3,7 @@
4261 import unittest
4262 import yaml
4263
4264-from simplejson import dumps
4265+from json import dumps
4266 from StringIO import StringIO
4267 from testtools import TestCase
4268
4269
4270=== modified file 'requirements.txt'
4271--- requirements.txt 2015-08-24 16:35:22 +0000
4272+++ requirements.txt 2015-08-31 19:32:56 +0000
4273@@ -1,17 +1,22 @@
4274 PyYAML==3.11
4275+blessings==1.6
4276+bundletester==0.5.2
4277+bzr>=2.6.0
4278+charmworldlib>=0.4.2
4279+coverage==3.7.1
4280+flake8==1.6.2
4281+httplib2==0.7.7
4282+juju-deployer==0.4.3
4283+jujubundlelib>=0.1.9
4284+jujuclient==0.50.1
4285 launchpadlib==1.10.2
4286+mock==1.0.1
4287 nose==1.2.1
4288-requests==1.1.0
4289-lazr.authentication==0.1.2
4290-lazr.restfulclient==0.13.1
4291-lazr.uri==1.0.3
4292-simplejson==2.2.1
4293-wadllib==1.3.1
4294-httplib2==0.7.7
4295 oauth==1.0.1
4296-flake8==1.6.2
4297-mock==1.0.1
4298-coverage==3.7.1
4299-charmworldlib>=0.3.0
4300-bzr>=2.6.0
4301-jujubundlelib>=0.1.9
4302+otherstuf==1.1.0
4303+path.py==7.4
4304+pathspec==0.3.3
4305+pip>=7.1.2
4306+requests==2.7.0
4307+responses==0.4.0
4308+ruamel.yaml==0.10.2
4309
4310=== added directory 'scripts'
4311=== removed directory 'scripts'
4312=== added file 'scripts/packages.sh'
4313--- scripts/packages.sh 1970-01-01 00:00:00 +0000
4314+++ scripts/packages.sh 2015-08-31 19:32:56 +0000
4315@@ -0,0 +1,19 @@
4316+#!/bin/bash
4317+
4318+function apt_install() {
4319+ packages=$@
4320+ missing=()
4321+ for p in $packages; do
4322+ if ! dpkg-query -s $p &> /dev/null; then
4323+ missing+=($p)
4324+ fi
4325+ done
4326+ if [ -n "${missing}" ]; then
4327+ sudo apt-get update
4328+ sudo apt-get install -y ${missing}
4329+ return 1
4330+ fi
4331+ return 0
4332+}
4333+apt_install $@
4334+exit $?
4335
4336=== removed file 'scripts/test'
4337--- scripts/test 2014-05-27 21:23:41 +0000
4338+++ scripts/test 1970-01-01 00:00:00 +0000
4339@@ -1,7 +0,0 @@
4340-#!/bin/bash
4341-# If the --pdb switch is passed, inject --pdb-failures too.
4342-if [[ $* =~ --pdb( .*|$) ]]
4343-then
4344- extra_args="--pdb-failures"
4345-fi
4346-bin/nosetests --exe --with-id $extra_args $@
4347
4348=== added file 'setup.cfg'
4349--- setup.cfg 1970-01-01 00:00:00 +0000
4350+++ setup.cfg 2015-08-31 19:32:56 +0000
4351@@ -0,0 +1,8 @@
4352+[nosetests]
4353+verbosity=1
4354+detailed-errors=1
4355+#pdb=1
4356+#pdb-failures=1
4357+logging-level=INFO
4358+
4359+
4360
4361=== modified file 'setup.py'
4362--- setup.py 2015-08-24 16:35:22 +0000
4363+++ setup.py 2015-08-31 19:32:56 +0000
4364@@ -3,11 +3,6 @@
4365 # Copyright 2012 Canonical Ltd. This software is licensed under the
4366 # GNU General Public License version 3 (see the file LICENSE).
4367
4368-import ez_setup
4369-
4370-
4371-ez_setup.use_setuptools()
4372-
4373 from setuptools import setup, find_packages
4374
4375
4376@@ -18,7 +13,9 @@
4377 exclude=["*.tests", "*.tests.*", "tests.*", "tests"]),
4378 install_requires=['launchpadlib', 'argparse', 'cheetah', 'pyyaml',
4379 'pycrypto', 'paramiko', 'bzr', 'requests',
4380- 'charmworldlib', 'jujubundlelib'],
4381+ 'charmworldlib', 'blessings', 'ruamel.yaml',
4382+ 'pathspec', 'bundletester', 'otherstuf', "path.py",
4383+ "jujubundlelib"],
4384 include_package_data=True,
4385 maintainer='Marco Ceppi',
4386 maintainer_email='marco@ceppi.net',
4387@@ -33,27 +30,30 @@
4388 entry_points={
4389 'console_scripts': [
4390 'charm = charmtools:charm',
4391- 'juju-charm = charmtools:charm',
4392- 'juju-bundle = charmtools:bundle',
4393- 'juju-test = charmtools.test:main',
4394+ 'charm-add = charmtools.generate:main',
4395+ 'charm-compose = charmtools.compose:main',
4396+ 'charm-create = charmtools.create:main',
4397+ 'charm-generate = charmtools.generate:main',
4398 'charm-get = charmtools.get:main',
4399 'charm-getall = charmtools.getall:main',
4400- 'charm-proof = charmtools.proof:main',
4401- 'charm-create = charmtools.create:main',
4402+ 'charm-help = charmtools.cli:usage',
4403+ 'charm-info = charmtools.info:main',
4404+ 'charm-inspect = charmtools.compose:inspect',
4405 'charm-list = charmtools.list:main',
4406 'charm-promulgate = charmtools.promulgate:main',
4407+ 'charm-proof = charmtools.proof:main',
4408+ 'charm-refresh = charmtools.compose:main',
4409 'charm-review = charmtools.review:main',
4410 'charm-review-queue = charmtools.review_queue:main',
4411 'charm-search = charmtools.search:main',
4412 'charm-subscribers = charmtools.subscribers:main',
4413+ 'charm-test = charmtools.test:main',
4414 'charm-unpromulgate = charmtools.unpromulgate:main',
4415 'charm-update = charmtools.update:main',
4416 'charm-version = charmtools.version:main',
4417- 'charm-help = charmtools.cli:usage',
4418- 'charm-test = charmtools.test:main',
4419- 'charm-info = charmtools.info:main',
4420- 'charm-generate = charmtools.generate:main',
4421- 'charm-add = charmtools.generate:main',
4422+ 'juju-bundle = charmtools:bundle',
4423+ 'juju-charm = charmtools:charm',
4424+ 'juju-test = charmtools.test:main',
4425 ],
4426 'charmtools.templates': [
4427 'bash = charmtools.templates.bash:BashCharmTemplate',
4428
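Each `console_scripts` entry above is a `name = module:attr` pair: setuptools generates a wrapper script that imports the module and calls the named attribute, which is how `charm-compose = charmtools.compose:main` becomes a `charm-compose` executable. A minimal resolver sketch of that mechanism (not the setuptools implementation):

```python
import importlib


def resolve_entry_point(spec):
    """Resolve a setuptools 'module:attr' entry-point spec to an object.

    'charm-proof = charmtools.proof:main' means: import charmtools.proof
    and return its ``main`` callable. Dotted attribute paths after the
    colon are also supported.
    """
    module_name, _, attr_path = spec.partition(":")
    obj = importlib.import_module(module_name)
    for attr in attr_path.split("."):
        obj = getattr(obj, attr)
    return obj
```

For example, `resolve_entry_point("json:dumps")` returns the stdlib `json.dumps` function, just as the generated `charm` wrapper resolves `charmtools:charm`.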
4429=== added directory 'tests/interfaces'
4430=== added directory 'tests/interfaces/mysql'
4431=== added file 'tests/interfaces/mysql/interface.yaml'
4432--- tests/interfaces/mysql/interface.yaml 1970-01-01 00:00:00 +0000
4433+++ tests/interfaces/mysql/interface.yaml 2015-08-31 19:32:56 +0000
4434@@ -0,0 +1,1 @@
4435+name: mysql
4436
4437=== added file 'tests/interfaces/mysql/provides.py'
4438--- tests/interfaces/mysql/provides.py 1970-01-01 00:00:00 +0000
4439+++ tests/interfaces/mysql/provides.py 2015-08-31 19:32:56 +0000
4440@@ -0,0 +1,1 @@
4441+"provides"
4442
4443=== added file 'tests/interfaces/mysql/requires.py'
4444--- tests/interfaces/mysql/requires.py 1970-01-01 00:00:00 +0000
4445+++ tests/interfaces/mysql/requires.py 2015-08-31 19:32:56 +0000
4446@@ -0,0 +1,1 @@
4447+"requires"
4448
4449=== modified file 'tests/test_charm_generate.py'
4450--- tests/test_charm_generate.py 2014-11-05 22:09:11 +0000
4451+++ tests/test_charm_generate.py 2015-08-31 19:32:56 +0000
4452@@ -24,13 +24,6 @@
4453 'provides': 'nointerface'})
4454
4455 @patch('charmtools.generate.Charm')
4456- @patch('charmtools.generate.shutil')
4457- def test_copy_file(self, msh, mcharm):
4458- m = mcharm.return_value.is_charm.return_value = True
4459- copy_file('1.ex', '/tmp')
4460- msh.copy.assert_called()
4461-
4462- @patch('charmtools.generate.Charm')
4463 def test_not_charm(self, mcharm):
4464 mcharm.return_value.is_charm.return_value = False
4465 self.assertRaises(Exception, copy_file, '1.ex', '/no-charm')
4466
4467=== added file 'tests/test_compose.py'
4468--- tests/test_compose.py 1970-01-01 00:00:00 +0000
4469+++ tests/test_compose.py 2015-08-31 19:32:56 +0000
4470@@ -0,0 +1,239 @@
4471+from charmtools import compose
4472+from charmtools import utils
4473+from path import path
4474+from ruamel import yaml
4475+import json
4476+import logging
4477+import mock
4478+import os
4479+import pkg_resources
4480+import responses
4481+import unittest
4482+
4483+
4484+class TestCompose(unittest.TestCase):
4485+ def setUp(self):
4486+ dirname = pkg_resources.resource_filename(__name__, "")
4487+ os.environ["COMPOSER_PATH"] = path(dirname)
4488+ os.environ["INTERFACE_PATH"] = path(dirname) / "interfaces"
4489+ path("out").rmtree_p()
4490+
4491+ def tearDown(self):
4492+ path("out").rmtree_p()
4493+
4494+ def test_tester_compose(self):
4495+ composer = compose.Composer()
4496+ composer.log_level = "WARNING"
4497+ composer.output_dir = "out"
4498+ composer.series = "trusty"
4499+ composer.name = "foo"
4500+ composer.charm = "trusty/tester"
4501+ composer()
4502+ base = path('out/trusty/foo')
4503+ self.assertTrue(base.exists())
4504+
4505+ # Verify ignore rules applied
4506+ self.assertFalse((base / ".bzr").exists())
4507+
4508+ # Metadata should have combined provides fields
4509+ metadata = base / "metadata.yaml"
4510+ self.assertTrue(metadata.exists())
4511+ metadata_data = yaml.load(metadata.open())
4512+ self.assertIn("shared-db", metadata_data['provides'])
4513+ self.assertIn("storage", metadata_data['provides'])
4514+
4515+ # Config should have keys but not the ones in deletes
4516+ config = base / "config.yaml"
4517+ self.assertTrue(config.exists())
4518+ config_data = yaml.load(config.open())['options']
4519+ self.assertIn("bind-address", config_data)
4520+ self.assertNotIn("vip", config_data)
4521+
4522+ cyaml = base / "composer.yaml"
4523+ self.assertTrue(cyaml.exists())
4524+ cyaml_data = yaml.load(cyaml.open())
4525+ self.assertEquals(cyaml_data['includes'], ['trusty/mysql'])
4526+ self.assertEquals(cyaml_data['is'], 'foo')
4527+
4528+ self.assertTrue((base / "hooks/config-changed").exists())
4529+
4530+ # Files from the top layer as overrides
4531+ start = base / "hooks/start"
4532+ self.assertTrue(start.exists())
4533+ self.assertIn("Overridden", start.text())
4534+
4535+ self.assertTrue((base / "README.md").exists())
4536+ self.assertEqual("dynamic tactics", (base / "README.md").text())
4537+
4538+ sigs = base / ".composer.manifest"
4539+ self.assertTrue(sigs.exists())
4540+ data = json.load(sigs.open())
4541+ self.assertEquals(data['signatures']["README.md"], [
4542+ u'foo',
4543+ "static",
4544+ u'cfac20374288c097975e9f25a0d7c81783acdbc81'
4545+ '24302ff4a731a4aea10de99'])
4546+
4547+ self.assertEquals(data["signatures"]['metadata.yaml'], [
4548+ u'foo',
4549+ "dynamic",
4550+ u'8dd9059eae849c61a1bd3d8de7f96a418e'
4551+ u'f8b4bf5d9c058c413b5169e2783815',
4552+ ])
4553+
4554+ def test_regenerate_inplace(self):
4555+ # take a generated example where a base layer has changed
4556+ # regenerate in place
4557+ # make some assertions
4558+ composer = compose.Composer()
4559+ composer.log_level = "WARNING"
4560+ composer.output_dir = "out"
4561+ composer.series = "trusty"
4562+ composer.name = "foo"
4563+ composer.charm = "trusty/b"
4564+ composer()
4565+ base = path('out/trusty/foo')
4566+ self.assertTrue(base.exists())
4567+
4568+ # verify the 1st gen worked
4569+ self.assertTrue((base / "a").exists())
4570+ self.assertTrue((base / "README.md").exists())
4571+
4572+ # now regenerate from the target
4573+ with utils.cd("out/trusty/foo"):
4574+ composer = compose.Composer()
4575+ composer.log_level = "WARNING"
4576+ composer.output_dir = path(os.getcwd())
4577+ composer.series = "trusty"
4578+ # The generate target and source are now the same
4579+ composer.name = "foo"
4580+ composer.charm = "."
4581+ composer()
4582+ base = composer.output_dir
4583+ self.assertTrue(base.exists())
4584+
4585+ # Check that the generated composer makes sense
4586+ cy = base / "composer.yaml"
4587+ config = yaml.load(cy.open())
4588+ self.assertEquals(config["includes"], ["trusty/a", "interface:mysql"])
4589+ self.assertEquals(config["is"], "foo")
4590+
4591+ # We can even run it more than once
4592+ composer()
4593+ cy = base / "composer.yaml"
4594+ config = yaml.load(cy.open())
4595+ self.assertEquals(config["includes"], ["trusty/a", "interface:mysql"])
4596+ self.assertEquals(config["is"], "foo")
4597+
4598+ # We included an interface, we should be able to assert things about it
4599+ # in its final form as well
4600+ provides = base / "hooks/relations/mysql/provides.py"
4601+ requires = base / "hooks/relations/mysql/requires.py"
4602+ self.assertTrue(provides.exists())
4603+ self.assertTrue(requires.exists())
4604+
4605+ # and that we generated the hooks themselves
4606+ for kind in ["joined", "changed", "broken", "departed"]:
4607+ self.assertTrue((base / "hooks" /
4608+ "mysql-relation-{}".format(kind)).exists())
4609+
4611+ # and ensure we have an init file (the interface doesn't ship one, so it's added)
4611+ init = base / "hooks/relations/mysql/__init__.py"
4612+ self.assertTrue(init.exists())
4613+
4614+ @responses.activate
4615+ def test_remote_interface(self):
4616+ # XXX: this test actually pulls the git repo referenced in the response
4617+ responses.add(responses.GET,
4618+ "http://interfaces.juju.solutions/api/v1/interface/pgsql/",
4619+ body='''{
4620+ "id": "pgsql",
4621+ "name": "pgsql4",
4622+ "repo":
4623+ "https://github.com/bcsaller/juju-relation-pgsql.git",
4624+ "_id": {
4625+ "$oid": "55a471959c1d246feae487e5"
4626+ },
4627+ "version": 1
4628+ }''',
4629+ content_type="application/json")
4630+ composer = compose.Composer()
4631+ composer.log_level = "WARNING"
4632+ composer.output_dir = "out"
4633+ composer.series = "trusty"
4634+ composer.name = "foo"
4635+ composer.charm = "trusty/c-reactive"
4636+ composer()
4637+ base = path('out/trusty/foo')
4638+ self.assertTrue(base.exists())
4639+
4640+ # basics
4641+ self.assertTrue((base / "a").exists())
4642+ self.assertTrue((base / "README.md").exists())
4643+ # show that we pulled the interface from github
4644+ init = base / "hooks/relations/pgsql/__init__.py"
4645+ self.assertTrue(init.exists())
4646+ main = base / "hooks/reactive/main.py"
4647+ self.assertTrue(main.exists())
4648+
4649+ @mock.patch("charmtools.utils.Process")
4650+ @responses.activate
4651+ def test_remote_layer(self, mcall):
4652+ # XXX: this test actually pulls the git repo referenced in the response
4653+ responses.add(responses.GET,
4654+ "http://interfaces.juju.solutions/api/v1/layer/basic/",
4655+ body='''{
4656+ "id": "basic",
4657+ "name": "basic",
4658+ "repo":
4659+ "https://git.launchpad.net/~bcsaller/charms/+source/basic",
4660+ "_id": {
4661+ "$oid": "55a471959c1d246feae487e5"
4662+ },
4663+ "version": 1
4664+ }''',
4665+ content_type="application/json")
4666+ composer = compose.Composer()
4667+ composer.log_level = "WARNING"
4668+ composer.output_dir = "out"
4669+ composer.series = "trusty"
4670+ composer.name = "foo"
4671+ composer.charm = "trusty/use-layers"
4672+ # remove the sign phase
4673+ composer.PHASES = composer.PHASES[:-2]
4674+
4675+ composer()
4676+ base = path('out/trusty/foo')
4677+ self.assertTrue(base.exists())
4678+
4679+ # basics
4680+ self.assertTrue((base / "README.md").exists())
4681+
4682+ # show that we pulled charmhelpers from the basic layer as well
4683+ mcall.assert_called_with(("pip", "install", "-U",
4684+ '--exists-action', 'i',
4685+ "-t", mock.ANY,
4686+ mock.ANY))
4687+
4688+
4689+ @mock.patch("charmtools.utils.Process")
4690+ def test_pypi_installer(self, mcall):
4691+ composer = compose.Composer()
4692+ composer.log_level = "WARN"
4693+ composer.output_dir = "out"
4694+ composer.series = "trusty"
4695+ composer.name = "foo"
4696+ composer.charm = "trusty/chlayer"
4697+
4698+ # remove the sign phase
4699+ composer.PHASES = composer.PHASES[:-2]
4700+ composer()
4701+ mcall.assert_called_with(("pip", "install", "-U",
4702+ '--exists-action', 'i',
4703+ "-t", mock.ANY,
4704+ "charmhelpers"))
4705+
4706+
4707+if __name__ == '__main__':
4708+ logging.basicConfig()
4709+ unittest.main()
4710
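`test_tester_compose` above asserts that `.composer.manifest` records each generated file as a triple of layer name, tactic kind (`"static"` for files copied verbatim, `"dynamic"` for merged output such as `metadata.yaml`), and a 64-character content hash. A sketch of building such a signature entry, assuming SHA-256 over the file content (the real Composer hashes the generated file on disk):

```python
import hashlib


def sign(content, layer, kind):
    """Build a manifest signature triple like .composer.manifest stores.

    Returns [layer-name, tactic-kind, sha256-of-content], matching the
    shape asserted in test_tester_compose for README.md and
    metadata.yaml. Sketch only; names and hashing are assumptions.
    """
    digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
    return [layer, kind, digest]
```

Because the digest is deterministic, `charm refresh` can compare stored signatures against recomputed ones to detect files hand-edited after generation.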
4711=== added file 'tests/test_config.py'
4712--- tests/test_config.py 1970-01-01 00:00:00 +0000
4713+++ tests/test_config.py 2015-08-31 19:32:56 +0000
4714@@ -0,0 +1,29 @@
4715+import logging
4716+import unittest
4717+
4718+from charmtools.compose.config import ComposerConfig
4719+
4720+
4721+class TestConfig(unittest.TestCase):
4722+ def test_rget(self):
4723+ c = ComposerConfig()
4724+ c['a'] = 1
4725+ c = c.new_child()
4726+ c['a'] = 99
4727+ c['b'] = "alpha"
4728+ self.assertEqual(c.get('a'), 99)
4729+ self.assertEqual(c.get('b'), "alpha")
4730+ self.assertEqual(c.rget('a'), [99, 1])
4731+
4732+ def test_tactics(self):
4733+ # configure from empty and a layer with tactics
4734+ c = ComposerConfig()
4735+ c._tactics = ['a', 'b', 'c']
4736+ c = c.new_child()
4737+ c._tactics = ['d', 'c']
4738+ self.assertEqual(c.tactics()[:5], ['d', 'c', 'a', 'b', 'c'])
4739+
4740+
4741+if __name__ == '__main__':
4742+ logging.basicConfig()
4743+ unittest.main()
4744
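The `test_rget` case above shows `ComposerConfig` behaving like a chain of per-layer maps: `get` returns the topmost value, while `rget` collects the value from every layer that defines the key, top layer first. A toy stand-in built on the stdlib `ChainMap` (sketch only, not the real `ComposerConfig`):

```python
from collections import ChainMap


class LayeredConfig(ChainMap):
    """Illustrates ComposerConfig's layered lookup.

    new_child() pushes a new layer on top; get() resolves to the
    topmost layer, and rget() walks all layers top-down, which is
    exactly what test_rget exercises.
    """

    def rget(self, key):
        return [m[key] for m in self.maps if key in m]
```

Usage mirroring the test: push a child layer that shadows `a`, then observe both the shadowed lookup and the full chain.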
4745=== modified file 'tests/test_juju_test.py'
4746--- tests/test_juju_test.py 2014-06-10 21:26:52 +0000
4747+++ tests/test_juju_test.py 2015-08-31 19:32:56 +0000
4748@@ -714,15 +714,6 @@
4749 call('Failed to grab logs for dummy/0')]
4750 o.log.warn.assert_has_calls(expected_warns)
4751
4752- @patch('subprocess.check_output')
4753- @patch.object(juju_test.Orchestra, 'print_status')
4754- def test_orchestra_perform(self, mprint_status, mcheck_output):
4755- args = Arguments(tests='dummy', juju_env='testing', timeout=1)
4756- c = juju_test.Conductor(args)
4757- o = juju_test.Orchestra(c, 'test/dummy')
4758- o.perform()
4759- mprint_status.assert_called_once()
4760-
4761
4762 class TestCfgTest(unittest.TestCase):
4763 test_config = '''\
4764
4765=== added file 'tests/test_utils.py'
4766--- tests/test_utils.py 1970-01-01 00:00:00 +0000
4767+++ tests/test_utils.py 2015-08-31 19:32:56 +0000
4768@@ -0,0 +1,43 @@
4769+from unittest import TestCase
4770+from charmtools import utils
4771+from StringIO import StringIO
4772+
4773+
4774+class TestUtils(TestCase):
4775+
4776+ def test_delta_python(self):
4777+ a = StringIO("""
4778+ def foo(n):
4779+ return n * 2
4780+
4781+
4782+ @when('db.ready')
4783+ def react(db):
4784+ print db
4785+ """)
4786+
4787+ b = StringIO("""
4788+ def foo(n):
4789+ return n * 2
4790+
4791+
4792+ @when('db.ready', 'bar')
4793+ def react(db):
4794+ print db
4795+ """)
4796+
4797+ result = StringIO()
4798+ t = utils.TermWriter(fp=result)
4799+ rc = utils.delta_python_dump(a, b, utils.REACTIVE_PATTERNS,
4800+ context=3,
4801+ term=t,
4802+ from_name="Alpha",
4803+ to_name="Beta")
4804+ # return code here indicates that there was a diff
4805+ self.assertFalse(rc)
4806+ result.seek(0)
4807+ output = result.read()
4808+ self.assertIn("Alpha", output)
4809+ self.assertIn("Beta", output)
4810+ self.assertIn("@when('db.ready'", output)
4811+ self.assertIn("bar", output)
4812
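`test_delta_python` feeds two sources differing only in a `@when` decorator to `delta_python_dump`, which renders a labelled diff and returns a falsy code when differences were found. The comparison can be approximated with the stdlib (a sketch, not the real `charmtools.utils` implementation, which filters on reactive patterns and colourises via `TermWriter`):

```python
import difflib


def delta(a_text, b_text, from_name, to_name, context=3):
    """Unified diff of two sources, labelled with from/to names.

    Returns (changed, rendered_text); ``changed`` is True when the
    inputs differ, and the rendered text carries both labels, like the
    Alpha/Beta assertions in test_delta_python.
    """
    lines = list(difflib.unified_diff(
        a_text.splitlines(True), b_text.splitlines(True),
        fromfile=from_name, tofile=to_name, n=context))
    return bool(lines), "".join(lines)
```

With identical inputs `unified_diff` yields nothing, so `changed` is False and the rendered text is empty.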
4813=== added directory 'tests/trusty'
4814=== added directory 'tests/trusty/a'
4815=== added file 'tests/trusty/a/README.md'
4816--- tests/trusty/a/README.md 1970-01-01 00:00:00 +0000
4817+++ tests/trusty/a/README.md 2015-08-31 19:32:56 +0000
4818@@ -0,0 +1,1 @@
4819+From A
4820
4821=== added file 'tests/trusty/a/a'
4822--- tests/trusty/a/a 1970-01-01 00:00:00 +0000
4823+++ tests/trusty/a/a 2015-08-31 19:32:56 +0000
4824@@ -0,0 +1,1 @@
4825+from a
4826
4827=== added directory 'tests/trusty/b'
4828=== added file 'tests/trusty/b/README.md'
4829--- tests/trusty/b/README.md 1970-01-01 00:00:00 +0000
4830+++ tests/trusty/b/README.md 2015-08-31 19:32:56 +0000
4831@@ -0,0 +1,1 @@
4832+This is an overridden readme file
4833
4834=== added file 'tests/trusty/b/composer.yaml'
4835--- tests/trusty/b/composer.yaml 1970-01-01 00:00:00 +0000
4836+++ tests/trusty/b/composer.yaml 2015-08-31 19:32:56 +0000
4837@@ -0,0 +1,1 @@
4838+includes: ["trusty/a", "interface:mysql"]
4839
4840=== added file 'tests/trusty/b/metadata.yaml'
4841--- tests/trusty/b/metadata.yaml 1970-01-01 00:00:00 +0000
4842+++ tests/trusty/b/metadata.yaml 2015-08-31 19:32:56 +0000
4843@@ -0,0 +1,11 @@
4844+name: b
4845+summary: An imagined extension to the a charm
4846+maintainer: None
4847+description: |
4848+ Test layer b
4849+categories:
4850+ - app
4851+requires:
4852+ mysql:
4853+ interface: mysql
4854+
4855
4856=== added directory 'tests/trusty/c'
4857=== added directory 'tests/trusty/c-reactive'
4858=== added file 'tests/trusty/c-reactive/README.md'
4859--- tests/trusty/c-reactive/README.md 1970-01-01 00:00:00 +0000
4860+++ tests/trusty/c-reactive/README.md 2015-08-31 19:32:56 +0000
4861@@ -0,0 +1,1 @@
4862+This is an overridden readme file
4863
4864=== added file 'tests/trusty/c-reactive/composer.yaml'
4865--- tests/trusty/c-reactive/composer.yaml 1970-01-01 00:00:00 +0000
4866+++ tests/trusty/c-reactive/composer.yaml 2015-08-31 19:32:56 +0000
4867@@ -0,0 +1,1 @@
4868+includes: ["trusty/c"]
4869
4870=== added directory 'tests/trusty/c-reactive/hooks'
4871=== added directory 'tests/trusty/c-reactive/hooks/reactive'
4872=== added file 'tests/trusty/c-reactive/hooks/reactive/main.py'
4873--- tests/trusty/c-reactive/hooks/reactive/main.py 1970-01-01 00:00:00 +0000
4874+++ tests/trusty/c-reactive/hooks/reactive/main.py 2015-08-31 19:32:56 +0000
4875@@ -0,0 +1,6 @@
4876+from charmhelpers.core.reactive import when
4877+from charmhelpers.core import hookenv
4878+
4879+@when('db.database.available')
4880+def pretend_we_have_db(pgsql):
4881+ hookenv.log("Got db: %s:%s" % (pgsql.host(), pgsql.database()))
4882
4883=== added file 'tests/trusty/c/README.md'
4884--- tests/trusty/c/README.md 1970-01-01 00:00:00 +0000
4885+++ tests/trusty/c/README.md 2015-08-31 19:32:56 +0000
4886@@ -0,0 +1,1 @@
4887+This is an overridden readme file
4888
4889=== added file 'tests/trusty/c/composer.yaml'
4890--- tests/trusty/c/composer.yaml 1970-01-01 00:00:00 +0000
4891+++ tests/trusty/c/composer.yaml 2015-08-31 19:32:56 +0000
4892@@ -0,0 +1,1 @@
4893+includes: ["trusty/a", "interface:pgsql"]
4894
4895=== added file 'tests/trusty/c/metadata.yaml'
4896--- tests/trusty/c/metadata.yaml 1970-01-01 00:00:00 +0000
4897+++ tests/trusty/c/metadata.yaml 2015-08-31 19:32:56 +0000
4898@@ -0,0 +1,11 @@
4899+name: c
4900+summary: An imagined extension to the a charm
4901+maintainer: None
4902+description: |
4903+ Test layer c
4904+categories:
4905+ - app
4906+requires:
4907+ db:
4908+ interface: pgsql
4909+
4910
4911=== added directory 'tests/trusty/chlayer'
4912=== added directory 'tests/trusty/chlayer/hooks'
4913=== added file 'tests/trusty/chlayer/hooks/charmhelpers.pypi'
4914--- tests/trusty/chlayer/hooks/charmhelpers.pypi 1970-01-01 00:00:00 +0000
4915+++ tests/trusty/chlayer/hooks/charmhelpers.pypi 2015-08-31 19:32:56 +0000
4916@@ -0,0 +1,1 @@
4917+charmhelpers
4918
4919=== added directory 'tests/trusty/mysql'
4920=== added file 'tests/trusty/mysql/.bzrignore'
4921--- tests/trusty/mysql/.bzrignore 1970-01-01 00:00:00 +0000
4922+++ tests/trusty/mysql/.bzrignore 2015-08-31 19:32:56 +0000
4923@@ -0,0 +1,2 @@
4924+bin/
4925+.venv
4926
4927=== added file 'tests/trusty/mysql/Makefile'
4928--- tests/trusty/mysql/Makefile 1970-01-01 00:00:00 +0000
4929+++ tests/trusty/mysql/Makefile 2015-08-31 19:32:56 +0000
4930@@ -0,0 +1,24 @@
4931+#!/usr/bin/make
4932+PYTHON := /usr/bin/env python
4933+export PYTHONPATH := hooks
4934+
4935+virtualenv:
4936+ virtualenv .venv
4937+ .venv/bin/pip install flake8 nose mock six
4938+
4939+lint: virtualenv
4940+ .venv/bin/flake8 --exclude hooks/charmhelpers hooks
4941+ @charm proof
4942+
4943+test: virtualenv
4944+ @echo Starting tests...
4945+ @sudo apt-get install python-six
4946+ @.venv/bin/nosetests --nologcapture unit_tests
4947+
4948+bin/charm_helpers_sync.py:
4949+ @mkdir -p bin
4950+ @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
4951+ > bin/charm_helpers_sync.py
4952+
4953+sync: bin/charm_helpers_sync.py
4954+ $(PYTHON) bin/charm_helpers_sync.py -c charm-helpers.yaml
4955
4956=== added file 'tests/trusty/mysql/README.md'
4957--- tests/trusty/mysql/README.md 1970-01-01 00:00:00 +0000
4958+++ tests/trusty/mysql/README.md 2015-08-31 19:32:56 +0000
4959@@ -0,0 +1,133 @@
4960+# Overview
4961+
4962+[MySQL](http://www.mysql.com) is a fast, stable and true multi-user, multi-threaded SQL database server. SQL (Structured Query Language) is the most popular database query language in the world. The main goals of MySQL are speed, robustness and ease of use.
4963+
4964+This charm can also deploy [Percona Server](http://www.percona.com/software/percona-server), a fork of MySQL by Percona Inc. that focuses on maximizing performance, particularly for heavy workloads. It is a drop-in replacement for MySQL and features XtraDB, a drop-in replacement for the InnoDB storage engine.
4965+
4966+# Usage
4967+
4968+## General Usage
4969+
4970+To deploy a MySQL service:
4971+
4972+ juju deploy mysql
4973+
4974+Once deployed, you can retrieve the MySQL root user password by logging in to the machine via `juju ssh` and reading the `/var/lib/mysql/mysql.passwd` file. To log in to the MySQL console as the root MySQL user, issue the following:
4975+
4976+ juju ssh mysql/0
4977+ mysql -u root -p`sudo cat /var/lib/mysql/mysql.passwd`
4978+
4979+## Backups
4980+
4981+The charm supports simple backups. To enable them, set the `backup_schedule` option. Optionally, you can override the default `backup_dir` and/or `backup_retention_count`:
4982+
4983+ juju set mysql backup_schedule="45 5 * * *" # cron formatted schedule
4984+ juju set mysql backup_dir="/mnt/backup"
4985+ juju set mysql backup_retention_count=28
4986+
4987+# Scale Out Usage
4988+
4989+## Replication
4990+
4991+MySQL supports the ability to replicate databases to slave instances. This
4992+allows you, for example, to load balance read queries across multiple slaves or
4993+use a slave to perform backups, all whilst not impeding the master's
4994+performance.
4995+
4996+To deploy a slave:
4997+
4998+ # deploy second service
4999+ juju deploy mysql mysql-slave
5000+
The diff has been truncated for viewing.
