Merge lp:~bcsaller/charm-tools/composer into lp:charm-tools/1.6

Proposed by Benjamin Saller
Status: Merged
Merged at revision: 359
Proposed branch: lp:~bcsaller/charm-tools/composer
Merge into: lp:charm-tools/1.6
Diff against target: 12668 lines (+11533/-368)
117 files modified
.bzrignore (+3/-1)
MANIFEST.in (+1/-0)
Makefile (+14/-37)
charmtools/compose/__init__.py (+466/-0)
charmtools/compose/config.py (+113/-0)
charmtools/compose/diff_match_patch.py (+1919/-0)
charmtools/compose/fetchers.py (+117/-0)
charmtools/compose/inspector.py (+101/-0)
charmtools/compose/tactics.py (+449/-0)
charmtools/utils.py (+518/-0)
doc/source/compose-intro.md (+18/-0)
doc/source/composer.md (+123/-0)
ez_setup.py (+0/-272)
helpers/python/charmhelpers/tests/test_charmhelpers.py (+1/-1)
requirements.txt (+18/-13)
scripts/packages.sh (+19/-0)
scripts/test (+0/-7)
setup.cfg (+8/-0)
setup.py (+16/-16)
tests/interfaces/mysql/interface.yaml (+1/-0)
tests/interfaces/mysql/provides.py (+1/-0)
tests/interfaces/mysql/requires.py (+1/-0)
tests/test_charm_generate.py (+0/-7)
tests/test_compose.py (+239/-0)
tests/test_config.py (+29/-0)
tests/test_juju_test.py (+0/-9)
tests/test_utils.py (+43/-0)
tests/trusty/a/README.md (+1/-0)
tests/trusty/a/a (+1/-0)
tests/trusty/b/README.md (+1/-0)
tests/trusty/b/composer.yaml (+1/-0)
tests/trusty/b/metadata.yaml (+11/-0)
tests/trusty/c-reactive/README.md (+1/-0)
tests/trusty/c-reactive/composer.yaml (+1/-0)
tests/trusty/c-reactive/hooks/reactive/main.py (+6/-0)
tests/trusty/c/README.md (+1/-0)
tests/trusty/c/composer.yaml (+1/-0)
tests/trusty/c/metadata.yaml (+11/-0)
tests/trusty/chlayer/hooks/charmhelpers.pypi (+1/-0)
tests/trusty/mysql/.bzrignore (+2/-0)
tests/trusty/mysql/Makefile (+24/-0)
tests/trusty/mysql/README.md (+133/-0)
tests/trusty/mysql/charm-helpers.yaml (+9/-0)
tests/trusty/mysql/config.yaml (+141/-0)
tests/trusty/mysql/copyright (+17/-0)
tests/trusty/mysql/hooks/charmhelpers/__init__.py (+38/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/__init__.py (+15/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/charmsupport/nrpe.py (+219/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/charmsupport/volumes.py (+156/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/database/mysql.py (+385/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/network/__init__.py (+15/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/network/ip.py (+450/-0)
tests/trusty/mysql/hooks/charmhelpers/contrib/peerstorage/__init__.py (+148/-0)
tests/trusty/mysql/hooks/charmhelpers/core/__init__.py (+15/-0)
tests/trusty/mysql/hooks/charmhelpers/core/decorators.py (+41/-0)
tests/trusty/mysql/hooks/charmhelpers/core/fstab.py (+134/-0)
tests/trusty/mysql/hooks/charmhelpers/core/hookenv.py (+568/-0)
tests/trusty/mysql/hooks/charmhelpers/core/host.py (+446/-0)
tests/trusty/mysql/hooks/charmhelpers/core/services/__init__.py (+18/-0)
tests/trusty/mysql/hooks/charmhelpers/core/services/base.py (+329/-0)
tests/trusty/mysql/hooks/charmhelpers/core/services/helpers.py (+267/-0)
tests/trusty/mysql/hooks/charmhelpers/core/strutils.py (+42/-0)
tests/trusty/mysql/hooks/charmhelpers/core/sysctl.py (+56/-0)
tests/trusty/mysql/hooks/charmhelpers/core/templating.py (+69/-0)
tests/trusty/mysql/hooks/charmhelpers/core/unitdata.py (+477/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/__init__.py (+439/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/archiveurl.py (+161/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/bzrurl.py (+78/-0)
tests/trusty/mysql/hooks/charmhelpers/fetch/giturl.py (+71/-0)
tests/trusty/mysql/hooks/common.py (+109/-0)
tests/trusty/mysql/hooks/config-changed (+414/-0)
tests/trusty/mysql/hooks/data-relation.py (+31/-0)
tests/trusty/mysql/hooks/db-relation-broken (+21/-0)
tests/trusty/mysql/hooks/db-relation-joined (+87/-0)
tests/trusty/mysql/hooks/ha_relations.py (+163/-0)
tests/trusty/mysql/hooks/install (+49/-0)
tests/trusty/mysql/hooks/master-relation-changed (+95/-0)
tests/trusty/mysql/hooks/monitors-relation-broken (+8/-0)
tests/trusty/mysql/hooks/monitors-relation-departed (+3/-0)
tests/trusty/mysql/hooks/monitors-relation-joined (+9/-0)
tests/trusty/mysql/hooks/monitors.common.bash (+8/-0)
tests/trusty/mysql/hooks/munin-relation-changed (+26/-0)
tests/trusty/mysql/hooks/munin-relation-joined (+6/-0)
tests/trusty/mysql/hooks/nrpe_relations.py (+91/-0)
tests/trusty/mysql/hooks/shared_db_relations.py (+153/-0)
tests/trusty/mysql/hooks/slave-relation-broken (+11/-0)
tests/trusty/mysql/hooks/slave-relation-changed (+89/-0)
tests/trusty/mysql/hooks/slave-relation-joined (+2/-0)
tests/trusty/mysql/hooks/start (+5/-0)
tests/trusty/mysql/hooks/stop (+3/-0)
tests/trusty/mysql/hooks/upgrade-charm (+27/-0)
tests/trusty/mysql/icon.svg (+335/-0)
tests/trusty/mysql/keys/repo.percona.com (+30/-0)
tests/trusty/mysql/metadata.yaml (+43/-0)
tests/trusty/mysql/monitors.yaml (+13/-0)
tests/trusty/mysql/revision (+1/-0)
tests/trusty/mysql/scripts/add_to_cluster (+13/-0)
tests/trusty/mysql/scripts/charm_helpers_sync.py (+225/-0)
tests/trusty/mysql/scripts/mysql_backup.sh (+30/-0)
tests/trusty/mysql/scripts/remove_from_cluster (+4/-0)
tests/trusty/mysql/templates/apparmor.j2 (+15/-0)
tests/trusty/mysql/templates/mysql_backup.j2 (+12/-0)
tests/trusty/mysql/tests/00-setup (+12/-0)
tests/trusty/mysql/tests/15-configs (+77/-0)
tests/trusty/mysql/unit_tests/test_mysql_common.py (+18/-0)
tests/trusty/tester/README.md (+1/-0)
tests/trusty/tester/composer.yaml (+8/-0)
tests/trusty/tester/generate/custom.py (+17/-0)
tests/trusty/tester/hooks/start (+1/-0)
tests/trusty/tester/metadata.yaml (+14/-0)
tests/trusty/use-layers/README.md (+1/-0)
tests/trusty/use-layers/composer.yaml (+1/-0)
tests/trusty/use-layers/hooks/reactive/main.py (+6/-0)
tests_functional/add/test.sh (+2/-2)
tests_functional/create/test.sh (+4/-2)
tests_functional/proof/record.sh (+1/-1)
tox.ini (+21/-0)
To merge this branch: bzr merge lp:~bcsaller/charm-tools/composer
Reviewer Review Type Date Requested Status
Tim Van Steenburgh (community) Approve
Cory Johns (community) Needs Fixing
Marco Ceppi Pending
Review via email: mp+266281@code.launchpad.net

Description of the change

This adds the charm composer functionality
and ports the project to use tox

Revision history for this message
Cory Johns (johnsca) wrote :

Need to add blessings, ruamel.yaml, pathspec, and bundletester to the install_requires in setup.py
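
In practice the requested change is one entry per missing package in setup.py's install_requires; a minimal sketch (the package names come from the comment above, while the surrounding setup() metadata is illustrative only, not this branch's actual setup.py):

    from setuptools import setup, find_packages

    setup(
        name='charm-tools',  # illustrative metadata only
        packages=find_packages(),
        install_requires=[
            # ...existing requirements stay as-is, plus the new compose deps:
            'blessings',
            'ruamel.yaml',
            'pathspec',
            'bundletester',
        ],
    )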

Revision history for this message
Cory Johns (johnsca) :
review: Needs Fixing
Revision history for this message
Adam Israel (aisrael) wrote :

Hi Ben,

Per our earlier conversation, I'd also like to see some documentation on how a charm author would evaluate and use composer. I'm very excited to see that myself and to give composer a spin.

lp:~bcsaller/charm-tools/composer updated
358. By Benjamin Saller

merge lp:~johnsca/charm-tools/compose

359. By Benjamin Saller

various composer fixes

360. By Benjamin Saller

without the pdb

361. By Benjamin Saller

show paths and so on with -l DEBUG, tests do less actual remote work

362. By Benjamin Saller

compose cli help a little better

363. By Benjamin Saller

fix bzrignore

364. By Benjamin Saller

fix typo

Revision history for this message
Charles Butler (lazypower) wrote :

Sorry this took me so long to circle back, but I've tried this branch of charm tools and it appears the manifest is not fetching all the dependencies.

What I did:

bzr branch lp:~bcsaller/charm-tools/composer/
virtualenv .venv
source .venv/bin/activate

pip install ./

charm compose -h
ImportError: No module named path

pip install path.py
charm compose -h
ImportError: No module named otherstuf

pip install otherstuf
charm compose -h

OSError: [Errno 2] No such file or directory: '/home/charles/projects/work/composer/.venv/local/lib/python2.7/site-packages/charmtools/compose/../../doc/source/compose-intro.md'

Once I ran through that dependency hoop, compose appears to be sorted and available.

Revision history for this message
Cory Johns (johnsca) wrote :

Fix for the missing deps and help error here: https://code.launchpad.net/~johnsca/charm-tools/compose/+merge/268164

Revision history for this message
Cory Johns (johnsca) wrote :

Updated my most recent MP to fix the default value for --name when composing the current dir (e.g., "charm compose ." or just "charm compose").
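
Conceptually that fix matches what Composer.name does in the preview diff below: resolve the charm argument to an absolute path before taking its basename, so "." names the charm after the real directory. A hypothetical sketch of the behaviour, not the patch itself:

    import os

    def default_charm_name(charm_arg="."):
        # abspath() resolves "." to the current working directory first,
        # so basename() yields a meaningful charm name instead of ".".
        return os.path.basename(os.path.abspath(charm_arg))

    # e.g. run from /home/user/charms/mycharm:
    #   default_charm_name(".") -> "mycharm"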

lp:~bcsaller/charm-tools/composer updated
365. By Benjamin Saller

fix deps in setup, better searching for charm.name

366. By Benjamin Saller

change default interface address to public one

367. By Benjamin Saller

update to work with real DNS and service

368. By Benjamin Saller

remove find name call

369. By Benjamin Saller

various fixes around naming and patch to install to ignore existing deps, also depend on modern pip

370. By Benjamin Saller

repair tests

371. By Benjamin Saller

fix/remove broken tests w/updated bundletester

372. By Benjamin Saller

patch for inspect to work with more varied naming

373. By Benjamin Saller

force key order on metadata.yaml merges

374. By Benjamin Saller

fix tests (to reflect remote changes to basic layer) and ordered metadata rendering

375. By Benjamin Saller

update notes on workflow

376. By Benjamin Saller

various fixes and cleanups. Change installer to do better signing

377. By Benjamin Saller

merge trunk

Revision history for this message
Tim Van Steenburgh (tvansteenburgh) wrote :

I'm eager to merge this but can't get the tests to run. I always run `make check` on charm-tools since that includes the integration tests as well. According to the Makefile that should still work, but I get:

tvansteenburgh@trusty-vm:/tmp/charm-tools> make clean
find . -name '*.py[co]' -delete
find . -type f -name '*~' -delete
find . -name '*.bak' -delete
rm -rf bin include lib local man dependencies
tvansteenburgh@trusty-vm:/tmp/charm-tools> make check
bzr checkout lp:~juju-jitsu/charm-tools/dependencies
tox --develop
py27 create: /tmp/charm-tools/.tox/py27
py27 installdeps: -r/tmp/charm-tools/requirements.txt
ERROR: invocation failed, logfile: /tmp/charm-tools/.tox/py27/log/py27-1.log
ERROR: actionid=py27
msg=getenv
cmdargs=[local('/tmp/charm-tools/.tox/py27/bin/pip'), 'install', '--no-index', '-f', 'dependencies/python', '-r/tmp/charm-tools/requirements.txt']
env={'BYOBU_TTY': '/dev/pts/1', 'UPSTART_EVENTS': 'started starting', 'SHELL': '/bin/bash', 'XDG_DATA_DIRS': '/usr/share/ubuntu:/usr/share/gnome:/usr/local/share/:/usr/share/', 'MANDATORY_PATH': '/usr/share/gconf/ubuntu.mandatory.path', 'CLUTTER_IM_MODULE': 'xim', 'BYOBU_RUN_DIR': '/dev/shm/byobu-tvansteenburgh-KRSOmrmO', 'UPSTART_INSTANCE': '', 'JOB': 'gnome-session', 'TEXTDOMAIN': 'im-config', 'XMODIFIERS': '@im=ibus', 'MFLAGS': '', 'SELINUX_INIT': 'YES', 'BYOBU_SED': 'sed', 'BYOBU_LIGHT': '#EEEEEE', 'DESKTOP_SESSION': 'ubuntu', 'BYOBU_DATE': '%Y-%m-%d ', 'XDG_SESSION_ID': 'c1', 'DBUS_SESSION_BUS_ADDRESS': 'unix:abstract=/tmp/dbus-m7XWOMt928', 'DEFAULTS_PATH': '/usr/share/gconf/ubuntu.default.path', 'LESS_TERMCAP_ue': '\x1b[0m', 'GTK_MODULES': 'overlay-scrollbar:unity-gtk-module', 'INSTANCE': 'Unity', 'LESS_TERMCAP_us': '\x1b[04;38;5;139m', 'LS_COLORS': 'rs=0:di=38;5;5:ln=4;5;37:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:su=37;41:sg=30;43:ca=30;41:tw=3;28:ow=34;42:st=37;44:ex=38;5;202:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.mid=00;36:*.midi=00;36:*.mka=...


review: Needs Fixing
Revision history for this message
Tim Van Steenburgh (tvansteenburgh) wrote :

I got integration tests to run by changing this in the Makefile:

-build: deps develop
+build: deps

But then I got a bunch of legit failures from the tests:

2 tvansteenburgh@trusty-vm:/tmp/charm-tools> make check
tests_functional/helpers/helpers.sh || sh -x tests_functional/helpers/helpers.sh timeout
Fri Aug 28 11:04:24 EDT 2015: Testing ch_apparmor_load...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing ch_type_hash...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing ch_is_url...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing Testing ch_is_ip...PASS
Fri Aug 28 11:04:24 EDT 2015: Testing Testing ch_get_ip...PASS
Fri Aug 28 11:04:24 EDT 2015: Starting SimpleHTTPServer in /tmp/charm-helper-srv.BlmbCS on port 8999 to test fetching files.
Fri Aug 28 11:04:24 EDT 2015: Looping wget until webserver responds...
Fri Aug 28 11:04:25 EDT 2015: Attempt 1 succeeded.
Fri Aug 28 11:04:25 EDT 2015: Creating temp data file
Fri Aug 28 11:04:25 EDT 2015: creating gzipped test data
Fri Aug 28 11:04:25 EDT 2015: Testing ch_get_file...PASS
Fri Aug 28 11:04:25 EDT 2015: Shutting down webserver...DONE
Fri Aug 28 11:04:25 EDT 2015: Printing server log
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:25] "GET /testdata.txt HTTP/1.1" 200 -
Fri Aug 28 11:04:25 EDT 2015: Testing ch_unit_name...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_unit_id...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_my_unit_id...PASS
Test shell helpers with bash
bash tests_functional/helpers/helpers.sh \
            || bash -x tests_functional/helpers/helpers.sh timeout
Fri Aug 28 11:04:25 EDT 2015: Testing ch_apparmor_load...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_type_hash...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing ch_is_url...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing Testing ch_is_ip...PASS
Fri Aug 28 11:04:25 EDT 2015: Testing Testing ch_get_ip...PASS
Fri Aug 28 11:04:25 EDT 2015: Starting SimpleHTTPServer in /tmp/charm-helper-srv.TOq187 on port 8999 to test fetching files.
Fri Aug 28 11:04:25 EDT 2015: Looping wget until webserver responds...
Fri Aug 28 11:04:26 EDT 2015: Attempt 1 succeeded.
Fri Aug 28 11:04:26 EDT 2015: Creating temp data file
Fri Aug 28 11:04:26 EDT 2015: creating gzipped test data
Fri Aug 28 11:04:26 EDT 2015: Testing ch_get_file...PASS
Fri Aug 28 11:04:26 EDT 2015: Shutting down webserver...DONE
Fri Aug 28 11:04:26 EDT 2015: Printing server log
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET / HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt HTTP/1.1" 200 -
127.0.0.1 - - [28/Aug/2015 11:04:26] "GET /testdata.txt.gz HTTP/1.1" 200 -
127.0.0...

review: Needs Fixing
lp:~bcsaller/charm-tools/composer updated
378. By Benjamin Saller

merge trunk

379. By Benjamin Saller

path fixes on integration tests

Revision history for this message
Benjamin Saller (bcsaller) wrote :

Should be fixed; these were very minor path issues in the tests, which now point to the tox virtualenv.

lp:~bcsaller/charm-tools/composer updated
380. By Benjamin Saller

notest on develop

381. By Benjamin Saller

pass term env

382. By Benjamin Saller

move passenv

Revision history for this message
Tim Van Steenburgh (tvansteenburgh) wrote :

Tests pass after upgrading tox to 2.1.1 (1.6.0 failed), LGTM.
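
For anyone hitting the same tox-version failure, a hypothetical guard along these lines would fail fast with a clearer message (not part of this branch):

    import pkg_resources

    def require_modern_tox(minimum="2.1.1"):
        # tox 1.6.0 failed to run these tests while 2.1.1 worked.
        installed = pkg_resources.get_distribution("tox").parsed_version
        if installed < pkg_resources.parse_version(minimum):
            raise SystemExit(
                "tox >= {} required, found {}".format(minimum, installed))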

review: Approve

Preview Diff

=== modified file '.bzrignore'
--- .bzrignore 2014-01-14 03:23:17 +0000
+++ .bzrignore 2015-08-31 19:32:56 +0000
@@ -1,4 +1,4 @@
-tests/proof/results/*
+tests_functional/proof/results/*
 .coverage
 .noseids
 charm_tools.egg-info
@@ -13,3 +13,5 @@
 dist
 tests/.ropeproject/
 charmtools/.ropeproject/
+.ropeproject/
+.tox/
=== modified file 'MANIFEST.in'
--- MANIFEST.in 2014-06-19 23:18:53 +0000
+++ MANIFEST.in 2015-08-31 19:32:56 +0000
@@ -1,3 +1,4 @@
 include *.py README*
+include doc/source/composer.md
 recursive-include charmtools *
 recursive-exclude charmtools *.pyc
=== modified file 'Makefile'
--- Makefile 2014-06-20 19:46:59 +0000
+++ Makefile 2015-08-31 19:32:56 +0000
@@ -17,44 +17,23 @@
 confdir = $(DESTDIR)/etc
 INSTALL = install
 
-# We use a "canary" file to tell us if the package has been installed in
-# "develop" mode.
-DEVELOP_CANARY := lib/__develop_canary
-develop: $(DEVELOP_CANARY)
-$(DEVELOP_CANARY): | python-deps
-	bin/python setup.py develop
-	touch $(DEVELOP_CANARY)
+develop:
+	tox --develop --notest
 
-build: deps develop bin/test
+build: deps develop
 
 dependencies:
 	bzr checkout lp:~juju-jitsu/charm-tools/dependencies
 
-# We use a "canary" file to tell us if the Python packages have been installed.
-PYTHON_PACKAGE_CANARY := lib/python2.7/site-packages/___canary
-python-deps: $(PYTHON_PACKAGE_CANARY)
-$(PYTHON_PACKAGE_CANARY): requirements.txt | dependencies
-	sudo apt-get update
-	sudo apt-get install -y build-essential bzr python-dev \
-		python-virtualenv
-	virtualenv .
-	bin/pip install --no-index --no-dependencies --find-links \
-		file:///$(WD)/dependencies/python -r requirements.txt
-	touch $(PYTHON_PACKAGE_CANARY)
+PYTHON_DEPS=build-essential bzr python-dev python-tox
+python-deps: scripts/packages.sh
+	$(if $(shell ./scripts/packages.sh $(PYTHON_DEPS)), \
+		tox -r --notest)
 
 deps: python-deps | dependencies
 
-bin/nosetests: python-deps
-
-bin/test: | bin/nosetests
-	ln scripts/test bin/test
-
-test: build bin/test
-	bin/test
-
-lint: sources = setup.py charmtools
-lint: build
-	@find $(sources) -name '*.py' -print0 | xargs -r0 bin/flake8
+test:
+	tox
 
 tags:
 	ctags --tag-relative --python-kinds=-iv -Rf tags --sort=yes \
@@ -93,12 +72,11 @@
 	tests_functional/proof/test.sh
 	tests_functional/create/test.sh
 	tests_functional/add/test.sh
-# PYTHONPATH=helpers/python python helpers/python/charmhelpers/tests/test_charmhelpers.py
 
-coverage: build bin/test
-	bin/test --with-coverage --cover-package=charmtools --cover-tests
+coverage: build
+	tox
 
-check: build integration test lint
+check: build integration test
 
 define phony
 	build
@@ -106,7 +84,6 @@
 	clean
 	deps
 	install
-	lint
 	tags
 	test
 endef
=== added directory 'charmtools/compose'
=== added file 'charmtools/compose/__init__.py'
--- charmtools/compose/__init__.py 1970-01-01 00:00:00 +0000
+++ charmtools/compose/__init__.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,466 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
import argparse
import json
import logging
import os
import sys

import blessings
from collections import OrderedDict
from path import path
import yaml
from charmtools.compose import inspector
import charmtools.compose.tactics
from charmtools.compose.config import (ComposerConfig, DEFAULT_IGNORES)
from charmtools.compose.fetchers import (InterfaceFetcher,
                                         LayerFetcher,
                                         get_fetcher,
                                         FetchError)
from charmtools import utils

log = logging.getLogger("composer")


class Configable(object):
    CONFIG_FILE = None

    def __init__(self):
        self._config = ComposerConfig()
        self.config_file = None

    @property
    def config(self):
        if self._config.configured:
            return self._config
        if self.config_file and self.config_file.exists():
            self._config.configure(self.config_file)
        return self._config

    @property
    def configured(self):
        return bool(self.config is not None and self.config.configured)


class Fetched(Configable):
    def __init__(self, url, target_repo, name=None):
        super(Fetched, self).__init__()
        self.url = url
        self.target_repo = target_repo
        self.directory = None
        self._name = name

    @property
    def name(self):
        if self._name:
            return self._name
        if self.url.startswith(self.NAMESPACE):
            return self.url[len(self.NAMESPACE):]
        return self.url

    def __repr__(self):
        return "<{} {}:{}>".format(self.__class__.__name__,
                                   self.url, self.directory)

    def __div__(self, other):
        return self.directory / other

    def fetch(self):
        try:
            fetcher = get_fetcher(self.url)
        except FetchError:
            # We might be passing a local dir path directly
            # which fetchers don't currently support
            self.directory = path(self.url)
        else:
            if hasattr(fetcher, "path") and fetcher.path.exists():
                self.directory = path(fetcher.path)
            else:
                if not self.target_repo.exists():
                    self.target_repo.makedirs_p()
                self.directory = path(fetcher.fetch(self.target_repo))

        if not self.directory.exists():
            raise OSError(
                "Unable to locate {}. "
                "Do you need to set {}?".format(
                    self.url, self.ENVIRON))

        self.config_file = self.directory / self.CONFIG_FILE
        self._name = self.config.name
        return self


class Interface(Fetched):
    CONFIG_FILE = "interface.yaml"
    NAMESPACE = "interface"
    ENVIRON = "INTERFACE_PATH"


class Layer(Fetched):
    CONFIG_FILE = "composer.yaml"
    NAMESPACE = "layer"
    ENVIRON = "COMPOSER_PATH"


class Composer(object):
    """
    Handle the processing of overrides, implements the policy of ComposerConfig
    """
    PHASES = ['lint', 'read', 'call', 'sign', 'build']

    def __init__(self):
        self.config = ComposerConfig()
        self.force = False
        self._name = None
        self._charm = None

    @property
    def charm(self):
        return self._charm

    @charm.setter
    def charm(self, value):
        self._charm = path(value)

    @property
    def name(self):
        if self._name:
            return self._name

        # optionally extract name from the top layer
        self._name = str(path(self.charm).abspath().name)

        # however if the current layer has a metadata.yaml we can
        # use its name
        md = path(self.charm) / "metadata.yaml"
        if md.exists():
            data = yaml.load(md.open())
            name = data.get("name")
            if name:
                self._name = name
        return self._name

    @name.setter
    def name(self, value):
        self._name = value

    @property
    def charm_name(self):
        return "{}/{}".format(self.series, self.name)

    def status(self):
        result = {}
        result.update(vars(self))
        for e in ["COMPOSER_PATH", "INTERFACE_PATH", "JUJU_REPOSITORY"]:
            result[e] = os.environ.get(e)
        return result

    def create_repo(self):
        # Generated output will go into this directory
        base = path(self.output_dir)
        self.repo = (base / self.series)
        # And anything it includes from will be placed here
        # outside the series
        self.deps = (base / "deps" / self.series)
        self.target_dir = (self.repo / self.name)

    def find_or_create_repo(self, allow_create=True):
        # see if output dir is already in a repo, we can use that directly
        if self.output_dir == path(self.charm).normpath():
            # we've indicated in the cmdline that we are doing an inplace
            # update
            if self.output_dir.parent.basename() == self.series:
                # we're already in a repo
                self.repo = self.output_dir.parent.parent
                self.deps = (self.repo / "deps" / self.series)
                self.target_dir = self.output_dir
                return
        if allow_create:
            self.create_repo()
        else:
            raise ValueError("%s doesn't seem valid", self.charm.directory)

    def fetch(self):
        layer = Layer(self.charm, self.deps).fetch()
        if not layer.configured:
            log.info("The top level layer expects a "
                     "valid composer.yaml file, "
                     "using defaults.")
        # Manually create a layer object for the output
        self.target = Layer(self.name, self.repo)
        self.target.directory = self.target_dir
        return self.fetch_deps(layer)

    def fetch_deps(self, layer):
        results = {"layers": [], "interfaces": []}
        self.fetch_dep(layer, results)
        # results should now be a bottom up list
        # of deps. Using the in order results traversal
        # we can build out our plan for each file in the
        # output layer
        results["layers"].append(layer)
        self._layers = results["layers"]
        self._interfaces = results["interfaces"]
        return results

    @property
    def layers(self):
        layers = []
        for i in self._layers:
            layers.append(i.url)
        for i in self._interfaces:
            layers.append(i.url)
        layers.append("composer")
        return layers

    def fetch_dep(self, layer, results):
        # Recursively fetch and scan layers
        # This returns a plan for each file in the result
        baselayers = layer.config.get('includes', [])
        if not baselayers:
            # no deps, this is possible for any base
            # but questionable for the target
            return

        if isinstance(baselayers, str):
            baselayers = [baselayers]

        for base in baselayers:
            if base.startswith("interface:"):
                iface = Interface(base, self.deps).fetch()
                results["interfaces"].append(iface)
            else:
                base_layer = Layer(base, self.deps).fetch()
                self.fetch_dep(base_layer, results)
                results["layers"].append(base_layer)

    def build_tactics(self, entry, current, config, output_files):
        # Delegate to the config object, it's rules
        # will produce a tactic
        relname = entry.relpath(current.directory)
        current = current.config.tactic(entry, current, self.target, config)
        existing = output_files.get(relname)
        if existing is not None:
            tactic = current.combine(existing)
        else:
            tactic = current
        output_files[relname] = tactic

    def plan_layers(self, layers, output_files):
        config = ComposerConfig()
        config = config.add_config(
            layers["layers"][0] / ComposerConfig.DEFAULT_FILE, True)

        layers["layers"][-1].url = self.name

        for i, layer in enumerate(layers["layers"]):
            log.info("Processing layer: %s", layer.url)
            if i + 1 < len(layers["layers"]):
                next_layer = layers["layers"][i + 1]
                config = config.add_config(
                    next_layer / ComposerConfig.DEFAULT_FILE, True)
            list(e for e in utils.walk(layer.directory,
                                       self.build_tactics,
                                       current=layer,
                                       config=config,
                                       output_files=output_files))
        plan = [t for t in output_files.values() if t]
        return plan

    def plan_interfaces(self, layers, output_files, plan):
        # Interface includes don't directly map to output files
        # as they are computed in combination with the metadata.yaml
        charm_meta = output_files.get("metadata.yaml")
        if charm_meta:
            meta = charm_meta()
            if not meta:
                return
            target_config = layers["layers"][-1].config
            specs = []
            used_interfaces = set()
            for kind in ("provides", "requires", "peer"):
                for k, v in meta.get(kind, {}).items():
                    # ex: ["provides", "db", "mysql"]
                    specs.append([kind, k, v["interface"]])
                    used_interfaces.add(v["interface"])

            for iface in layers["interfaces"]:
                if iface.name not in used_interfaces:
                    # we shouldn't include something the charm doesn't use
                    log.warn("composer.yaml includes {} which isn't "
                             "used in metadata.yaml".format(
                                 iface.name))
                    continue
                for kind, relation_name, interface_name in specs:
                    if interface_name != iface.name:
                        continue
                    # COPY phase
                    plan.append(
                        charmtools.compose.tactics.InterfaceCopy(
                            iface, relation_name,
                            self.target, target_config)
                        )
                    # Link Phase
                    plan.append(
                        charmtools.compose.tactics.InterfaceBind(
                            iface, relation_name, kind,
                            self.target, target_config))
        elif not charm_meta and layers["interfaces"]:
            raise ValueError(
                "Includes interfaces but no metadata.yaml to bind them")

    def formulate_plan(self, layers):
        """Build out a plan for each file in the various composed
        layers, taking into account config at each layer"""
        output_files = OrderedDict()
        self.plan = self.plan_layers(layers, output_files)
        self.plan_interfaces(layers, output_files, self.plan)
        return self.plan

    def exec_plan(self, plan=None, layers=None):
        signatures = {}
        cont = True
        for phase in self.PHASES:
            for tactic in plan:
                if phase == "lint":
                    cont &= tactic.lint()
                    if cont is False and self.force is not True:
                        break
                elif phase == "read":
                    # We use a read (into memory phase to make layer comps
                    # simpler)
                    tactic.read()
                elif phase == "call":
                    tactic()
                elif phase == "sign":
                    sig = tactic.sign()
                    if sig:
                        signatures.update(sig)
                elif phase == "build":
                    tactic.build()
        # write out the sigs
        if "sign" in self.PHASES:
            self.write_signatures(signatures, layers)

    def write_signatures(self, signatures, layers):
        sigs = self.target / ".composer.manifest"
        signatures['.composer.manifest'] = ["composer", 'dynamic', 'unchecked']
        sigs.write_text(json.dumps(dict(
            signatures=signatures,
            layers=layers,
        ), indent=2))

    def generate(self):
        layers = self.fetch()
        self.formulate_plan(layers)
        self.exec_plan(self.plan, self.layers)

    def validate(self):
        p = self.target_dir / ".composer.manifest"
        if not p.exists():
            return [], [], []
        ignorer = utils.ignore_matcher(DEFAULT_IGNORES)
        a, c, d = utils.delta_signatures(p, ignorer)

        for f in a:
            log.warn(
                "Added unexpected file, should be in a base layer: %s", f)
        for f in c:
            log.warn(
                "Changed file owned by another layer: %s", f)
        for f in d:
            log.warn(
                "Deleted a file owned by another layer: %s", f)
        if a or c or d:
            if self.force is True:
                log.info(
                    "Continuing with known changes to target layer. "
                    "Changes will be overwritten")
            else:
                raise ValueError(
                    "Unable to continue due to unexpected modifications")
        return a, c, d

    def __call__(self):
        self.find_or_create_repo()

        log.debug(json.dumps(
            self.status(), indent=2, sort_keys=True, default=str))
        self.validate()
        self.generate()

    def inspect(self):
        self.charm = path(self.charm).abspath()
        inspector.inspect(self.charm)

    def normalize_outputdir(self):
        od = path(self.charm).normpath()
        repo = os.environ.get('JUJU_REPOSITORY')
        if repo:
            repo = path(repo)
            if repo.exists():
                od = repo
        elif ":" in od:
            od = od.basename
        log.info("Composing into {}".format(od))
        self.output_dir = od


def configLogging(composer):
    global log
    clifmt = utils.ColoredFormatter(
        blessings.Terminal(),
        '%(name)s: %(message)s')
    root_logger = logging.getLogger()
    clihandler = logging.StreamHandler(sys.stdout)
    clihandler.setFormatter(clifmt)
    if isinstance(composer.log_level, str):
        composer.log_level = composer.log_level.upper()
    root_logger.setLevel(composer.log_level)
    log.setLevel(composer.log_level)
    root_logger.addHandler(clihandler)
    requests_logger = logging.getLogger("requests")
    requests_logger.setLevel(logging.WARNING)


def inspect(args=None):
    composer = Composer()
    parser = argparse.ArgumentParser()
    parser.add_argument('-l', '--log-level', default=logging.INFO)
    parser.add_argument('charm', nargs="?", default=".", type=path)
    # Namespace will set the options as attrs of composer
    parser.parse_args(args, namespace=composer)
    configLogging(composer)
    composer.inspect()


def main(args=None):
    composer = Composer()
    parser = argparse.ArgumentParser(
        description="Compose layers into a charm",
        formatter_class=argparse.RawDescriptionHelpFormatter,)
    parser.add_argument('-l', '--log-level', default=logging.INFO)
    parser.add_argument('-f', '--force', action="store_true")
    parser.add_argument('-o', '--output-dir', type=path)
    parser.add_argument('-s', '--series', default="trusty")
    parser.add_argument('--interface-service',
                        default="http://interfaces.juju.solutions")
    parser.add_argument('-n', '--name',
                        help="Generate a charm of 'name' from 'charm'")
    parser.add_argument('charm', nargs="?", default=".", type=path)
    # Namespace will set the options as attrs of composer
    parser.parse_args(args, namespace=composer)
    # Monkey patch in the domain for the interface webservice
    InterfaceFetcher.INTERFACE_DOMAIN = composer.interface_service
    LayerFetcher.INTERFACE_DOMAIN = composer.interface_service
    configLogging(composer)

    if not composer.output_dir:
        composer.normalize_outputdir()

    composer()


if __name__ == '__main__':
    main()
=== added file 'charmtools/compose/config.py'
--- charmtools/compose/config.py 1970-01-01 00:00:00 +0000
+++ charmtools/compose/config.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,113 @@
from .tactics import DEFAULT_TACTICS, load_tactic


import pathspec
from ruamel import yaml
import logging
from path import path
from otherstuf import chainstuf

DEFAULT_IGNORES = [
    ".bzr/",
    ".git/",
    "**/.ropeproject/",
    "*.pyc",
    "*~",
    ".tox/",
    "build/",
]


class ComposerConfig(chainstuf):
    """Defaults for controlling the generator, each layer in
    the inclusion graph can provide values, including things
    like overrides, or warnings if things are overridden that
    shouldn't be.
    """
    DEFAULT_FILE = "composer.yaml"

    def __init__(self, *args, **kwargs):
        super(ComposerConfig, self).__init__(*args, **kwargs)
        self['_tactics'] = []
        self.configured = False

    def __getattr__(self, key):
        return self[key]

    def rget(self, key):
        """Combine all the results from all the layers into a single iter"""
        result = []
        for m in self.maps:
            r = m.get(key)
            if r:
                if isinstance(r, (list, tuple)):
                    result.extend(r)
                else:
                    result.append(r)
        return result

    def configure(self, config_file, allow_missing=False):
        config_file = path(config_file)
        data = None
        if not config_file.exists() and not allow_missing:
            raise OSError("Missing Config File {}".format(config_file))
        try:
            if config_file.exists():
                data = yaml.load(config_file.open())
                self.configured = True
        except yaml.parser.ParserError:
            logging.critical("Malformed Config file: {}".format(config_file))
            raise
        if data:
            self.update(data)
        # look at any possible imports and use them to build tactics
        tactics = self.get('tactics')
        basedir = config_file.dirname()
        if tactics:
            for name in tactics:
                tactic = load_tactic(name, basedir)
                self._tactics.append(tactic)
        return self

    @classmethod
    def from_config(cls, config_file, allow_missing=False):
        c = cls()
        c.configure(config_file, allow_missing)
        return c

    def add_config(self, config_file, allow_missing=False):
        c = self.new_child()
        c.configure(config_file, allow_missing)
        return c

    @property
    def name(self):
        return self.get('name')

    @property
    def ignores(self):
        return self.rget('ignore') + DEFAULT_IGNORES

    def tactics(self):
        # XXX: combine from config layer
        return self.rget('_tactics') + DEFAULT_TACTICS

    def tactic(self, entity, current, target, next_config):
        # Produce a tactic for the entity in question
        # These will be accumulate through the layers
        # and executed later
        bd = current.directory
        # Ignore handling
        if next_config:
            spec = pathspec.PathSpec.from_lines(pathspec.GitIgnorePattern,
                                                next_config.ignores)
            p = entity.relpath(bd)
            matches = spec.match_files((p,))
            if p in matches:
                return None

        for tactic in self.tactics():
            if tactic.trigger(entity.relpath(bd)):
                return tactic(target=target, entity=entity,
                              current=current, config=next_config)
        return None
=== added file 'charmtools/compose/diff_match_patch.py'
--- charmtools/compose/diff_match_patch.py 1970-01-01 00:00:00 +0000
+++ charmtools/compose/diff_match_patch.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1919 @@
1#!/usr/bin/python2.4
2
3from __future__ import division
4
5"""Diff Match and Patch
6
7Copyright 2006 Google Inc.
8http://code.google.com/p/google-diff-match-patch/
9
10Licensed under the Apache License, Version 2.0 (the "License");
11you may not use this file except in compliance with the License.
12You may obtain a copy of the License at
13
14 http://www.apache.org/licenses/LICENSE-2.0
15
16Unless required by applicable law or agreed to in writing, software
17distributed under the License is distributed on an "AS IS" BASIS,
18WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
19See the License for the specific language governing permissions and
20limitations under the License.
21"""
22
23"""Functions for diff, match and patch.
24
25Computes the difference between two texts to create a patch.
26Applies the patch onto another text, allowing for errors.
27"""
28
29__author__ = 'fraser@google.com (Neil Fraser)'
30
31import math
32import re
33import sys
34import time
35import urllib
36
37class diff_match_patch:
38 """Class containing the diff, match and patch methods.
39
40 Also contains the behaviour settings.
41 """
42
43 def __init__(self):
44 """Inits a diff_match_patch object with default settings.
45 Redefine these in your program to override the defaults.
46 """
47
48 # Number of seconds to map a diff before giving up (0 for infinity).
49 self.Diff_Timeout = 1.0
50 # Cost of an empty edit operation in terms of edit characters.
51 self.Diff_EditCost = 4
52 # At what point is no match declared (0.0 = perfection, 1.0 = very loose).
53 self.Match_Threshold = 0.5
54 # How far to search for a match (0 = exact location, 1000+ = broad match).
55 # A match this many characters away from the expected location will add
56 # 1.0 to the score (0.0 is a perfect match).
57 self.Match_Distance = 1000
58 # When deleting a large block of text (over ~64 characters), how close do
59 # the contents have to be to match the expected contents. (0.0 = perfection,
60 # 1.0 = very loose). Note that Match_Threshold controls how closely the
61 # end points of a delete need to match.
62 self.Patch_DeleteThreshold = 0.5
63 # Chunk size for context length.
64 self.Patch_Margin = 4
65
66 # The number of bits in an int.
67 # Python has no maximum, thus to disable patch splitting set to 0.
68 # However to avoid long patches in certain pathological cases, use 32.
69 # Multiple short patches (using native ints) are much faster than long ones.
70 self.Match_MaxBits = 32
71
72 # DIFF FUNCTIONS
73
74 # The data structure representing a diff is an array of tuples:
75 # [(DIFF_DELETE, "Hello"), (DIFF_INSERT, "Goodbye"), (DIFF_EQUAL, " world.")]
76 # which means: delete "Hello", add "Goodbye" and keep " world."
77 DIFF_DELETE = -1
78 DIFF_INSERT = 1
79 DIFF_EQUAL = 0
80
81 def diff_main(self, text1, text2, checklines=True, deadline=None):
82 """Find the differences between two texts. Simplifies the problem by
83 stripping any common prefix or suffix off the texts before diffing.
84
85 Args:
86 text1: Old string to be diffed.
87 text2: New string to be diffed.
88 checklines: Optional speedup flag. If present and false, then don't run
89 a line-level diff first to identify the changed areas.
90 Defaults to true, which does a faster, slightly less optimal diff.
91 deadline: Optional time when the diff should be complete by. Used
92 internally for recursive calls. Users should set DiffTimeout instead.
93
94 Returns:
95 Array of changes.
96 """
97 # Set a deadline by which time the diff must be complete.
98 if deadline == None:
99 # Unlike in most languages, Python counts time in seconds.
100 if self.Diff_Timeout <= 0:
101 deadline = sys.maxint
102 else:
103 deadline = time.time() + self.Diff_Timeout
104
105 # Check for null inputs.
106 if text1 == None or text2 == None:
107 raise ValueError("Null inputs. (diff_main)")
108
109 # Check for equality (speedup).
110 if text1 == text2:
111 if text1:
112 return [(self.DIFF_EQUAL, text1)]
113 return []
114
115 # Trim off common prefix (speedup).
116 commonlength = self.diff_commonPrefix(text1, text2)
117 commonprefix = text1[:commonlength]
118 text1 = text1[commonlength:]
119 text2 = text2[commonlength:]
120
121 # Trim off common suffix (speedup).
122 commonlength = self.diff_commonSuffix(text1, text2)
123 if commonlength == 0:
124 commonsuffix = ''
125 else:
126 commonsuffix = text1[-commonlength:]
127 text1 = text1[:-commonlength]
128 text2 = text2[:-commonlength]
129
130 # Compute the diff on the middle block.
131 diffs = self.diff_compute(text1, text2, checklines, deadline)
132
133 # Restore the prefix and suffix.
134 if commonprefix:
135 diffs[:0] = [(self.DIFF_EQUAL, commonprefix)]
136 if commonsuffix:
137 diffs.append((self.DIFF_EQUAL, commonsuffix))
138 self.diff_cleanupMerge(diffs)
139 return diffs
140
141 def diff_compute(self, text1, text2, checklines, deadline):
142 """Find the differences between two texts. Assumes that the texts do not
143 have any common prefix or suffix.
144
145 Args:
146 text1: Old string to be diffed.
147 text2: New string to be diffed.
148 checklines: Speedup flag. If false, then don't run a line-level diff
149 first to identify the changed areas.
150 If true, then run a faster, slightly less optimal diff.
151 deadline: Time when the diff should be complete by.
152
153 Returns:
154 Array of changes.
155 """
156 if not text1:
157 # Just add some text (speedup).
158 return [(self.DIFF_INSERT, text2)]
159
160 if not text2:
161 # Just delete some text (speedup).
162 return [(self.DIFF_DELETE, text1)]
163
164 if len(text1) > len(text2):
165 (longtext, shorttext) = (text1, text2)
166 else:
167 (shorttext, longtext) = (text1, text2)
168 i = longtext.find(shorttext)
169 if i != -1:
170 # Shorter text is inside the longer text (speedup).
171 diffs = [(self.DIFF_INSERT, longtext[:i]), (self.DIFF_EQUAL, shorttext),
172 (self.DIFF_INSERT, longtext[i + len(shorttext):])]
173 # Swap insertions for deletions if diff is reversed.
174 if len(text1) > len(text2):
175 diffs[0] = (self.DIFF_DELETE, diffs[0][1])
176 diffs[2] = (self.DIFF_DELETE, diffs[2][1])
177 return diffs
178
179 if len(shorttext) == 1:
180 # Single character string.
181 # After the previous speedup, the character can't be an equality.
182 return [(self.DIFF_DELETE, text1), (self.DIFF_INSERT, text2)]
183
184 # Check to see if the problem can be split in two.
185 hm = self.diff_halfMatch(text1, text2)
186 if hm:
187 # A half-match was found, sort out the return data.
188 (text1_a, text1_b, text2_a, text2_b, mid_common) = hm
189 # Send both pairs off for separate processing.
190 diffs_a = self.diff_main(text1_a, text2_a, checklines, deadline)
191 diffs_b = self.diff_main(text1_b, text2_b, checklines, deadline)
192 # Merge the results.
193 return diffs_a + [(self.DIFF_EQUAL, mid_common)] + diffs_b
194
195 if checklines and len(text1) > 100 and len(text2) > 100:
196 return self.diff_lineMode(text1, text2, deadline)
197
198 return self.diff_bisect(text1, text2, deadline)
199
200 def diff_lineMode(self, text1, text2, deadline):
201 """Do a quick line-level diff on both strings, then rediff the parts for
202 greater accuracy.
203 This speedup can produce non-minimal diffs.
204
205 Args:
206 text1: Old string to be diffed.
207 text2: New string to be diffed.
208 deadline: Time when the diff should be complete by.
209
210 Returns:
211 Array of changes.
212 """
213
214 # Scan the text on a line-by-line basis first.
215 (text1, text2, linearray) = self.diff_linesToChars(text1, text2)
216
217 diffs = self.diff_main(text1, text2, False, deadline)
218
219 # Convert the diff back to original text.
220 self.diff_charsToLines(diffs, linearray)
221 # Eliminate freak matches (e.g. blank lines)
222 self.diff_cleanupSemantic(diffs)
223
224 # Rediff any replacement blocks, this time character-by-character.
225 # Add a dummy entry at the end.
226 diffs.append((self.DIFF_EQUAL, ''))
227 pointer = 0
228 count_delete = 0
229 count_insert = 0
230 text_delete = ''
231 text_insert = ''
232 while pointer < len(diffs):
233 if diffs[pointer][0] == self.DIFF_INSERT:
234 count_insert += 1
235 text_insert += diffs[pointer][1]
236 elif diffs[pointer][0] == self.DIFF_DELETE:
237 count_delete += 1
238 text_delete += diffs[pointer][1]
239 elif diffs[pointer][0] == self.DIFF_EQUAL:
240 # Upon reaching an equality, check for prior redundancies.
241 if count_delete >= 1 and count_insert >= 1:
242 # Delete the offending records and add the merged ones.
243 a = self.diff_main(text_delete, text_insert, False, deadline)
244 diffs[pointer - count_delete - count_insert : pointer] = a
245 pointer = pointer - count_delete - count_insert + len(a)
246 count_insert = 0
247 count_delete = 0
248 text_delete = ''
249 text_insert = ''
250
251 pointer += 1
252
253 diffs.pop() # Remove the dummy entry at the end.
254
255 return diffs
256
257 def diff_bisect(self, text1, text2, deadline):
258 """Find the 'middle snake' of a diff, split the problem in two
259 and return the recursively constructed diff.
260 See Myers 1986 paper: An O(ND) Difference Algorithm and Its Variations.
261
262 Args:
263 text1: Old string to be diffed.
264 text2: New string to be diffed.
265 deadline: Time at which to bail if not yet complete.
266
267 Returns:
268 Array of diff tuples.
269 """
270
271 # Cache the text lengths to prevent multiple calls.
272 text1_length = len(text1)
273 text2_length = len(text2)
274 max_d = (text1_length + text2_length + 1) // 2
275 v_offset = max_d
276 v_length = 2 * max_d
277 v1 = [-1] * v_length
278 v1[v_offset + 1] = 0
279 v2 = v1[:]
280 delta = text1_length - text2_length
281 # If the total number of characters is odd, then the front path will
282 # collide with the reverse path.
283 front = (delta % 2 != 0)
284 # Offsets for start and end of k loop.
285 # Prevents mapping of space beyond the grid.
286 k1start = 0
287 k1end = 0
288 k2start = 0
289 k2end = 0
290 for d in xrange(max_d):
291 # Bail out if deadline is reached.
292 if time.time() > deadline:
293 break
294
295 # Walk the front path one step.
296 for k1 in xrange(-d + k1start, d + 1 - k1end, 2):
297 k1_offset = v_offset + k1
298 if k1 == -d or (k1 != d and
299 v1[k1_offset - 1] < v1[k1_offset + 1]):
300 x1 = v1[k1_offset + 1]
301 else:
302 x1 = v1[k1_offset - 1] + 1
303 y1 = x1 - k1
304 while (x1 < text1_length and y1 < text2_length and
305 text1[x1] == text2[y1]):
306 x1 += 1
307 y1 += 1
308 v1[k1_offset] = x1
309 if x1 > text1_length:
310 # Ran off the right of the graph.
311 k1end += 2
312 elif y1 > text2_length:
313 # Ran off the bottom of the graph.
314 k1start += 2
315 elif front:
316 k2_offset = v_offset + delta - k1
317 if k2_offset >= 0 and k2_offset < v_length and v2[k2_offset] != -1:
318 # Mirror x2 onto top-left coordinate system.
319 x2 = text1_length - v2[k2_offset]
320 if x1 >= x2:
321 # Overlap detected.
322 return self.diff_bisectSplit(text1, text2, x1, y1, deadline)
323
324 # Walk the reverse path one step.
325 for k2 in xrange(-d + k2start, d + 1 - k2end, 2):
326 k2_offset = v_offset + k2
327 if k2 == -d or (k2 != d and
328 v2[k2_offset - 1] < v2[k2_offset + 1]):
329 x2 = v2[k2_offset + 1]
330 else:
331 x2 = v2[k2_offset - 1] + 1
332 y2 = x2 - k2
333 while (x2 < text1_length and y2 < text2_length and
334 text1[-x2 - 1] == text2[-y2 - 1]):
335 x2 += 1
336 y2 += 1
337 v2[k2_offset] = x2
338 if x2 > text1_length:
339 # Ran off the left of the graph.
340 k2end += 2
341 elif y2 > text2_length:
342 # Ran off the top of the graph.
343 k2start += 2
344 elif not front:
345 k1_offset = v_offset + delta - k2
346 if k1_offset >= 0 and k1_offset < v_length and v1[k1_offset] != -1:
347 x1 = v1[k1_offset]
348 y1 = v_offset + x1 - k1_offset
349 # Mirror x2 onto top-left coordinate system.
350 x2 = text1_length - x2
351 if x1 >= x2:
352 # Overlap detected.
353 return self.diff_bisectSplit(text1, text2, x1, y1, deadline)
354
355 # Diff took too long and hit the deadline or
356 # number of diffs equals number of characters, no commonality at all.
357 return [(self.DIFF_DELETE, text1), (self.DIFF_INSERT, text2)]
358
359 def diff_bisectSplit(self, text1, text2, x, y, deadline):
360 """Given the location of the 'middle snake', split the diff in two parts
361 and recurse.
362
363 Args:
364 text1: Old string to be diffed.
365 text2: New string to be diffed.
366 x: Index of split point in text1.
367 y: Index of split point in text2.
368 deadline: Time at which to bail if not yet complete.
369
370 Returns:
371 Array of diff tuples.
372 """
373 text1a = text1[:x]
374 text2a = text2[:y]
375 text1b = text1[x:]
376 text2b = text2[y:]
377
378 # Compute both diffs serially.
379 diffs = self.diff_main(text1a, text2a, False, deadline)
380 diffsb = self.diff_main(text1b, text2b, False, deadline)
381
382 return diffs + diffsb
383
384 def diff_linesToChars(self, text1, text2):
385 """Split two texts into an array of strings. Reduce the texts to a string
386 of hashes where each Unicode character represents one line.
387
388 Args:
389 text1: First string.
390 text2: Second string.
391
392 Returns:
393 Three element tuple, containing the encoded text1, the encoded text2 and
394 the array of unique strings. The zeroth element of the array of unique
395 strings is intentionally blank.
396 """
397 lineArray = [] # e.g. lineArray[4] == "Hello\n"
398 lineHash = {} # e.g. lineHash["Hello\n"] == 4
399
400 # "\x00" is a valid character, but various debuggers don't like it.
401 # So we'll insert a junk entry to avoid generating a null character.
402 lineArray.append('')
403
404 def diff_linesToCharsMunge(text):
405 """Split a text into an array of strings. Reduce the texts to a string
406 of hashes where each Unicode character represents one line.
407 Modifies linearray and linehash through being a closure.
408
409 Args:
410 text: String to encode.
411
412 Returns:
413 Encoded string.
414 """
415 chars = []
416 # Walk the text, pulling out a substring for each line.
417 # text.split('\n') would would temporarily double our memory footprint.
418 # Modifying text would create many large strings to garbage collect.
419 lineStart = 0
420 lineEnd = -1
421 while lineEnd < len(text) - 1:
422 lineEnd = text.find('\n', lineStart)
423 if lineEnd == -1:
424 lineEnd = len(text) - 1
425 line = text[lineStart:lineEnd + 1]
426 lineStart = lineEnd + 1
427
428 if line in lineHash:
429 chars.append(unichr(lineHash[line]))
430 else:
431 lineArray.append(line)
432 lineHash[line] = len(lineArray) - 1
433 chars.append(unichr(len(lineArray) - 1))
434 return "".join(chars)
435
436 chars1 = diff_linesToCharsMunge(text1)
437 chars2 = diff_linesToCharsMunge(text2)
438 return (chars1, chars2, lineArray)
439
440 def diff_charsToLines(self, diffs, lineArray):
441 """Rehydrate the text in a diff from a string of line hashes to real lines
442 of text.
443
444 Args:
445 diffs: Array of diff tuples.
446 lineArray: Array of unique strings.
447 """
448 for x in xrange(len(diffs)):
449 text = []
450 for char in diffs[x][1]:
451 text.append(lineArray[ord(char)])
452 diffs[x] = (diffs[x][0], "".join(text))
453
454 def diff_commonPrefix(self, text1, text2):
455 """Determine the common prefix of two strings.
456
457 Args:
458 text1: First string.
459 text2: Second string.
460
461 Returns:
462 The number of characters common to the start of each string.
463 """
464 # Quick check for common null cases.
465 if not text1 or not text2 or text1[0] != text2[0]:
466 return 0
467 # Binary search.
468 # Performance analysis: http://neil.fraser.name/news/2007/10/09/
469 pointermin = 0
470 pointermax = min(len(text1), len(text2))
471 pointermid = pointermax
472 pointerstart = 0
473 while pointermin < pointermid:
474 if text1[pointerstart:pointermid] == text2[pointerstart:pointermid]:
475 pointermin = pointermid
476 pointerstart = pointermin
477 else:
478 pointermax = pointermid
479 pointermid = (pointermax - pointermin) // 2 + pointermin
480 return pointermid
481
482 def diff_commonSuffix(self, text1, text2):
483 """Determine the common suffix of two strings.
484
485 Args:
486 text1: First string.
487 text2: Second string.
488
489 Returns:
490 The number of characters common to the end of each string.
491 """
492 # Quick check for common null cases.
493 if not text1 or not text2 or text1[-1] != text2[-1]:
494 return 0
495 # Binary search.
496 # Performance analysis: http://neil.fraser.name/news/2007/10/09/
497 pointermin = 0
498 pointermax = min(len(text1), len(text2))
499 pointermid = pointermax
500 pointerend = 0
501 while pointermin < pointermid:
502 if (text1[-pointermid:len(text1) - pointerend] ==
503 text2[-pointermid:len(text2) - pointerend]):
504 pointermin = pointermid
505 pointerend = pointermin
506 else:
507 pointermax = pointermid
508 pointermid = (pointermax - pointermin) // 2 + pointermin
509 return pointermid
510
511 def diff_commonOverlap(self, text1, text2):
512 """Determine if the suffix of one string is the prefix of another.
513
514 Args:
515 text1 First string.
516 text2 Second string.
517
518 Returns:
519 The number of characters common to the end of the first
520 string and the start of the second string.
521 """
522 # Cache the text lengths to prevent multiple calls.
523 text1_length = len(text1)
524 text2_length = len(text2)
525 # Eliminate the null case.
526 if text1_length == 0 or text2_length == 0:
527 return 0
528 # Truncate the longer string.
529 if text1_length > text2_length:
530 text1 = text1[-text2_length:]
531 elif text1_length < text2_length:
532 text2 = text2[:text1_length]
533 text_length = min(text1_length, text2_length)
534 # Quick check for the worst case.
535 if text1 == text2:
536 return text_length
537
538 # Start by looking for a single character match
539 # and increase length until no match is found.
540 # Performance analysis: http://neil.fraser.name/news/2010/11/04/
541 best = 0
542 length = 1
543 while True:
544 pattern = text1[-length:]
545 found = text2.find(pattern)
546 if found == -1:
547 return best
548 length += found
549 if found == 0 or text1[-length:] == text2[:length]:
550 best = length
551 length += 1
552
553 def diff_halfMatch(self, text1, text2):
554 """Do the two texts share a substring which is at least half the length of
555 the longer text?
556 This speedup can produce non-minimal diffs.
557
558 Args:
559 text1: First string.
560 text2: Second string.
561
562 Returns:
563 Five element Array, containing the prefix of text1, the suffix of text1,
564 the prefix of text2, the suffix of text2 and the common middle. Or None
565 if there was no match.
566 """
567 if self.Diff_Timeout <= 0:
568 # Don't risk returning a non-optimal diff if we have unlimited time.
569 return None
570 if len(text1) > len(text2):
571 (longtext, shorttext) = (text1, text2)
572 else:
573 (shorttext, longtext) = (text1, text2)
574 if len(longtext) < 4 or len(shorttext) * 2 < len(longtext):
575 return None # Pointless.
576
577 def diff_halfMatchI(longtext, shorttext, i):
578 """Does a substring of shorttext exist within longtext such that the
579 substring is at least half the length of longtext?
580 Closure, but does not reference any external variables.
581
582 Args:
583 longtext: Longer string.
584 shorttext: Shorter string.
585 i: Start index of quarter length substring within longtext.
586
587 Returns:
588 Five element Array, containing the prefix of longtext, the suffix of
589 longtext, the prefix of shorttext, the suffix of shorttext and the
590 common middle. Or None if there was no match.
591 """
592 seed = longtext[i:i + len(longtext) // 4]
593 best_common = ''
594 j = shorttext.find(seed)
595 while j != -1:
596 prefixLength = self.diff_commonPrefix(longtext[i:], shorttext[j:])
597 suffixLength = self.diff_commonSuffix(longtext[:i], shorttext[:j])
598 if len(best_common) < suffixLength + prefixLength:
599 best_common = (shorttext[j - suffixLength:j] +
600 shorttext[j:j + prefixLength])
601 best_longtext_a = longtext[:i - suffixLength]
602 best_longtext_b = longtext[i + prefixLength:]
603 best_shorttext_a = shorttext[:j - suffixLength]
604 best_shorttext_b = shorttext[j + prefixLength:]
605 j = shorttext.find(seed, j + 1)
606
607 if len(best_common) * 2 >= len(longtext):
608 return (best_longtext_a, best_longtext_b,
609 best_shorttext_a, best_shorttext_b, best_common)
610 else:
611 return None
612
613 # First check if the second quarter is the seed for a half-match.
614 hm1 = diff_halfMatchI(longtext, shorttext, (len(longtext) + 3) // 4)
615 # Check again based on the third quarter.
616 hm2 = diff_halfMatchI(longtext, shorttext, (len(longtext) + 1) // 2)
617 if not hm1 and not hm2:
618 return None
619 elif not hm2:
620 hm = hm1
621 elif not hm1:
622 hm = hm2
623 else:
624 # Both matched. Select the longest.
625 if len(hm1[4]) > len(hm2[4]):
626 hm = hm1
627 else:
628 hm = hm2
629
630 # A half-match was found, sort out the return data.
631 if len(text1) > len(text2):
632 (text1_a, text1_b, text2_a, text2_b, mid_common) = hm
633 else:
634 (text2_a, text2_b, text1_a, text1_b, mid_common) = hm
635 return (text1_a, text1_b, text2_a, text2_b, mid_common)
636
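  # Illustrative sketch for diff_halfMatch above, assuming `dmp` with the
  # default Diff_Timeout (> 0):
  #   dmp.diff_halfMatch("1234567890", "a345678z")
  #   # -> ('12', '90', 'a', 'z', '345678')
  # "345678" is the common middle; the pieces around it are then diffed
  # separately, trading minimality for speed.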
637 def diff_cleanupSemantic(self, diffs):
638 """Reduce the number of edits by eliminating semantically trivial
639 equalities.
640
641 Args:
642 diffs: Array of diff tuples.
643 """
644 changes = False
645 equalities = [] # Stack of indices where equalities are found.
646 lastequality = None # Always equal to diffs[equalities[-1]][1]
647 pointer = 0 # Index of current position.
648 # Number of chars that changed prior to the equality.
649 length_insertions1, length_deletions1 = 0, 0
650 # Number of chars that changed after the equality.
651 length_insertions2, length_deletions2 = 0, 0
652 while pointer < len(diffs):
653 if diffs[pointer][0] == self.DIFF_EQUAL: # Equality found.
654 equalities.append(pointer)
655 length_insertions1, length_insertions2 = length_insertions2, 0
656 length_deletions1, length_deletions2 = length_deletions2, 0
657 lastequality = diffs[pointer][1]
658 else: # An insertion or deletion.
659 if diffs[pointer][0] == self.DIFF_INSERT:
660 length_insertions2 += len(diffs[pointer][1])
661 else:
662 length_deletions2 += len(diffs[pointer][1])
663 # Eliminate an equality that is smaller or equal to the edits on both
664 # sides of it.
665 if (lastequality and (len(lastequality) <=
666 max(length_insertions1, length_deletions1)) and
667 (len(lastequality) <= max(length_insertions2, length_deletions2))):
668 # Duplicate record.
669 diffs.insert(equalities[-1], (self.DIFF_DELETE, lastequality))
670 # Change second copy to insert.
671 diffs[equalities[-1] + 1] = (self.DIFF_INSERT,
672 diffs[equalities[-1] + 1][1])
673 # Throw away the equality we just deleted.
674 equalities.pop()
675 # Throw away the previous equality (it needs to be reevaluated).
676 if len(equalities):
677 equalities.pop()
678 if len(equalities):
679 pointer = equalities[-1]
680 else:
681 pointer = -1
682 # Reset the counters.
683 length_insertions1, length_deletions1 = 0, 0
684 length_insertions2, length_deletions2 = 0, 0
685 lastequality = None
686 changes = True
687 pointer += 1
688
689 # Normalize the diff.
690 if changes:
691 self.diff_cleanupMerge(diffs)
692 self.diff_cleanupSemanticLossless(diffs)
693
694 # Find any overlaps between deletions and insertions.
695 # e.g: <del>abcxxx</del><ins>xxxdef</ins>
696 # -> <del>abc</del>xxx<ins>def</ins>
697 # e.g: <del>xxxabc</del><ins>defxxx</ins>
698 # -> <ins>def</ins>xxx<del>abc</del>
699 # Only extract an overlap if it is as big as the edit ahead or behind it.
700 pointer = 1
701 while pointer < len(diffs):
702 if (diffs[pointer - 1][0] == self.DIFF_DELETE and
703 diffs[pointer][0] == self.DIFF_INSERT):
704 deletion = diffs[pointer - 1][1]
705 insertion = diffs[pointer][1]
706 overlap_length1 = self.diff_commonOverlap(deletion, insertion)
707 overlap_length2 = self.diff_commonOverlap(insertion, deletion)
708 if overlap_length1 >= overlap_length2:
709 if (overlap_length1 >= len(deletion) / 2.0 or
710 overlap_length1 >= len(insertion) / 2.0):
711 # Overlap found. Insert an equality and trim the surrounding edits.
712 diffs.insert(pointer, (self.DIFF_EQUAL,
713 insertion[:overlap_length1]))
714 diffs[pointer - 1] = (self.DIFF_DELETE,
715 deletion[:len(deletion) - overlap_length1])
716 diffs[pointer + 1] = (self.DIFF_INSERT,
717 insertion[overlap_length1:])
718 pointer += 1
719 else:
720 if (overlap_length2 >= len(deletion) / 2.0 or
721 overlap_length2 >= len(insertion) / 2.0):
722 # Reverse overlap found.
723 # Insert an equality and swap and trim the surrounding edits.
724 diffs.insert(pointer, (self.DIFF_EQUAL, deletion[:overlap_length2]))
725 diffs[pointer - 1] = (self.DIFF_INSERT,
726 insertion[:len(insertion) - overlap_length2])
727 diffs[pointer + 1] = (self.DIFF_DELETE, deletion[overlap_length2:])
728 pointer += 1
729 pointer += 1
730 pointer += 1
731
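  # Illustrative sketch for diff_cleanupSemantic above, assuming `dmp`:
  # a one-character equality wedged between two edits is not worth keeping,
  #   diffs = [(dmp.DIFF_DELETE, "a"), (dmp.DIFF_EQUAL, "b"),
  #            (dmp.DIFF_DELETE, "c")]
  #   dmp.diff_cleanupSemantic(diffs)
  #   # diffs -> [(dmp.DIFF_DELETE, "abc"), (dmp.DIFF_INSERT, "b")]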
732 def diff_cleanupSemanticLossless(self, diffs):
733 """Look for single edits surrounded on both sides by equalities
734 which can be shifted sideways to align the edit to a word boundary.
735 e.g: The c<ins>at c</ins>ame. -> The <ins>cat </ins>came.
736
737 Args:
738 diffs: Array of diff tuples.
739 """
740
741 def diff_cleanupSemanticScore(one, two):
742 """Given two strings, compute a score representing whether the
743 internal boundary falls on logical boundaries.
744 Scores range from 6 (best) to 0 (worst).
745 Closure, but does not reference any external variables.
746
747 Args:
748 one: First string.
749 two: Second string.
750
751 Returns:
752 The score.
753 """
754 if not one or not two:
755 # Edges are the best.
756 return 6
757
758 # Each port of this function behaves slightly differently due to
759 # subtle differences in each language's definition of things like
760 # 'whitespace'. Since this function's purpose is largely cosmetic,
761 # the choice has been made to use each language's native features
762 # rather than force total conformity.
763 char1 = one[-1]
764 char2 = two[0]
765 nonAlphaNumeric1 = not char1.isalnum()
766 nonAlphaNumeric2 = not char2.isalnum()
767 whitespace1 = nonAlphaNumeric1 and char1.isspace()
768 whitespace2 = nonAlphaNumeric2 and char2.isspace()
769 lineBreak1 = whitespace1 and (char1 == "\r" or char1 == "\n")
770 lineBreak2 = whitespace2 and (char2 == "\r" or char2 == "\n")
771 blankLine1 = lineBreak1 and self.BLANKLINEEND.search(one)
772 blankLine2 = lineBreak2 and self.BLANKLINESTART.match(two)
773
774 if blankLine1 or blankLine2:
775 # Five points for blank lines.
776 return 5
777 elif lineBreak1 or lineBreak2:
778 # Four points for line breaks.
779 return 4
780 elif nonAlphaNumeric1 and not whitespace1 and whitespace2:
781 # Three points for end of sentences.
782 return 3
783 elif whitespace1 or whitespace2:
784 # Two points for whitespace.
785 return 2
786 elif nonAlphaNumeric1 or nonAlphaNumeric2:
787 # One point for non-alphanumeric.
788 return 1
789 return 0
790
791 pointer = 1
792 # Intentionally ignore the first and last element (don't need checking).
793 while pointer < len(diffs) - 1:
794 if (diffs[pointer - 1][0] == self.DIFF_EQUAL and
795 diffs[pointer + 1][0] == self.DIFF_EQUAL):
796 # This is a single edit surrounded by equalities.
797 equality1 = diffs[pointer - 1][1]
798 edit = diffs[pointer][1]
799 equality2 = diffs[pointer + 1][1]
800
801 # First, shift the edit as far left as possible.
802 commonOffset = self.diff_commonSuffix(equality1, edit)
803 if commonOffset:
804 commonString = edit[-commonOffset:]
805 equality1 = equality1[:-commonOffset]
806 edit = commonString + edit[:-commonOffset]
807 equality2 = commonString + equality2
808
809 # Second, step character by character right, looking for the best fit.
810 bestEquality1 = equality1
811 bestEdit = edit
812 bestEquality2 = equality2
813 bestScore = (diff_cleanupSemanticScore(equality1, edit) +
814 diff_cleanupSemanticScore(edit, equality2))
815 while edit and equality2 and edit[0] == equality2[0]:
816 equality1 += edit[0]
817 edit = edit[1:] + equality2[0]
818 equality2 = equality2[1:]
819 score = (diff_cleanupSemanticScore(equality1, edit) +
820 diff_cleanupSemanticScore(edit, equality2))
821 # The >= encourages trailing rather than leading whitespace on edits.
822 if score >= bestScore:
823 bestScore = score
824 bestEquality1 = equality1
825 bestEdit = edit
826 bestEquality2 = equality2
827
828 if diffs[pointer - 1][1] != bestEquality1:
829 # We have an improvement, save it back to the diff.
830 if bestEquality1:
831 diffs[pointer - 1] = (diffs[pointer - 1][0], bestEquality1)
832 else:
833 del diffs[pointer - 1]
834 pointer -= 1
835 diffs[pointer] = (diffs[pointer][0], bestEdit)
836 if bestEquality2:
837 diffs[pointer + 1] = (diffs[pointer + 1][0], bestEquality2)
838 else:
839 del diffs[pointer + 1]
840 pointer -= 1
841 pointer += 1
842
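  # Illustrative sketch for diff_cleanupSemanticLossless above, assuming
  # `dmp`: the edit is slid sideways to land on a word boundary,
  #   diffs = [(dmp.DIFF_EQUAL, "The c"), (dmp.DIFF_INSERT, "ow and the c"),
  #            (dmp.DIFF_EQUAL, "at.")]
  #   dmp.diff_cleanupSemanticLossless(diffs)
  #   # diffs -> [(dmp.DIFF_EQUAL, "The "), (dmp.DIFF_INSERT, "cow and the "),
  #   #           (dmp.DIFF_EQUAL, "cat.")]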
843 # Define some regex patterns for matching boundaries.
 844 BLANKLINEEND = re.compile(r"\n\r?\n$")
 845 BLANKLINESTART = re.compile(r"^\r?\n\r?\n")
846
847 def diff_cleanupEfficiency(self, diffs):
848 """Reduce the number of edits by eliminating operationally trivial
849 equalities.
850
851 Args:
852 diffs: Array of diff tuples.
853 """
854 changes = False
855 equalities = [] # Stack of indices where equalities are found.
856 lastequality = None # Always equal to diffs[equalities[-1]][1]
857 pointer = 0 # Index of current position.
858 pre_ins = False # Is there an insertion operation before the last equality.
859 pre_del = False # Is there a deletion operation before the last equality.
860 post_ins = False # Is there an insertion operation after the last equality.
861 post_del = False # Is there a deletion operation after the last equality.
862 while pointer < len(diffs):
863 if diffs[pointer][0] == self.DIFF_EQUAL: # Equality found.
864 if (len(diffs[pointer][1]) < self.Diff_EditCost and
865 (post_ins or post_del)):
866 # Candidate found.
867 equalities.append(pointer)
868 pre_ins = post_ins
869 pre_del = post_del
870 lastequality = diffs[pointer][1]
871 else:
872 # Not a candidate, and can never become one.
873 equalities = []
874 lastequality = None
875
876 post_ins = post_del = False
877 else: # An insertion or deletion.
878 if diffs[pointer][0] == self.DIFF_DELETE:
879 post_del = True
880 else:
881 post_ins = True
882
883 # Five types to be split:
884 # <ins>A</ins><del>B</del>XY<ins>C</ins><del>D</del>
885 # <ins>A</ins>X<ins>C</ins><del>D</del>
886 # <ins>A</ins><del>B</del>X<ins>C</ins>
 887 # <ins>A</ins>X<ins>C</ins><del>D</del>
888 # <ins>A</ins><del>B</del>X<del>C</del>
889
890 if lastequality and ((pre_ins and pre_del and post_ins and post_del) or
891 ((len(lastequality) < self.Diff_EditCost / 2) and
892 (pre_ins + pre_del + post_ins + post_del) == 3)):
893 # Duplicate record.
894 diffs.insert(equalities[-1], (self.DIFF_DELETE, lastequality))
895 # Change second copy to insert.
896 diffs[equalities[-1] + 1] = (self.DIFF_INSERT,
897 diffs[equalities[-1] + 1][1])
898 equalities.pop() # Throw away the equality we just deleted.
899 lastequality = None
900 if pre_ins and pre_del:
901 # No changes made which could affect previous entry, keep going.
902 post_ins = post_del = True
903 equalities = []
904 else:
905 if len(equalities):
906 equalities.pop() # Throw away the previous equality.
907 if len(equalities):
908 pointer = equalities[-1]
909 else:
910 pointer = -1
911 post_ins = post_del = False
912 changes = True
913 pointer += 1
914
915 if changes:
916 self.diff_cleanupMerge(diffs)
917
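  # Illustrative sketch for diff_cleanupEfficiency above, assuming `dmp`
  # with the default Diff_EditCost of 4: a short equality surrounded by
  # edits on both sides costs more to keep than to fold away,
  #   diffs = [(dmp.DIFF_DELETE, "ab"), (dmp.DIFF_INSERT, "12"),
  #            (dmp.DIFF_EQUAL, "xyz"), (dmp.DIFF_DELETE, "cd"),
  #            (dmp.DIFF_INSERT, "34")]
  #   dmp.diff_cleanupEfficiency(diffs)
  #   # diffs -> [(dmp.DIFF_DELETE, "abxyzcd"), (dmp.DIFF_INSERT, "12xyz34")]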
918 def diff_cleanupMerge(self, diffs):
919 """Reorder and merge like edit sections. Merge equalities.
920 Any edit section can move as long as it doesn't cross an equality.
921
922 Args:
923 diffs: Array of diff tuples.
924 """
925 diffs.append((self.DIFF_EQUAL, '')) # Add a dummy entry at the end.
926 pointer = 0
927 count_delete = 0
928 count_insert = 0
929 text_delete = ''
930 text_insert = ''
931 while pointer < len(diffs):
932 if diffs[pointer][0] == self.DIFF_INSERT:
933 count_insert += 1
934 text_insert += diffs[pointer][1]
935 pointer += 1
936 elif diffs[pointer][0] == self.DIFF_DELETE:
937 count_delete += 1
938 text_delete += diffs[pointer][1]
939 pointer += 1
940 elif diffs[pointer][0] == self.DIFF_EQUAL:
941 # Upon reaching an equality, check for prior redundancies.
942 if count_delete + count_insert > 1:
943 if count_delete != 0 and count_insert != 0:
 944 # Factor out any common prefixes.
945 commonlength = self.diff_commonPrefix(text_insert, text_delete)
946 if commonlength != 0:
947 x = pointer - count_delete - count_insert - 1
948 if x >= 0 and diffs[x][0] == self.DIFF_EQUAL:
949 diffs[x] = (diffs[x][0], diffs[x][1] +
950 text_insert[:commonlength])
951 else:
952 diffs.insert(0, (self.DIFF_EQUAL, text_insert[:commonlength]))
953 pointer += 1
954 text_insert = text_insert[commonlength:]
955 text_delete = text_delete[commonlength:]
 956 # Factor out any common suffixes.
957 commonlength = self.diff_commonSuffix(text_insert, text_delete)
958 if commonlength != 0:
959 diffs[pointer] = (diffs[pointer][0], text_insert[-commonlength:] +
960 diffs[pointer][1])
961 text_insert = text_insert[:-commonlength]
962 text_delete = text_delete[:-commonlength]
963 # Delete the offending records and add the merged ones.
964 if count_delete == 0:
965 diffs[pointer - count_insert : pointer] = [
966 (self.DIFF_INSERT, text_insert)]
967 elif count_insert == 0:
968 diffs[pointer - count_delete : pointer] = [
969 (self.DIFF_DELETE, text_delete)]
970 else:
971 diffs[pointer - count_delete - count_insert : pointer] = [
972 (self.DIFF_DELETE, text_delete),
973 (self.DIFF_INSERT, text_insert)]
974 pointer = pointer - count_delete - count_insert + 1
975 if count_delete != 0:
976 pointer += 1
977 if count_insert != 0:
978 pointer += 1
979 elif pointer != 0 and diffs[pointer - 1][0] == self.DIFF_EQUAL:
980 # Merge this equality with the previous one.
981 diffs[pointer - 1] = (diffs[pointer - 1][0],
982 diffs[pointer - 1][1] + diffs[pointer][1])
983 del diffs[pointer]
984 else:
985 pointer += 1
986
987 count_insert = 0
988 count_delete = 0
989 text_delete = ''
990 text_insert = ''
991
992 if diffs[-1][1] == '':
993 diffs.pop() # Remove the dummy entry at the end.
994
995 # Second pass: look for single edits surrounded on both sides by equalities
996 # which can be shifted sideways to eliminate an equality.
997 # e.g: A<ins>BA</ins>C -> <ins>AB</ins>AC
998 changes = False
999 pointer = 1
1000 # Intentionally ignore the first and last element (don't need checking).
1001 while pointer < len(diffs) - 1:
1002 if (diffs[pointer - 1][0] == self.DIFF_EQUAL and
1003 diffs[pointer + 1][0] == self.DIFF_EQUAL):
1004 # This is a single edit surrounded by equalities.
1005 if diffs[pointer][1].endswith(diffs[pointer - 1][1]):
1006 # Shift the edit over the previous equality.
1007 diffs[pointer] = (diffs[pointer][0],
1008 diffs[pointer - 1][1] +
1009 diffs[pointer][1][:-len(diffs[pointer - 1][1])])
1010 diffs[pointer + 1] = (diffs[pointer + 1][0],
1011 diffs[pointer - 1][1] + diffs[pointer + 1][1])
1012 del diffs[pointer - 1]
1013 changes = True
1014 elif diffs[pointer][1].startswith(diffs[pointer + 1][1]):
1015 # Shift the edit over the next equality.
1016 diffs[pointer - 1] = (diffs[pointer - 1][0],
1017 diffs[pointer - 1][1] + diffs[pointer + 1][1])
1018 diffs[pointer] = (diffs[pointer][0],
1019 diffs[pointer][1][len(diffs[pointer + 1][1]):] +
1020 diffs[pointer + 1][1])
1021 del diffs[pointer + 1]
1022 changes = True
1023 pointer += 1
1024
1025 # If shifts were made, the diff needs reordering and another shift sweep.
1026 if changes:
1027 self.diff_cleanupMerge(diffs)
1028
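  # Illustrative sketch for diff_cleanupMerge above, assuming `dmp`:
  # like-typed edits are coalesced and adjacent equalities merged,
  #   diffs = [(dmp.DIFF_DELETE, "a"), (dmp.DIFF_INSERT, "b"),
  #            (dmp.DIFF_DELETE, "c"), (dmp.DIFF_INSERT, "d"),
  #            (dmp.DIFF_EQUAL, "e"), (dmp.DIFF_EQUAL, "f")]
  #   dmp.diff_cleanupMerge(diffs)
  #   # diffs -> [(dmp.DIFF_DELETE, "ac"), (dmp.DIFF_INSERT, "bd"),
  #   #           (dmp.DIFF_EQUAL, "ef")]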
1029 def diff_xIndex(self, diffs, loc):
1030 """loc is a location in text1, compute and return the equivalent location
1031 in text2. e.g. "The cat" vs "The big cat", 1->1, 5->8
1032
1033 Args:
1034 diffs: Array of diff tuples.
1035 loc: Location within text1.
1036
1037 Returns:
1038 Location within text2.
1039 """
1040 chars1 = 0
1041 chars2 = 0
1042 last_chars1 = 0
1043 last_chars2 = 0
1044 for x in xrange(len(diffs)):
1045 (op, text) = diffs[x]
1046 if op != self.DIFF_INSERT: # Equality or deletion.
1047 chars1 += len(text)
1048 if op != self.DIFF_DELETE: # Equality or insertion.
1049 chars2 += len(text)
1050 if chars1 > loc: # Overshot the location.
1051 break
1052 last_chars1 = chars1
1053 last_chars2 = chars2
1054
1055 if len(diffs) != x and diffs[x][0] == self.DIFF_DELETE:
1056 # The location was deleted.
1057 return last_chars2
1058 # Add the remaining character length.
1059 return last_chars2 + (loc - last_chars1)
1060
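  # Illustrative sketch for diff_xIndex above, assuming `dmp`: index 2 of
  # the source text lands after a one-char deletion and a four-char insertion,
  #   diffs = [(dmp.DIFF_DELETE, "a"), (dmp.DIFF_INSERT, "1234"),
  #            (dmp.DIFF_EQUAL, "xyz")]
  #   dmp.diff_xIndex(diffs, 2)  # -> 5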
1061 def diff_prettyHtml(self, diffs):
1062 """Convert a diff array into a pretty HTML report.
1063
1064 Args:
1065 diffs: Array of diff tuples.
1066
1067 Returns:
1068 HTML representation.
1069 """
1070 html = []
1071 for (op, data) in diffs:
1072 text = (data.replace("&", "&amp;").replace("<", "&lt;")
1073 .replace(">", "&gt;").replace("\n", "&para;<br>"))
1074 if op == self.DIFF_INSERT:
1075 html.append("<ins style=\"background:#e6ffe6;\">%s</ins>" % text)
1076 elif op == self.DIFF_DELETE:
1077 html.append("<del style=\"background:#ffe6e6;\">%s</del>" % text)
1078 elif op == self.DIFF_EQUAL:
1079 html.append("<span>%s</span>" % text)
1080 return "".join(html)
1081
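  # Illustrative sketch for diff_prettyHtml above, assuming `dmp`:
  #   dmp.diff_prettyHtml([(dmp.DIFF_INSERT, "c&d")])
  #   # -> '<ins style="background:#e6ffe6;">c&amp;d</ins>'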
1082 def diff_text1(self, diffs):
1083 """Compute and return the source text (all equalities and deletions).
1084
1085 Args:
1086 diffs: Array of diff tuples.
1087
1088 Returns:
1089 Source text.
1090 """
1091 text = []
1092 for (op, data) in diffs:
1093 if op != self.DIFF_INSERT:
1094 text.append(data)
1095 return "".join(text)
1096
1097 def diff_text2(self, diffs):
1098 """Compute and return the destination text (all equalities and insertions).
1099
1100 Args:
1101 diffs: Array of diff tuples.
1102
1103 Returns:
1104 Destination text.
1105 """
1106 text = []
1107 for (op, data) in diffs:
1108 if op != self.DIFF_DELETE:
1109 text.append(data)
1110 return "".join(text)
1111
1112 def diff_levenshtein(self, diffs):
1113 """Compute the Levenshtein distance; the number of inserted, deleted or
1114 substituted characters.
1115
1116 Args:
1117 diffs: Array of diff tuples.
1118
1119 Returns:
1120 Number of changes.
1121 """
1122 levenshtein = 0
1123 insertions = 0
1124 deletions = 0
1125 for (op, data) in diffs:
1126 if op == self.DIFF_INSERT:
1127 insertions += len(data)
1128 elif op == self.DIFF_DELETE:
1129 deletions += len(data)
1130 elif op == self.DIFF_EQUAL:
1131 # A deletion and an insertion is one substitution.
1132 levenshtein += max(insertions, deletions)
1133 insertions = 0
1134 deletions = 0
1135 levenshtein += max(insertions, deletions)
1136 return levenshtein
1137
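  # Illustrative sketch for diff_levenshtein above, assuming `dmp`: a
  # 3-char delete overlapping a 4-char insert counts as max(3, 4) changes,
  #   dmp.diff_levenshtein([(dmp.DIFF_DELETE, "abc"),
  #                         (dmp.DIFF_INSERT, "1234"),
  #                         (dmp.DIFF_EQUAL, "xyz")])  # -> 4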
1138 def diff_toDelta(self, diffs):
1139 """Crush the diff into an encoded string which describes the operations
1140 required to transform text1 into text2.
1141 E.g. =3\t-2\t+ing -> Keep 3 chars, delete 2 chars, insert 'ing'.
1142 Operations are tab-separated. Inserted text is escaped using %xx notation.
1143
1144 Args:
1145 diffs: Array of diff tuples.
1146
1147 Returns:
1148 Delta text.
1149 """
1150 text = []
1151 for (op, data) in diffs:
1152 if op == self.DIFF_INSERT:
1153 # High ascii will raise UnicodeDecodeError. Use Unicode instead.
1154 data = data.encode("utf-8")
1155 text.append("+" + urllib.quote(data, "!~*'();/?:@&=+$,# "))
1156 elif op == self.DIFF_DELETE:
1157 text.append("-%d" % len(data))
1158 elif op == self.DIFF_EQUAL:
1159 text.append("=%d" % len(data))
1160 return "\t".join(text)
1161
1162 def diff_fromDelta(self, text1, delta):
1163 """Given the original text1, and an encoded string which describes the
1164 operations required to transform text1 into text2, compute the full diff.
1165
1166 Args:
1167 text1: Source string for the diff.
1168 delta: Delta text.
1169
1170 Returns:
1171 Array of diff tuples.
1172
1173 Raises:
1174 ValueError: If invalid input.
1175 """
1176 if type(delta) == unicode:
1177 # Deltas should be composed of a subset of ascii chars, Unicode not
1178 # required. If this encode raises UnicodeEncodeError, delta is invalid.
1179 delta = delta.encode("ascii")
1180 diffs = []
1181 pointer = 0 # Cursor in text1
1182 tokens = delta.split("\t")
1183 for token in tokens:
1184 if token == "":
1185 # Blank tokens are ok (from a trailing \t).
1186 continue
1187 # Each token begins with a one character parameter which specifies the
1188 # operation of this token (delete, insert, equality).
1189 param = token[1:]
1190 if token[0] == "+":
1191 param = urllib.unquote(param).decode("utf-8")
1192 diffs.append((self.DIFF_INSERT, param))
1193 elif token[0] == "-" or token[0] == "=":
1194 try:
1195 n = int(param)
1196 except ValueError:
1197 raise ValueError("Invalid number in diff_fromDelta: " + param)
1198 if n < 0:
1199 raise ValueError("Negative number in diff_fromDelta: " + param)
1200 text = text1[pointer : pointer + n]
1201 pointer += n
1202 if token[0] == "=":
1203 diffs.append((self.DIFF_EQUAL, text))
1204 else:
1205 diffs.append((self.DIFF_DELETE, text))
1206 else:
1207 # Anything else is an error.
1208 raise ValueError("Invalid diff operation in diff_fromDelta: " +
1209 token[0])
1210 if pointer != len(text1):
1211 raise ValueError(
1212 "Delta length (%d) does not equal source text length (%d)." %
1213 (pointer, len(text1)))
1214 return diffs
1215
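  # Illustrative round-trip sketch for diff_toDelta/diff_fromDelta above,
  # assuming `dmp`:
  #   diffs = [(dmp.DIFF_EQUAL, "jump"), (dmp.DIFF_DELETE, "s"),
  #            (dmp.DIFF_INSERT, "ed")]
  #   delta = dmp.diff_toDelta(diffs)            # -> "=4\t-1\t+ed"
  #   dmp.diff_fromDelta("jumps", delta) == diffs
  # Only insertions carry text; deletions and equalities are just lengths.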
1216 # MATCH FUNCTIONS
1217
1218 def match_main(self, text, pattern, loc):
1219 """Locate the best instance of 'pattern' in 'text' near 'loc'.
1220
1221 Args:
1222 text: The text to search.
1223 pattern: The pattern to search for.
1224 loc: The location to search around.
1225
1226 Returns:
1227 Best match index or -1.
1228 """
1229 # Check for null inputs.
1230 if text is None or pattern is None:
1231 raise ValueError("Null inputs. (match_main)")
1232
1233 loc = max(0, min(loc, len(text)))
1234 if text == pattern:
1235 # Shortcut (potentially not guaranteed by the algorithm)
1236 return 0
1237 elif not text:
1238 # Nothing to match.
1239 return -1
1240 elif text[loc:loc + len(pattern)] == pattern:
1241 # Perfect match at the perfect spot! (Includes case of null pattern)
1242 return loc
1243 else:
1244 # Do a fuzzy compare.
1245 match = self.match_bitap(text, pattern, loc)
1246 return match
1247
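  # Illustrative sketch for match_main above, assuming `dmp` with the
  # default Match_Threshold:
  #   dmp.match_main("abcdef", "de", 3)    # -> 3 (exact match at loc)
  #   dmp.match_main("abcdef", "defy", 4)  # -> 3 (fuzzy, one error)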
1248 def match_bitap(self, text, pattern, loc):
1249 """Locate the best instance of 'pattern' in 'text' near 'loc' using the
1250 Bitap algorithm.
1251
1252 Args:
1253 text: The text to search.
1254 pattern: The pattern to search for.
1255 loc: The location to search around.
1256
1257 Returns:
1258 Best match index or -1.
1259 """
1260 # Python doesn't have a maxint limit, so ignore this check.
1261 #if self.Match_MaxBits != 0 and len(pattern) > self.Match_MaxBits:
1262 # raise ValueError("Pattern too long for this application.")
1263
1264 # Initialise the alphabet.
1265 s = self.match_alphabet(pattern)
1266
1267 def match_bitapScore(e, x):
1268 """Compute and return the score for a match with e errors and x location.
1269 Accesses loc and pattern via the enclosing scope (closure).
1270
1271 Args:
1272 e: Number of errors in match.
1273 x: Location of match.
1274
1275 Returns:
1276 Overall score for match (0.0 = good, 1.0 = bad).
1277 """
1278 accuracy = float(e) / len(pattern)
1279 proximity = abs(loc - x)
1280 if not self.Match_Distance:
1281 # Dodge divide by zero error.
1282 return proximity and 1.0 or accuracy
1283 return accuracy + (proximity / float(self.Match_Distance))
1284
1285 # Highest score beyond which we give up.
1286 score_threshold = self.Match_Threshold
1287 # Is there a nearby exact match? (speedup)
1288 best_loc = text.find(pattern, loc)
1289 if best_loc != -1:
1290 score_threshold = min(match_bitapScore(0, best_loc), score_threshold)
1291 # What about in the other direction? (speedup)
1292 best_loc = text.rfind(pattern, loc + len(pattern))
1293 if best_loc != -1:
1294 score_threshold = min(match_bitapScore(0, best_loc), score_threshold)
1295
1296 # Initialise the bit arrays.
1297 matchmask = 1 << (len(pattern) - 1)
1298 best_loc = -1
1299
1300 bin_max = len(pattern) + len(text)
1301 # Empty initialization added to appease pychecker.
1302 last_rd = None
1303 for d in xrange(len(pattern)):
1304 # Scan for the best match each iteration allows for one more error.
1305 # Run a binary search to determine how far from 'loc' we can stray at
1306 # this error level.
1307 bin_min = 0
1308 bin_mid = bin_max
1309 while bin_min < bin_mid:
1310 if match_bitapScore(d, loc + bin_mid) <= score_threshold:
1311 bin_min = bin_mid
1312 else:
1313 bin_max = bin_mid
1314 bin_mid = (bin_max - bin_min) // 2 + bin_min
1315
1316 # Use the result from this iteration as the maximum for the next.
1317 bin_max = bin_mid
1318 start = max(1, loc - bin_mid + 1)
1319 finish = min(loc + bin_mid, len(text)) + len(pattern)
1320
1321 rd = [0] * (finish + 2)
1322 rd[finish + 1] = (1 << d) - 1
1323 for j in xrange(finish, start - 1, -1):
1324 if len(text) <= j - 1:
1325 # Out of range.
1326 charMatch = 0
1327 else:
1328 charMatch = s.get(text[j - 1], 0)
1329 if d == 0: # First pass: exact match.
1330 rd[j] = ((rd[j + 1] << 1) | 1) & charMatch
1331 else: # Subsequent passes: fuzzy match.
1332 rd[j] = (((rd[j + 1] << 1) | 1) & charMatch) | (
1333 ((last_rd[j + 1] | last_rd[j]) << 1) | 1) | last_rd[j + 1]
1334 if rd[j] & matchmask:
1335 score = match_bitapScore(d, j - 1)
1336 # This match will almost certainly be better than any existing match.
1337 # But check anyway.
1338 if score <= score_threshold:
1339 # Told you so.
1340 score_threshold = score
1341 best_loc = j - 1
1342 if best_loc > loc:
1343 # When passing loc, don't exceed our current distance from loc.
1344 start = max(1, 2 * loc - best_loc)
1345 else:
1346 # Already passed loc, downhill from here on in.
1347 break
1348 # No hope for a (better) match at greater error levels.
1349 if match_bitapScore(d + 1, loc) > score_threshold:
1350 break
1351 last_rd = rd
1352 return best_loc
1353
1354 def match_alphabet(self, pattern):
1355 """Initialise the alphabet for the Bitap algorithm.
1356
1357 Args:
1358 pattern: The text to encode.
1359
1360 Returns:
1361 Hash of character locations.
1362 """
1363 s = {}
1364 for char in pattern:
1365 s[char] = 0
1366 for i in xrange(len(pattern)):
1367 s[pattern[i]] |= 1 << (len(pattern) - i - 1)
1368 return s
1369
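  # Illustrative sketch for match_alphabet above, assuming `dmp`: each
  # character maps to a bitmask of its positions, most significant first,
  #   dmp.match_alphabet("abc")  # -> {'a': 4, 'b': 2, 'c': 1}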
1370 # PATCH FUNCTIONS
1371
1372 def patch_addContext(self, patch, text):
1373 """Increase the context until it is unique,
1374 but don't let the pattern expand beyond Match_MaxBits.
1375
1376 Args:
1377 patch: The patch to grow.
1378 text: Source text.
1379 """
1380 if len(text) == 0:
1381 return
1382 pattern = text[patch.start2 : patch.start2 + patch.length1]
1383 padding = 0
1384
1385 # Look for the first and last matches of pattern in text. If two different
1386 # matches are found, increase the pattern length.
1387 while (text.find(pattern) != text.rfind(pattern) and (self.Match_MaxBits ==
1388 0 or len(pattern) < self.Match_MaxBits - self.Patch_Margin -
1389 self.Patch_Margin)):
1390 padding += self.Patch_Margin
1391 pattern = text[max(0, patch.start2 - padding) :
1392 patch.start2 + patch.length1 + padding]
1393 # Add one chunk for good luck.
1394 padding += self.Patch_Margin
1395
1396 # Add the prefix.
1397 prefix = text[max(0, patch.start2 - padding) : patch.start2]
1398 if prefix:
1399 patch.diffs[:0] = [(self.DIFF_EQUAL, prefix)]
1400 # Add the suffix.
1401 suffix = text[patch.start2 + patch.length1 :
1402 patch.start2 + patch.length1 + padding]
1403 if suffix:
1404 patch.diffs.append((self.DIFF_EQUAL, suffix))
1405
1406 # Roll back the start points.
1407 patch.start1 -= len(prefix)
1408 patch.start2 -= len(prefix)
1409 # Extend lengths.
1410 patch.length1 += len(prefix) + len(suffix)
1411 patch.length2 += len(prefix) + len(suffix)
1412
1413 def patch_make(self, a, b=None, c=None):
1414 """Compute a list of patches to turn text1 into text2.
1415 Use diffs if provided, otherwise compute it ourselves.
1416 There are four ways to call this function, depending on what data is
1417 available to the caller:
1418 Method 1:
1419 a = text1, b = text2
1420 Method 2:
1421 a = diffs
1422 Method 3 (optimal):
1423 a = text1, b = diffs
1424 Method 4 (deprecated, use method 3):
1425 a = text1, b = text2, c = diffs
1426
1427 Args:
1428 a: text1 (methods 1,3,4) or Array of diff tuples for text1 to
1429 text2 (method 2).
1430 b: text2 (methods 1,4) or Array of diff tuples for text1 to
1431 text2 (method 3) or undefined (method 2).
1432 c: Array of diff tuples for text1 to text2 (method 4) or
1433 undefined (methods 1,2,3).
1434
1435 Returns:
1436 Array of Patch objects.
1437 """
1438 text1 = None
1439 diffs = None
1440 # Note that texts may arrive as 'str' or 'unicode'.
1441 if isinstance(a, basestring) and isinstance(b, basestring) and c is None:
1442 # Method 1: text1, text2
1443 # Compute diffs from text1 and text2.
1444 text1 = a
1445 diffs = self.diff_main(text1, b, True)
1446 if len(diffs) > 2:
1447 self.diff_cleanupSemantic(diffs)
1448 self.diff_cleanupEfficiency(diffs)
1449 elif isinstance(a, list) and b is None and c is None:
1450 # Method 2: diffs
1451 # Compute text1 from diffs.
1452 diffs = a
1453 text1 = self.diff_text1(diffs)
1454 elif isinstance(a, basestring) and isinstance(b, list) and c is None:
1455 # Method 3: text1, diffs
1456 text1 = a
1457 diffs = b
1458 elif (isinstance(a, basestring) and isinstance(b, basestring) and
1459 isinstance(c, list)):
1460 # Method 4: text1, text2, diffs
1461 # text2 is not used.
1462 text1 = a
1463 diffs = c
1464 else:
1465 raise ValueError("Unknown call format to patch_make.")
1466
1467 if not diffs:
1468 return [] # Get rid of the None case.
1469 patches = []
1470 patch = patch_obj()
1471 char_count1 = 0 # Number of characters into the text1 string.
1472 char_count2 = 0 # Number of characters into the text2 string.
1473 prepatch_text = text1 # Recreate the patches to determine context info.
1474 postpatch_text = text1
1475 for x in xrange(len(diffs)):
1476 (diff_type, diff_text) = diffs[x]
1477 if len(patch.diffs) == 0 and diff_type != self.DIFF_EQUAL:
1478 # A new patch starts here.
1479 patch.start1 = char_count1
1480 patch.start2 = char_count2
1481 if diff_type == self.DIFF_INSERT:
1482 # Insertion
1483 patch.diffs.append(diffs[x])
1484 patch.length2 += len(diff_text)
1485 postpatch_text = (postpatch_text[:char_count2] + diff_text +
1486 postpatch_text[char_count2:])
1487 elif diff_type == self.DIFF_DELETE:
1488 # Deletion.
1489 patch.length1 += len(diff_text)
1490 patch.diffs.append(diffs[x])
1491 postpatch_text = (postpatch_text[:char_count2] +
1492 postpatch_text[char_count2 + len(diff_text):])
1493 elif (diff_type == self.DIFF_EQUAL and
1494 len(diff_text) <= 2 * self.Patch_Margin and
1495 len(patch.diffs) != 0 and len(diffs) != x + 1):
1496 # Small equality inside a patch.
1497 patch.diffs.append(diffs[x])
1498 patch.length1 += len(diff_text)
1499 patch.length2 += len(diff_text)
1500
1501 if (diff_type == self.DIFF_EQUAL and
1502 len(diff_text) >= 2 * self.Patch_Margin):
1503 # Time for a new patch.
1504 if len(patch.diffs) != 0:
1505 self.patch_addContext(patch, prepatch_text)
1506 patches.append(patch)
1507 patch = patch_obj()
1508 # Unlike Unidiff, our patch lists have a rolling context.
1509 # http://code.google.com/p/google-diff-match-patch/wiki/Unidiff
1510 # Update prepatch text & pos to reflect the application of the
1511 # just completed patch.
1512 prepatch_text = postpatch_text
1513 char_count1 = char_count2
1514
1515 # Update the current character count.
1516 if diff_type != self.DIFF_INSERT:
1517 char_count1 += len(diff_text)
1518 if diff_type != self.DIFF_DELETE:
1519 char_count2 += len(diff_text)
1520
1521 # Pick up the leftover patch if not empty.
1522 if len(patch.diffs) != 0:
1523 self.patch_addContext(patch, prepatch_text)
1524 patches.append(patch)
1525 return patches
1526
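  # Illustrative sketch for patch_make above (method 1: two texts),
  # assuming `dmp`:
  #   patches = dmp.patch_make("The quick brown fox",
  #                            "The slow brown fox")
  #   print dmp.patch_toText(patches)  # GNU-style "@@ -n,m +n,m @@" hunks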
1527 def patch_deepCopy(self, patches):
1528 """Given an array of patches, return another array that is identical.
1529
1530 Args:
1531 patches: Array of Patch objects.
1532
1533 Returns:
1534 Array of Patch objects.
1535 """
1536 patchesCopy = []
1537 for patch in patches:
1538 patchCopy = patch_obj()
1539 # No need to deep copy the tuples since they are immutable.
1540 patchCopy.diffs = patch.diffs[:]
1541 patchCopy.start1 = patch.start1
1542 patchCopy.start2 = patch.start2
1543 patchCopy.length1 = patch.length1
1544 patchCopy.length2 = patch.length2
1545 patchesCopy.append(patchCopy)
1546 return patchesCopy
1547
1548 def patch_apply(self, patches, text):
1549 """Merge a set of patches onto the text. Return a patched text, as well
1550 as a list of true/false values indicating which patches were applied.
1551
1552 Args:
1553 patches: Array of Patch objects.
1554 text: Old text.
1555
1556 Returns:
1557 Two element Array, containing the new text and an array of boolean values.
1558 """
1559 if not patches:
1560 return (text, [])
1561
1562 # Deep copy the patches so that no changes are made to originals.
1563 patches = self.patch_deepCopy(patches)
1564
1565 nullPadding = self.patch_addPadding(patches)
1566 text = nullPadding + text + nullPadding
1567 self.patch_splitMax(patches)
1568
1569 # delta keeps track of the offset between the expected and actual location
1570 # of the previous patch. If there are patches expected at positions 10 and
1571 # 20, but the first patch was found at 12, delta is 2 and the second patch
1572 # has an effective expected position of 22.
1573 delta = 0
1574 results = []
1575 for patch in patches:
1576 expected_loc = patch.start2 + delta
1577 text1 = self.diff_text1(patch.diffs)
1578 end_loc = -1
1579 if len(text1) > self.Match_MaxBits:
1580 # patch_splitMax will only provide an oversized pattern in the case of
1581 # a monster delete.
1582 start_loc = self.match_main(text, text1[:self.Match_MaxBits],
1583 expected_loc)
1584 if start_loc != -1:
1585 end_loc = self.match_main(text, text1[-self.Match_MaxBits:],
1586 expected_loc + len(text1) - self.Match_MaxBits)
1587 if end_loc == -1 or start_loc >= end_loc:
1588 # Can't find valid trailing context. Drop this patch.
1589 start_loc = -1
1590 else:
1591 start_loc = self.match_main(text, text1, expected_loc)
1592 if start_loc == -1:
1593 # No match found. :(
1594 results.append(False)
1595 # Subtract the delta for this failed patch from subsequent patches.
1596 delta -= patch.length2 - patch.length1
1597 else:
1598 # Found a match. :)
1599 results.append(True)
1600 delta = start_loc - expected_loc
1601 if end_loc == -1:
1602 text2 = text[start_loc : start_loc + len(text1)]
1603 else:
1604 text2 = text[start_loc : end_loc + self.Match_MaxBits]
1605 if text1 == text2:
1606 # Perfect match, just shove the replacement text in.
1607 text = (text[:start_loc] + self.diff_text2(patch.diffs) +
1608 text[start_loc + len(text1):])
1609 else:
1610 # Imperfect match.
1611 # Run a diff to get a framework of equivalent indices.
1612 diffs = self.diff_main(text1, text2, False)
1613 if (len(text1) > self.Match_MaxBits and
1614 self.diff_levenshtein(diffs) / float(len(text1)) >
1615 self.Patch_DeleteThreshold):
1616 # The end points match, but the content is unacceptably bad.
1617 results[-1] = False
1618 else:
1619 self.diff_cleanupSemanticLossless(diffs)
1620 index1 = 0
1621 for (op, data) in patch.diffs:
1622 if op != self.DIFF_EQUAL:
1623 index2 = self.diff_xIndex(diffs, index1)
1624 if op == self.DIFF_INSERT: # Insertion
1625 text = text[:start_loc + index2] + data + text[start_loc +
1626 index2:]
1627 elif op == self.DIFF_DELETE: # Deletion
1628 text = text[:start_loc + index2] + text[start_loc +
1629 self.diff_xIndex(diffs, index1 + len(data)):]
1630 if op != self.DIFF_DELETE:
1631 index1 += len(data)
1632 # Strip the padding off.
1633 text = text[len(nullPadding):-len(nullPadding)]
1634 return (text, results)
1635
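  # Illustrative sketch for patch_apply above (mirrors the upstream test
  # suite), assuming `dmp`: patches made from one pair of texts apply
  # fuzzily to a third, related text,
  #   patches = dmp.patch_make(
  #       "The quick brown fox jumps over the lazy dog.",
  #       "That quick brown fox jumped over a lazy dog.")
  #   dmp.patch_apply(patches,
  #                   "The quick red rabbit jumps over the tired tiger.")
  #   # -> ("That quick red rabbit jumped over a tired tiger.",
  #   #     [True, True])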
1636 def patch_addPadding(self, patches):
1637 """Add some padding on text start and end so that edges can match
1638 something. Intended to be called only from within patch_apply.
1639
1640 Args:
1641 patches: Array of Patch objects.
1642
1643 Returns:
1644 The padding string added to each side.
1645 """
1646 paddingLength = self.Patch_Margin
1647 nullPadding = ""
1648 for x in xrange(1, paddingLength + 1):
1649 nullPadding += chr(x)
1650
1651 # Bump all the patches forward.
1652 for patch in patches:
1653 patch.start1 += paddingLength
1654 patch.start2 += paddingLength
1655
1656 # Add some padding on start of first diff.
1657 patch = patches[0]
1658 diffs = patch.diffs
1659 if not diffs or diffs[0][0] != self.DIFF_EQUAL:
1660 # Add nullPadding equality.
1661 diffs.insert(0, (self.DIFF_EQUAL, nullPadding))
1662 patch.start1 -= paddingLength # Should be 0.
1663 patch.start2 -= paddingLength # Should be 0.
1664 patch.length1 += paddingLength
1665 patch.length2 += paddingLength
1666 elif paddingLength > len(diffs[0][1]):
1667 # Grow first equality.
1668 extraLength = paddingLength - len(diffs[0][1])
1669 newText = nullPadding[len(diffs[0][1]):] + diffs[0][1]
1670 diffs[0] = (diffs[0][0], newText)
1671 patch.start1 -= extraLength
1672 patch.start2 -= extraLength
1673 patch.length1 += extraLength
1674 patch.length2 += extraLength
1675
1676 # Add some padding on end of last diff.
1677 patch = patches[-1]
1678 diffs = patch.diffs
1679 if not diffs or diffs[-1][0] != self.DIFF_EQUAL:
1680 # Add nullPadding equality.
1681 diffs.append((self.DIFF_EQUAL, nullPadding))
1682 patch.length1 += paddingLength
1683 patch.length2 += paddingLength
1684 elif paddingLength > len(diffs[-1][1]):
1685 # Grow last equality.
1686 extraLength = paddingLength - len(diffs[-1][1])
1687 newText = diffs[-1][1] + nullPadding[:extraLength]
1688 diffs[-1] = (diffs[-1][0], newText)
1689 patch.length1 += extraLength
1690 patch.length2 += extraLength
1691
1692 return nullPadding
1693
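  # Note on patch_addPadding above: with the default Patch_Margin of 4, the
  # padding is the non-printing string "\x01\x02\x03\x04", chosen because
  # it is vanishingly unlikely to occur in real text.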
1694 def patch_splitMax(self, patches):
1695 """Look through the patches and break up any which are longer than the
1696 maximum limit of the match algorithm.
1697 Intended to be called only from within patch_apply.
1698
1699 Args:
1700 patches: Array of Patch objects.
1701 """
1702 patch_size = self.Match_MaxBits
1703 if patch_size == 0:
1704 # Python has the option of not splitting strings due to its ability
1705 # to handle integers of arbitrary precision.
1706 return
1707 for x in xrange(len(patches)):
1708 if patches[x].length1 <= patch_size:
1709 continue
1710 bigpatch = patches[x]
1711 # Remove the big old patch.
1712 del patches[x]
1713 x -= 1
1714 start1 = bigpatch.start1
1715 start2 = bigpatch.start2
1716 precontext = ''
1717 while len(bigpatch.diffs) != 0:
1718 # Create one of several smaller patches.
1719 patch = patch_obj()
1720 empty = True
1721 patch.start1 = start1 - len(precontext)
1722 patch.start2 = start2 - len(precontext)
1723 if precontext:
1724 patch.length1 = patch.length2 = len(precontext)
1725 patch.diffs.append((self.DIFF_EQUAL, precontext))
1726
1727 while (len(bigpatch.diffs) != 0 and
1728 patch.length1 < patch_size - self.Patch_Margin):
1729 (diff_type, diff_text) = bigpatch.diffs[0]
1730 if diff_type == self.DIFF_INSERT:
1731 # Insertions are harmless.
1732 patch.length2 += len(diff_text)
1733 start2 += len(diff_text)
1734 patch.diffs.append(bigpatch.diffs.pop(0))
1735 empty = False
1736 elif (diff_type == self.DIFF_DELETE and len(patch.diffs) == 1 and
1737 patch.diffs[0][0] == self.DIFF_EQUAL and
1738 len(diff_text) > 2 * patch_size):
1739 # This is a large deletion. Let it pass in one chunk.
1740 patch.length1 += len(diff_text)
1741 start1 += len(diff_text)
1742 empty = False
1743 patch.diffs.append((diff_type, diff_text))
1744 del bigpatch.diffs[0]
1745 else:
1746 # Deletion or equality. Only take as much as we can stomach.
1747 diff_text = diff_text[:patch_size - patch.length1 -
1748 self.Patch_Margin]
1749 patch.length1 += len(diff_text)
1750 start1 += len(diff_text)
1751 if diff_type == self.DIFF_EQUAL:
1752 patch.length2 += len(diff_text)
1753 start2 += len(diff_text)
1754 else:
1755 empty = False
1756
1757 patch.diffs.append((diff_type, diff_text))
1758 if diff_text == bigpatch.diffs[0][1]:
1759 del bigpatch.diffs[0]
1760 else:
1761 bigpatch.diffs[0] = (bigpatch.diffs[0][0],
1762 bigpatch.diffs[0][1][len(diff_text):])
1763
1764 # Compute the head context for the next patch.
1765 precontext = self.diff_text2(patch.diffs)
1766 precontext = precontext[-self.Patch_Margin:]
1767 # Append the end context for this patch.
1768 postcontext = self.diff_text1(bigpatch.diffs)[:self.Patch_Margin]
1769 if postcontext:
1770 patch.length1 += len(postcontext)
1771 patch.length2 += len(postcontext)
1772 if len(patch.diffs) != 0 and patch.diffs[-1][0] == self.DIFF_EQUAL:
1773 patch.diffs[-1] = (self.DIFF_EQUAL, patch.diffs[-1][1] +
1774 postcontext)
1775 else:
1776 patch.diffs.append((self.DIFF_EQUAL, postcontext))
1777
1778 if not empty:
1779 x += 1
1780 patches.insert(x, patch)
1781
1782 def patch_toText(self, patches):
1783 """Take a list of patches and return a textual representation.
1784
1785 Args:
1786 patches: Array of Patch objects.
1787
1788 Returns:
1789 Text representation of patches.
1790 """
1791 text = []
1792 for patch in patches:
1793 text.append(str(patch))
1794 return "".join(text)
1795
1796 def patch_fromText(self, textline):
1797 """Parse a textual representation of patches and return a list of patch
1798 objects.
1799
1800 Args:
1801 textline: Text representation of patches.
1802
1803 Returns:
1804 Array of Patch objects.
1805
1806 Raises:
1807 ValueError: If invalid input.
1808 """
1809 if type(textline) == unicode:
1810 # Patches should be composed of a subset of ascii chars, Unicode not
1811 # required. If this encode raises UnicodeEncodeError, patch is invalid.
1812 textline = textline.encode("ascii")
1813 patches = []
1814 if not textline:
1815 return patches
1816 text = textline.split('\n')
1817 while len(text) != 0:
1818 m = re.match(r"^@@ -(\d+),?(\d*) \+(\d+),?(\d*) @@$", text[0])
1819 if not m:
1820 raise ValueError("Invalid patch string: " + text[0])
1821 patch = patch_obj()
1822 patches.append(patch)
1823 patch.start1 = int(m.group(1))
1824 if m.group(2) == '':
1825 patch.start1 -= 1
1826 patch.length1 = 1
1827 elif m.group(2) == '0':
1828 patch.length1 = 0
1829 else:
1830 patch.start1 -= 1
1831 patch.length1 = int(m.group(2))
1832
1833 patch.start2 = int(m.group(3))
1834 if m.group(4) == '':
1835 patch.start2 -= 1
1836 patch.length2 = 1
1837 elif m.group(4) == '0':
1838 patch.length2 = 0
1839 else:
1840 patch.start2 -= 1
1841 patch.length2 = int(m.group(4))
1842
1843 del text[0]
1844
1845 while len(text) != 0:
1846 if text[0]:
1847 sign = text[0][0]
1848 else:
1849 sign = ''
1850 line = urllib.unquote(text[0][1:])
1851 line = line.decode("utf-8")
1852 if sign == '+':
1853 # Insertion.
1854 patch.diffs.append((self.DIFF_INSERT, line))
1855 elif sign == '-':
1856 # Deletion.
1857 patch.diffs.append((self.DIFF_DELETE, line))
1858 elif sign == ' ':
1859 # Minor equality.
1860 patch.diffs.append((self.DIFF_EQUAL, line))
1861 elif sign == '@':
1862 # Start of next patch.
1863 break
1864 elif sign == '':
1865 # Blank line; ignore.
1866 pass
1867 else:
1868 # Anything else is invalid.
1869 raise ValueError("Invalid patch mode: '%s'\n%s" % (sign, line))
1870 del text[0]
1871 return patches
1872
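  # Illustrative round-trip sketch for patch_fromText/patch_toText above
  # (mirrors the upstream test suite), assuming `dmp`:
  #   s = "@@ -21,18 +22,17 @@\n jump\n-s\n+ed\n over \n-the\n+a\n %0Alaz\n"
  #   dmp.patch_toText(dmp.patch_fromText(s)) == s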
1873
1874class patch_obj:
1875 """Class representing one patch operation.
1876 """
1877
1878 def __init__(self):
1879 """Initializes with an empty list of diffs.
1880 """
1881 self.diffs = []
1882 self.start1 = None
1883 self.start2 = None
1884 self.length1 = 0
1885 self.length2 = 0
1886
1887 def __str__(self):
1888 """Emmulate GNU diff's format.
1889 Header: @@ -382,8 +481,9 @@
1890 Indices are printed as 1-based, not 0-based.
1891
1892 Returns:
1893 The GNU diff string.
1894 """
1895 if self.length1 == 0:
1896 coords1 = str(self.start1) + ",0"
1897 elif self.length1 == 1:
1898 coords1 = str(self.start1 + 1)
1899 else:
1900 coords1 = str(self.start1 + 1) + "," + str(self.length1)
1901 if self.length2 == 0:
1902 coords2 = str(self.start2) + ",0"
1903 elif self.length2 == 1:
1904 coords2 = str(self.start2 + 1)
1905 else:
1906 coords2 = str(self.start2 + 1) + "," + str(self.length2)
1907 text = ["@@ -", coords1, " +", coords2, " @@\n"]
1908 # Escape the body of the patch with %xx notation.
1909 for (op, data) in self.diffs:
1910 if op == diff_match_patch.DIFF_INSERT:
1911 text.append("+")
1912 elif op == diff_match_patch.DIFF_DELETE:
1913 text.append("-")
1914 elif op == diff_match_patch.DIFF_EQUAL:
1915 text.append(" ")
1916 # High ascii will raise UnicodeDecodeError. Use Unicode instead.
1917 data = data.encode("utf-8")
1918 text.append(urllib.quote(data, "!~*'();/?:@&=+$,# ") + "\n")
1919 return "".join(text)
01920
=== added file 'charmtools/compose/fetchers.py'
--- charmtools/compose/fetchers.py 1970-01-01 00:00:00 +0000
+++ charmtools/compose/fetchers.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,117 @@
1import re
2import tempfile
3import os
4
5import requests
6from bundletester import fetchers
7from bundletester.fetchers import (git, # noqa
8 Fetcher,
9 get_fetcher,
10 FetchError)
11
12from path import path
13
14
15class RepoFetcher(fetchers.LocalFetcher):
16 @classmethod
17 def can_fetch(cls, url):
18 search_path = [os.getcwd(), os.environ.get("JUJU_REPOSITORY", ".")]
19 cp = os.environ.get("COMPOSER_PATH")
20 if cp:
21 search_path.extend(cp.split(":"))
22 for part in search_path:
23 p = (path(part) / url).normpath()
24 if p.exists():
25 return dict(path=p)
26 return {}
27
28fetchers.FETCHERS.insert(0, RepoFetcher)
29
30
31class InterfaceFetcher(fetchers.LocalFetcher):
32 # XXX: When hosted somewhere, fix this
33 INTERFACE_DOMAIN = "http://interfaces.juju.solutions"
34 NAMESPACE = "interface"
35 ENVIRON = "INTERFACE_PATH"
36 OPTIONAL_PREFIX = "juju-relation-"
37 ENDPOINT = "/api/v1/interface"
38
39 @classmethod
40 def can_fetch(cls, url):
41 # Search local path first, then
42 # the interface webservice
43 if url.startswith("{}:".format(cls.NAMESPACE)):
44 url = url[len(cls.NAMESPACE) + 1:]
45 search_path = [os.environ.get("JUJU_REPOSITORY", ".")]
46 cp = os.environ.get(cls.ENVIRON)
47 if cp:
48 search_path.extend(cp.split(os.pathsep))
49 for part in search_path:
50 p = (path(part) / url).normpath()
51 if p.exists():
52 return dict(path=p)
53
54 choices = [url]
55 if url.startswith(cls.OPTIONAL_PREFIX):
56 choices.append(url[len(cls.OPTIONAL_PREFIX):])
57 for choice in choices:
58 uri = "%s%s/%s/" % (
59 cls.INTERFACE_DOMAIN, cls.ENDPOINT, choice)
60 try:
61 result = requests.get(uri)
62 except Exception:
63 result = None
64 if result and result.ok:
65 result = result.json()
66 if "repo" in result:
67 return result
68 return {}
69
70 def fetch(self, dir_):
71 if hasattr(self, "path"):
72 return super(InterfaceFetcher, self).fetch(dir_)
73 elif hasattr(self, "repo"):
74 # use the github fetcher for now
75 u = self.url[10:]
76 f = get_fetcher(self.repo)
77 if hasattr(f, "repo"):
78 basename = path(f.repo).name.splitext()[0]
79 else:
80 basename = u
81 res = f.fetch(dir_)
82 target = dir_ / basename
83 if res != target:
84 target.rmtree_p()
85 path(res).rename(target)
86 return target
87
88
89fetchers.FETCHERS.insert(0, InterfaceFetcher)
90
91
92class LayerFetcher(InterfaceFetcher):
93 INTERFACE_DOMAIN = "http://interfaces.juju.solutions"
94 NAMESPACE = "layer"
95 ENVIRON = "COMPOSER_PATH"
96 OPTIONAL_PREFIX = "juju-layer-"
97 ENDPOINT = "/api/v1/layer"
98
99fetchers.FETCHERS.insert(0, LayerFetcher)
100
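# Illustrative sketch for the fetcher chain above ("basic" is a hypothetical
# layer name): an include such as "layer:basic" is resolved first against
# local paths ($JUJU_REPOSITORY, then COMPOSER_PATH entries) and only then
# against the interface webservice:
#   LayerFetcher.can_fetch("layer:basic")
#   # -> dict(path=...) for a local hit, the service's JSON (with "repo")
#   #    for a remote hit, or {} if nothing matched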
101
102class LaunchpadGitFetcher(Fetcher):
103 # XXX: this should be upstreamed
104 MATCH = re.compile(r"""
105 ^(git:|https)?://git.launchpad.net/
106 (?P<repo>[^@]*)(@(?P<revision>.*))?$
107 """, re.VERBOSE)
108
109 def fetch(self, dir_):
110 dir_ = tempfile.mkdtemp(dir=dir_)
111 url = 'https://git.launchpad.net/' + self.repo
112 git('clone {} {}'.format(url, dir_))
113 if self.revision:
114 git('checkout {}'.format(self.revision), cwd=dir_)
115 return dir_
116
117fetchers.FETCHERS.append(LaunchpadGitFetcher)
0118
=== added file 'charmtools/compose/inspector.py'
--- charmtools/compose/inspector.py 1970-01-01 00:00:00 +0000
+++ charmtools/compose/inspector.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,101 @@
1# coding=utf-8
2import json
3from ruamel import yaml
4from charmtools.compose import config
5from charmtools import utils
6
7theme = {
8 0: "normal",
9 1: "green",
10 2: "cyan",
11 3: "magenta",
12 4: "yellow",
13 5: "red",
14}
15
16
17def scan_for(col, cur, depth):
18 for e, (rel, d) in col[cur:]:
19 if d and d == depth:
20 return True
21 return False
22
23
24def get_prefix(walk, cur, depth, next_depth):
25 guide = []
26 for i in range(depth):
27 # scan forward in walk from i seeing if a subsequent
28 # entry happens at each depth
29 if scan_for(walk, cur, i):
30 guide.append(" │ ")
31 else:
32 guide.append(" ")
33 if depth == next_depth:
34 prefix = " ├─── "
35 else:
36 prefix = " └─── "
37 return "{}{}".format("".join(guide), prefix)
38
39
40def inspect(charm):
41 tw = utils.TermWriter()
42 manp = charm / ".composer.manifest"
43 comp = charm / "composer.yaml"
44 if not manp.exists() or not comp.exists():
45 return
46 manifest = json.loads(manp.text())
47 composer = yaml.load(comp.open())
48 a, c, d = utils.delta_signatures(manp)
49
50 # ordered list of layers used for legend
51 layers = list(manifest['layers'])
52
53 def get_depth(e):
54 rel = e.relpath(charm)
55 depth = len(rel.splitall()) - 2
56 return rel, depth
57
58 def get_suffix(rel):
59 suffix = ""
60 if rel in a:
61 suffix = "+"
62 elif rel in c:
63 suffix = "*"
64 return suffix
65
66 def get_color(rel):
67 # name of layer this belongs to
68 color = tw.term.normal
69 if rel in manifest['signatures']:
70 layer = manifest['signatures'][rel][0]
71 layer_key = layers.index(layer)
72 color = getattr(tw, theme.get(layer_key, "normal"))
73 else:
74 if entry.isdir():
75 color = tw.blue
76 return color
77
78 tw.write("Inspect %s\n" % composer["is"])
79 for layer in layers:
80 tw.write("# {color}{layer}{t.normal}\n",
81 color=getattr(tw, theme.get(
82 layers.index(layer), "normal")),
83 layer=layer)
84 tw.write("\n")
85 tw.write("{t.blue}{target}{t.normal}\n", target=charm)
86
87 ignorer = utils.ignore_matcher(config.DEFAULT_IGNORES)
88 walk = sorted(utils.walk(charm, get_depth),
89 key=lambda x: x[1][0])
90 for i in range(len(walk) - 1):
91 entry, (rel, depth) = walk[i]
92 nEnt, (nrel, ndepth) = walk[i + 1]
93 if not ignorer(rel):
94 continue
95
96 tw.write("{prefix}{layerColor}{entry} "
97 "{t.bold}{suffix}{t.normal}\n",
98 prefix=get_prefix(walk, i, depth, ndepth),
99 layerColor=get_color(rel),
100 suffix=get_suffix(rel),
101 entry=rel.name)
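# Illustrative sketch of inspect()'s output (shape only, colors omitted;
# the charm name and entries are hypothetical):
#   Inspect cs:trusty/mysql
#   <one legend line per layer>
#   /path/to/trusty/mysql
#    ├─── README.md
#    └─── hooks
#          └─── install *
# where "+" and "*" appear to mark entries added or changed since the
# recorded manifest, per delta_signatures above.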
0102
=== added file 'charmtools/compose/tactics.py'
--- charmtools/compose/tactics.py 1970-01-01 00:00:00 +0000
+++ charmtools/compose/tactics.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,449 @@
1import logging
2import json
3from ruamel import yaml
4
5from path import path
6from charmtools import utils
7
8log = logging.getLogger(__name__)
9
10
11class Tactic(object):
12 """
13 Tactics are first considered in the context of the config layer being
14 called the config layer will attempt to (using its author provided info)
15 create a tactic for a given file. That will later be intersected with any
16 later layers to create a final single plan for each element of the output
17 charm.
18
19 Callable that will implement some portion of the charm composition
20 Subclasses should implement __str__ and __call__ which should take whatever
21 actions are needed.
22 """
23 kind = "static" # used in signatures
24
25 def __init__(self, entity, current, target, config):
26 self.entity = entity
27 self._current = current
28 self._target = target
29 self._raw_data = None
30 self._config = config
31
32 def __call__(self):
33 raise NotImplementedError
34
35 def __str__(self):
36 return "{}: {} -> {}".format(
37 self.__class__.__name__, self.entity, self.target_file)
38
39 @property
40 def current(self):
41 """The file in the current layer under consideration"""
42 return self._current
43
44 @property
45 def target(self):
46 """The target (final) layer."""
47 return self._target
48
49 @property
50 def relpath(self):
51 return self.entity.relpath(self.current.directory)
52
53 @property
54 def target_file(self):
55 target = self.target.directory / self.relpath
56 return target
57
58 @property
59 def layer_name(self):
60 return self.current.directory.name
61
62 @property
63 def repo_path(self):
64 return path("/".join(self.current.directory.splitall()[-2:]))
65
66 @property
67 def config(self):
68 # Return the config of the layer *above* you
69 # as that is the one that controls your compositing
70 return self._config
71
72 def combine(self, existing):
73 """Produce a tactic informed by the last tactic for an entry.
 74 This is used when a rule in a higher-level charm overrides something
 75 in one of its bases, for example."""
76 return self
77
78 @classmethod
79 def trigger(cls, relpath):
80 """Should the rule trigger for a given path object"""
81 return False
82
83 def sign(self):
84 """return sign in the form {relpath: (origin layer, SHA256)}
85 """
86 target = self.target_file
87 sig = {}
88 if target.exists() and target.isfile():
89 sig[self.relpath] = (self.current.url,
90 self.kind,
91 utils.sign(self.target_file))
92 return sig
93
94 def lint(self):
95 return True
96
97 def read(self):
98 return None
99
100 def build(self):
101 pass
102
103
104class ExactMatch(object):
105 FILENAME = None
106
107 @classmethod
108 def trigger(cls, relpath):
109 return cls.FILENAME == relpath
110
111
112class CopyTactic(Tactic):
113 def __call__(self):
114 if self.entity.isdir():
115 return
116 should_ignore = utils.ignore_matcher(self.target.config.ignores)
117 if not should_ignore(self.relpath):
118 return
119 target = self.target_file
120 log.debug("Copying %s: %s", self.layer_name, target)
121 # Ensure the path exists
122 target.dirname().makedirs_p()
123 if (self.entity != target) and not target.exists() \
124 or not self.entity.samefile(target):
125 data = self.read()
126 if data:
127 target.write_bytes(data)
128 self.entity.copymode(target)
129 else:
130 self.entity.copy2(target)
131
132 def __str__(self):
133 return "Copy {}".format(self.entity)
134
135 @classmethod
136 def trigger(cls, relpath):
137 return True
138
139
140class InterfaceCopy(Tactic):
141 def __init__(self, interface, relation_name, target, config):
142 self.interface = interface
143 self.relation_name = relation_name
144 self._target = target
145 self._config = config
146
147 @property
148 def target(self):
149 return self._target / "hooks/relations" / self.interface.name
150
151 def __call__(self):
152 # copy the entire tree into the
153 # hooks/relations/<interface>
154 # directory
155 log.debug("Copying Interface %s: %s",
156 self.interface.name, self.target)
157 # Ensure the path exists
158 if self.target.exists():
159 # XXX: fix this to do actual updates
160 return
161 ignorer = utils.ignore_matcher(self.config.ignores)
162 for entity, _ in utils.walk(self.interface.directory,
163 lambda x: True,
164 matcher=ignorer,
165 kind="files"):
166 target = entity.relpath(self.interface.directory)
167 target = (self.target / target).normpath()
168 target.parent.makedirs_p()
169 entity.copy2(target)
170 init = self.target / "__init__.py"
171 if not init.exists():
172 # ensure we can import from here directly
173 init.touch()
174
175 def __str__(self):
176 return "Copy Interface {}".format(self.interface.name)
177
178 def sign(self):
179 """return sign in the form {relpath: (origin layer, SHA256)}
180 """
181 sigs = {}
182 for entry, sig in utils.walk(self.target,
183 utils.sign, kind="files"):
184 relpath = entry.relpath(self._target.directory)
185 sigs[relpath] = (self.interface.url, "static", sig)
186 return sigs
187
188 def lint(self):
189 for entry in self.interface.directory.walkfiles():
190 if entry.splitext()[1] != ".py":
191 continue
192 relpath = entry.relpath(self._target.directory)
193 target = self._target.directory / relpath
194 if not target.exists():
195 continue
196 return utils.delta_python_dump(entry, target,
197 from_name=relpath)
198
199
200class InterfaceBind(InterfaceCopy):
201 def __init__(self, interface, relation_name, kind, target, config):
202 self.interface = interface
203 self.relation_name = relation_name
204 self.kind = kind
205 self._target = target
206 self._config = config
207
208 DEFAULT_BINDING = """#!/usr/bin/env python
209
210# Load modules from $CHARM_DIR/lib
211import sys
212sys.path.append('lib')
213
214# This will load and run the appropriate @hook and other decorated
215# handlers from $CHARM_DIR/reactive, $CHARM_DIR/hooks/reactive,
216# and $CHARM_DIR/hooks/relations.
217#
218# See https://jujucharms.com/docs/stable/getting-started-with-charms-reactive
219# for more information on this pattern.
220from charms.reactive import main
221main('{}')
222"""
223
224 def __call__(self):
225 for hook in ['joined', 'changed', 'broken', 'departed']:
226 target = self._target / "hooks" / "{}-relation-{}".format(
227 self.relation_name, hook)
228 if target.exists():
229 # XXX: warn
230 continue
231 target.parent.makedirs_p()
232 target.write_text(self.DEFAULT_BINDING.format(self.relation_name))
233 target.chmod(0755)
234
235 def sign(self):
236 """return sign in the form {relpath: (origin layer, SHA256)}
237 """
238 sigs = {}
239 for hook in ['joined', 'changed', 'broken', 'departed']:
240 target = self._target / "hooks" / "{}-relation-{}".format(
241 self.relation_name, hook)
242 rel = target.relpath(self._target.directory)
243 sigs[rel] = (self.interface.url,
244 "dynamic",
245 utils.sign(target))
246 return sigs
247
248 def __str__(self):
249 return "Bind Interface {}".format(self.interface.name)
250
251
252class ManifestTactic(ExactMatch, Tactic):
253 FILENAME = ".composer.manifest"
254
255 def __call__(self):
256 # Don't copy manifests, they are regenerated
257 pass
258
259
260class SerializedTactic(ExactMatch, Tactic):
261 kind = "dynamic"
262
263 def __init__(self, *args, **kwargs):
264 super(SerializedTactic, self).__init__(*args, **kwargs)
265 self.data = None
266
267 def combine(self, existing):
268 # Invoke the previous tactic
269 existing()
270 if existing.data is not None:
271 self.data = existing.data
272 return self
273
274 def __call__(self):
275 data = self.load(self.entity.open())
276 # self.data represents the product of previous layers
277 if self.data:
278 data = utils.deepmerge(self.data, data)
279
280 # Now apply any rules from config
281 config = self.config
282 if config:
283 section = config.get(self.section)
284 if section:
285 dels = section.get('deletes', [])
286 if self.prefix:
287 namespace = data[self.prefix]
288 else:
289 namespace = data
290 for key in dels:
291 utils.delete_path(key, namespace)
292 self.data = data
293 if not self.target_file.parent.exists():
294 self.target_file.parent.makedirs_p()
295 self.dump(data)
296 return data
297
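# Illustrative note (added commentary, mirroring tests/test_compose.py):
# if a base layer's config.yaml defines options "bind-address" and "vip",
# and a higher layer's composer.yaml contains
#     config:
#       deletes:
#         - vip
# then the generated config.yaml keeps "bind-address" but drops "vip".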
298
299class YAMLTactic(SerializedTactic):
300 """Rule Driven YAML generation"""
301 prefix = None
302
303 def load(self, fn):
304 return yaml.load(fn, Loader=yaml.RoundTripLoader)
305
306 def dump(self, data):
307 yaml.dump(data, self.target_file.open('w'),
308 Dumper=yaml.RoundTripDumper,
309 default_flow_style=False)
310
311
312class JSONTactic(SerializedTactic):
313 """Rule Driven JSON generation"""
314 prefix = None
315
316 def load(self, fn):
317 return json.load(fn)
318
319 def dump(self, data):
320 json.dump(data, self.target_file.open('w'), indent=2)
321
322
323class ComposerYAML(YAMLTactic, ExactMatch):
324 FILENAME = "composer.yaml"
325
326 def read(self):
327 self._raw_data = self.load(self.entity.open())
328
329 def __call__(self):
330 # rewrite includes to be the current source
331 data = self._raw_data
332 if data is None:
333 return
334 # The split should result in the series/charm path only
335 # XXX: there will be strange interactions with cs: vs local:
336 if 'is' not in data:
337 data['is'] = str(self.current.url)
338 inc = data.get('includes', [])
339 norm = []
340 for i in inc:
341 if ":" in i:
342 norm.append(i)
343 else:
344 # Attempt to normalize to a repository base
345 norm.append("/".join(path(i).splitall()[-2:]))
346 if norm:
347 data['includes'] = norm
348 if not self.target_file.parent.exists():
349 self.target_file.parent.makedirs_p()
350 self.dump(data)
351 return data
352
353
354class MetadataYAML(YAMLTactic):
355 """Rule Driven metadata.yaml generation"""
356 section = "metadata"
357 FILENAME = "metadata.yaml"
358 KEY_ORDER = ["name", "summary", "maintainer",
359 "description", "tags",
360 "requires", "provides", "peers"]
361
362 def dump(self, data):
363 if not data:
364 return
365 final = yaml.comments.CommentedMap()
366 # assemble keys in known order
367 for k in self.KEY_ORDER:
368 if k in data:
369 final[k] = data[k]
370 missing = set(data.keys()) - set(self.KEY_ORDER)
371 for k in sorted(missing):
372 final[k] = data[k]
373 super(MetadataYAML, self).dump(final)
374
375
376class ConfigYAML(MetadataYAML):
377 """Rule driven config.yaml generation"""
378 section = "config"
379 prefix = "options"
380 FILENAME = "config.yaml"
381
382
383class InstallerTactic(Tactic):
384 def __str__(self):
385 return "Installing software to {}".format(self.relpath)
386
387 @classmethod
388 def trigger(cls, relpath):
389 ext = relpath.splitext()[1]
390 return ext in [".pypi", ]
391
392 def __call__(self):
393 # install package reference in trigger file
394 # in place directory of target
395 # XXX: Should this map multiline to "-r", self.entity
396 spec = self.entity.text().strip()
397 target = self.target_file.dirname()
398 target_dir = target / path(spec.split(" ", 1)[0]).normpath().namebase
399 log.debug("pip installing {} as {}".format(
400 spec, target_dir))
401 with utils.tempdir() as temp_dir:
402 # We do this dance so we don't have
403 # to guess package and .egg file names
404 # we move everything in the tempdir to the target
405 # and track it for later use in sign()
406 utils.Process(("pip",
407 "install",
408 "-U",
409 "--exists-action",
410 "i",
411 "-t",
412 temp_dir,
413 spec)).throw_on_error()()
414 dirs = temp_dir.listdir()
415 self._tracked = []
416 for d in dirs:
417 d.move(target)
418 self._tracked.append(target / d.name)  # d itself was moved into target
419
420 def sign(self):
421 """return sign in the form {relpath: (origin layer, SHA256)}
422 """
423 sigs = {}
424 for d in self._tracked:
425 for entry, sig in utils.walk(d,
426 utils.sign, kind="files"):
427 relpath = entry.relpath(self._target.directory)
428 sigs[relpath] = (self.current.url, "dynamic", sig)
429 return sigs
430
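# Illustrative trigger file (added commentary, mirroring
# tests/trusty/chlayer/hooks/charmhelpers.pypi): a hooks/<name>.pypi file
# whose content is a pip requirement spec, e.g. the single line
#     charmhelpers
# causes that package to be pip-installed alongside the trigger file in
# the generated charm.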
431
432def load_tactic(dpath, basedir):
433 """Load a tactic from the current layer using a dotted path. The last
434 element in the path should be a Tactic subclass
435 """
436 obj = utils.load_class(dpath, basedir)
437 if not issubclass(obj, Tactic):
438 raise ValueError("Expected to load a tactic for %s" % dpath)
439 return obj
440
441
442DEFAULT_TACTICS = [
443 ManifestTactic,
444 InstallerTactic,
445 MetadataYAML,
446 ConfigYAML,
447 ComposerYAML,
448 CopyTactic
449]
0450
=== added file 'charmtools/utils.py'
--- charmtools/utils.py 1970-01-01 00:00:00 +0000
+++ charmtools/utils.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,518 @@
1import copy
2import collections
3import hashlib
4import json
5import logging
6import os
7import re
8import subprocess
9import sys
10import tempfile
11import time
12from contextlib import contextmanager
13
14from .compose.diff_match_patch import diff_match_patch
15import blessings
16import pathspec
17from path import path
18
19log = logging.getLogger('utils')
20
21
22@contextmanager
23def cd(directory, make=False):
24 cwd = os.getcwd()
25 if not os.path.exists(directory) and make:
26 os.makedirs(directory)
27 os.chdir(directory)
28 try:
29 yield
30 finally:
31 os.chdir(cwd)
32
33
34@contextmanager
35def tempdir():
36 dirname = path(tempfile.mkdtemp())
37 with cd(dirname):
38 yield dirname
39 dirname.rmtree_p()
40
41
42def deepmerge(dest, src):
43 """
44 Deep merge of two dicts.
45
46 This is destructive (`dest` is modified), but values
47 from `src` are passed through `copy.deepcopy`.
48 """
49 for k, v in src.iteritems():
50 if dest.get(k) and isinstance(v, dict):
51 deepmerge(dest[k], v)
52 else:
53 dest[k] = copy.deepcopy(v)
54 return dest
55
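# Illustrative usage (added commentary, not in the original source):
#     deepmerge({'a': {'x': 1}}, {'a': {'y': 2}})
#     # -> {'a': {'x': 1, 'y': 2}}; the first argument is mutated in place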
56
57def delete_path(path, obj):
58 """Delete a dotted path from object, assuming each level is a dict"""
59 parts = path.split('.')
60 for p in parts[:-1]:
61 obj = obj[p]
62 del obj[parts[-1]]
63
64
65class NestedDict(dict):
66 def __init__(self, dict_or_iterable=None, **kwargs):
67 if dict_or_iterable:
68 if isinstance(dict_or_iterable, dict):
69 self.update(dict_or_iterable)
70 elif isinstance(dict_or_iterable, collections.Iterable):
71 for k, v in dict_or_iterable:
72 self[k] = v
73 if kwargs:
74 self.update(kwargs)
75
76 def __setitem__(self, key, value):
77 key = key.split('.')
78 o = self
79 for part in key[:-1]:
80 o = o.setdefault(part, self.__class__())
81 dict.__setitem__(o, key[-1], value)
82
83 def __getitem__(self, key):
84 o = self
85 if '.' in key:
86 parts = key.split('.')
87 key = parts[-1]
88 for part in parts[:-1]:
89 o = o[part]
90
91 return dict.__getitem__(o, key)
92
93 def __getattr__(self, key):
94 try:
95 return self[key]
96 except KeyError:
97 raise AttributeError(key)
98
99 def get(self, key, default=None):
100 try:
101 return self[key]
102 except KeyError:
103 return default
104
105 def update(self, other):
106 deepmerge(self, other)
107
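# Illustrative usage of the dotted-path access above (added commentary):
#     cfg = NestedDict()
#     cfg['metadata.name'] = 'foo'    # creates {'metadata': {'name': 'foo'}}
#     cfg['metadata.name']            # -> 'foo'
#     cfg.metadata.name               # attribute access also works
#     cfg.get('metadata.missing', 1)  # -> 1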
108
109class ProcessResult(object):
110 def __init__(self, command, exit_code, stdout, stderr):
111 self.command = command
112 self.exit_code = exit_code
113 self.stdout = stdout
114 self.stderr = stderr
115
116 def __repr__(self):
117 return '<ProcessResult "%s" result %s>' % (self.cmd, self.exit_code)
118
119 @property
120 def cmd(self):
121 return ' '.join(self.command)
122
123 @property
124 def output(self):
125 result = ''
126 if self.stdout:
127 result += self.stdout
128 if self.stderr:
129 result += self.stderr
130 return result.strip()
131
132 @property
133 def json(self):
134 if self.stdout:
135 return json.loads(self.stdout)
136 return None
137
138 def __eq__(self, other):
139 return self.exit_code == other
140
141 def __bool__(self):
142 return self.exit_code == 0
143
144 __nonzero__ = __bool__
145
146 def throw_on_error(self):
147 if not bool(self):
148 raise subprocess.CalledProcessError(
149 self.exit_code, self.command, output=self.output)
150
151
152class Process(object):
153 def __init__(self, command=None, throw=False, log=log, **kwargs):
154 if isinstance(command, str):
155 command = (command, )
156 self.command = command
157 self._throw_on_error = throw
158 self.log = log
159 self._kw = kwargs
160
161 def __repr__(self):
162 return "<Command %s>" % (self.command, )
163
164 def throw_on_error(self, throw=True):
165 self._throw_on_error = throw
166 return self
167
168 def __call__(self, *args, **kw):
169 kwargs = dict(stdout=subprocess.PIPE,
170 stderr=subprocess.STDOUT)
171 if self._kw:
172 kwargs.update(self._kw)
173 kwargs.update(kw)
174 if self.command:
175 all_args = self.command + args
176 else:
177 all_args = args
178 if 'env' not in kwargs:
179 kwargs['env'] = os.environ
180
181 p = subprocess.Popen(all_args, **kwargs)
182 stdout, stderr = p.communicate()
183 self.log.debug(stdout)
184 stdout = stdout.strip()
185 if stderr is not None:
186 stderr = stderr.strip()
187 self.log.debug(stderr)
188 exit_code = p.poll()
189 result = ProcessResult(all_args, exit_code, stdout, stderr)
190 self.log.debug("process: %s (%d)", result.cmd, result.exit_code)
191 if self._throw_on_error:
192 result.throw_on_error()
193 return result
194
195command = Process
196
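# Illustrative usage (added commentary; assumes 'echo' and 'false' exist
# on PATH):
#     result = Process(('echo', 'hello'))()
#     result.exit_code  # -> 0, so bool(result) is True
#     result.output     # -> 'hello'
#     Process(('false',)).throw_on_error()()  # raises CalledProcessError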
197
198class Commander(object):
199 def __init__(self, log=log):
200 self.log = log
201
202 def set_log(self, logger):
203 self.log = logger
204
205 def __getattr__(self, key):
206 return command((key,), log=self.log)
207
208 def check(self, *args, **kwargs):
209 kwargs.update({'log': self.log})
210 return command(command=args, **kwargs).throw_on_error()
211
212 def __call__(self, *args, **kwargs):
213 kwargs.update({'log': self.log})
214 return command(command=args, shell=True, **kwargs)
215
216
217sh = Commander()
218dig = Process(('dig', '+short'))
219api_endpoints = Process(('juju', 'api-endpoints'))
220
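# Illustrative usage of the helpers above (added commentary):
#     sh.ls('-la')                 # attribute access builds Process(('ls',)) and runs it
#     sh.check('git', 'status')()  # throw_on_error variant; raises on failure
#     dig('example.com')           # runs: dig +short example.com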
221
222def wait_for(timeout, interval, *callbacks, **kwargs):
223 """
224 Repeatedly try callbacks until all return True
225
226 This will wait interval seconds between attempts and will error out
227 after timeout has been exceeded.
228
229 Callbacks are called with no arguments.
230
231 Setting timeout to zero will loop until cancelled, power runs out,
232 hardware fails, or the heat death of the universe.
233 """
234 start = time.time()
235 if timeout:
236 end = start + timeout
237 else:
238 end = 0
239
240 bar = kwargs.get('bar', None)
241 message = kwargs.get('message', None)
242 once = 1
243 while True:
244 passes = True
245 if end > 0 and time.time() > end:
246 raise OSError("Timeout exceeded in wait_for")
247 if bar:
248 bar.next(once, message=message)
249 if once == 1:
250 once = 0
251 if int(time.time()) % interval == 0:
252 for callback in callbacks:
253 result = callback()
254 passes = passes & bool(result)
255 if passes is False:
256 break
257 if passes is True:
258 break
259 time.sleep(1)
260
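# Illustrative usage (added commentary): poll every 5 seconds, for at most
# 60 seconds, until the callback returns True:
#     wait_for(60, 5, lambda: path('out/trusty/foo').exists())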
261
262def until(*callbacks, **kwargs):
263 return wait_for(0, 20, *callbacks, **kwargs)
264
265
266def retry(attempts, *callbacks, **kwargs):
267 """
268 Repeatedly try callbacks a fixed number of times or until all return True
269 """
270 for attempt in xrange(attempts):
271 if 'bar' in kwargs:
272 kwargs['bar'].next(attempt == 0, message=kwargs.get('message'))
273 for callback in callbacks:
274 if not callback():
275 break
276 else:
277 break
278 else:
279 raise OSError("Retry attempts exceeded")
280 return True
281
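# Illustrative usage (added commentary): try the callback up to 3 times,
# raising OSError if it never returns True:
#     retry(3, lambda: path('/tmp/ready').exists())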
282
283def which(program):
284 def is_exe(fpath):
285 return os.path.isfile(fpath) and os.access(fpath, os.X_OK)
286
287 fpath, fname = os.path.split(program)
288 if fpath:
289 if is_exe(program):
290 return program
291 else:
292 for fpath in os.environ["PATH"].split(os.pathsep):
293 fpath = fpath.strip('"')
294 exe_file = os.path.join(fpath, program)
295 if is_exe(exe_file):
296 return exe_file
297 return None
298
299
300def load_class(dpath, workingdir=None):
301 # we expect the last element of the path
302 if not workingdir:
303 workingdir = os.getcwd()
304 with cd(workingdir):
305 modpath, classname = dpath.rsplit('.', 1)
306 modpath = path(modpath.replace(".", "/"))
307 if not modpath.exists():
308 modpath += ".py"
309 if not modpath.exists():
310 raise OSError("Unable to load {} from {}".format(
311 dpath, workingdir))
312 namespace = {}
313 execfile(modpath, globals(), namespace)
314 klass = namespace.get(classname)
315 if klass is None:
316 raise ImportError("Unable to load class {} at {}".format(
317 classname, dpath))
318 return klass
319
320
321def walk(pathobj, fn, matcher=None, kind=None, **kwargs):
322 """walk pathobj calling fn on each matched entry yielding each
323 result. If kind is 'files' or 'dir' only that type of entry will
324 be walked. matcher is an optional function returning bool indicating
325 whether the entry should be processed.
326 """
327 p = path(pathobj)
328 walker = p.walk
329 if kind == "files":
330 walker = p.walkfiles
331 elif kind == "dir":
332 walker = p.walkdirs
333
334 for entry in walker():
335 relpath = entry.relpath(pathobj)
336 if matcher and not matcher(relpath):
337 continue
338 yield (entry, fn(entry, **kwargs))
339
340
341def ignore_matcher(ignores=[]):
342 spec = pathspec.PathSpec.from_lines(pathspec.GitIgnorePattern, ignores)
343
344 def matcher(entity):
345 return entity not in spec.match_files((entity,))
346 return matcher
347
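# Illustrative usage (added commentary): the returned matcher is True for
# files that should be KEPT (i.e. not matched by the ignore patterns):
#     keep = ignore_matcher(['.bzr', '*.pyc'])
#     keep('hooks/install')  # -> True, process this file
#     keep('hooks/foo.pyc')  # -> False, skip it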
348
349def sign(pathobj):
350 p = path(pathobj)
351 if not p.isfile():
352 return None
353 return hashlib.sha256(p.bytes()).hexdigest()
354
355
356def delta_signatures(manifest_filename, ignorer=None):
357 md = path(manifest_filename)
358 repo = md.normpath().dirname()
359
360 expected = json.load(md.open())
361 current = {}
362 for rel, sig in walk(repo, sign):
363 rel = rel.relpath(repo)
364 current[rel] = sig
365 add, change, delete = set(), set(), set()
366
367 for p, s in current.items():
368 fp = repo / p
369 if not fp.isfile():
370 continue
371 if ignorer and not ignorer(p):
372 continue
373
374 if p not in expected["signatures"]:
375 add.add(p)
376 continue
377 # layer, kind, sig
378 # don't include items generated only for the last layer
379 if expected["signatures"][p][0] == "composer":
380 continue
381 if expected["signatures"][p][2] != s:
382 change.add(p)
383
384 for p, d in expected["signatures"].items():
385 if p not in current:
386 delete.add(path(p))
387 return add, change, delete
388
389
390class ColoredFormatter(logging.Formatter):
391
392 def __init__(self, terminal, *args, **kwargs):
393 super(ColoredFormatter, self).__init__(*args, **kwargs)
394 self._terminal = terminal
395
396 def format(self, record):
397 output = super(ColoredFormatter, self).format(record)
398 if record.levelno >= logging.CRITICAL:
399 line_color = self._terminal.bold_yellow_on_red
400 elif record.levelno >= logging.ERROR:
401 line_color = self._terminal.red
402 elif record.levelno >= logging.WARNING:
403 line_color = self._terminal.yellow
404 elif record.levelno >= logging.INFO:
405 line_color = self._terminal.green
406 else:
407 line_color = self._terminal.cyan
408 return line_color(output)
409
410
411class TermWriter(object):
412 def __init__(self, fp=None, term=None):
413 if fp is None:
414 fp = sys.stdout
415 self.fp = fp
416 if term is None:
417 term = blessings.Terminal()
418 self.term = term
419
420 def __getattr__(self, key):
421 return getattr(self.term, key)
422
423 def write(self, msg, *args, **kwargs):
424 if 't' in kwargs:
425 raise ValueError("Using reserved token 't' in TermWriter.write")
426 kwargs['t'] = self.term
427 self.fp.write(msg.format(*args, **kwargs))
428
429
430class _O(dict):
431 def __getattr__(self, k):
432 return self[k]
433
434REACTIVE_PATTERNS = [
435 re.compile("\s*@when"),
436 re.compile(".set_state\(")
437]
438
439
440def delta_python(orig, dest, patterns=REACTIVE_PATTERNS, context=2):
441 """Delta two python files looking for certain patterns"""
442 if isinstance(orig, path):
443 od = orig.text()
444 elif hasattr(orig, 'read'):
445 od = orig.read()
446 else:
447 raise TypeError("Expected path() or file(), got %s" % type(orig))
448 if isinstance(dest, path):
449 dd = dest.text()
450 elif hasattr(dest, 'read'):
451 dd = dest.read()
452 else:
453 raise TypeError("Expected path() or file(), got %s" % type(dest))
454
455 differ = diff_match_patch()
456 linect = 0
457 lastMatch = None
458 for res in differ.diff_main(od, dd):
459 if res[0] == diff_match_patch.DIFF_EQUAL:
460 linect += res[1].count('\n')
461 lastMatch = res[:]
462 continue
463 elif res[0] == diff_match_patch.DIFF_INSERT:
464 linect += res[1].count('\n')
465 else:
466 linect -= res[1].count('\n')
467
468 for p in patterns:
469 if lastMatch and p.search(lastMatch[1]):
470 yield [linect, lastMatch, res]
471 break
472
473
474def delta_python_dump(orig, dest, patterns=REACTIVE_PATTERNS,
475 context=2, term=None,
476 from_name=None, to_name=None):
477 if term is None:
478 term = TermWriter()
479
480 def norm_sources(orig, dest):
481 if from_name:
482 oname = from_name
483 else:
484 oname = orig
485 if to_name:
486 dname = to_name
487 else:
488 dname = dest
489 return _O({'orig_name': oname, 'dest_name': dname})
490
491 def prefix_lines(lines, lineno):
492 if isinstance(lines, str):
493 lines = lines.splitlines()
494 for i, l in enumerate(lines):
495 lines[i] = "%-5d| %s" % (lineno + i, l)
496 return "\n".join(lines)
497
498 i = 0
499 for lineno, last, current in delta_python(orig, dest, patterns, context):
500 # pull enough context
501 if last:
502 context_lines = last[1].splitlines()[-context:]
503 message = norm_sources(orig, dest)
504 message['context'] = prefix_lines(context_lines, lineno - context)
505 message['lineno'] = lineno
506 message['delta'] = current[1]
507 s = {diff_match_patch.DIFF_EQUAL: term.normal,
508 diff_match_patch.DIFF_INSERT: term.green,
509 diff_match_patch.DIFF_DELETE: term.red}[current[0]]
510 message['status_color'] = s
511 # output message
512 term.write("{t.bold}{m.orig_name}{t.normal} --> "
513 "{t.bold}{m.dest_name}{t.normal}:\n",
514 m=message)
515 term.write("{m.context}{m.status_color}{m.delta}{t.normal}\n",
516 m=message)
517 i += 1
518 return i == 0
0519
=== added file 'doc/source/compose-intro.md'
--- doc/source/compose-intro.md 1970-01-01 00:00:00 +0000
+++ doc/source/compose-intro.md 2015-08-31 19:32:56 +0000
@@ -0,0 +1,18 @@
1charm compose/refresh combines various included layers to produce an output
2charm. These layers can be maintained and updated separately and then the
3refresh process can be used to regenerate the charm.
4
5COMPOSER_PATH is a ':' delimited path list used to resolve local include matches.
6INTERFACE_PATH is the directory from which interfaces will be resolved.
7
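For example, the test suite (tests/test_compose.py later in this diff) sets
these programmatically; the paths here are hypothetical:

    import os
    os.environ["COMPOSER_PATH"] = "/path/to/layers"
    os.environ["INTERFACE_PATH"] = "/path/to/interfaces"
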
8Examples:
9charm compose -o /tmp/out trusty/mycharm
10
11This will generate /tmp/out/trusty/mycharm with all the includes specified.
12
13WORKFLOW
14========
15
16Typically you'll make changes in the layer owning the file(s) in question
17and then recompose the charm and deploy or upgrade-charm the result. You won't
18want to edit the generated charm directly.
019
=== added file 'doc/source/composer.md'
--- doc/source/composer.md 1970-01-01 00:00:00 +0000
+++ doc/source/composer.md 2015-08-31 19:32:56 +0000
@@ -0,0 +1,123 @@
1Juju Charm Composition
2======================
3
4Status | *Alpha*
5-------|--------
6
7This is a prototype designed to flesh out requirements around Charm
8Composition. Today it's very common to fork charms for minor changes, or to have
9to use subordinate charms to take advantage of frameworks where you need to
10deploy a custom workload to an existing runtime. With charm composition you
11should be able to include from a charm that provides the runtime (or just some
12well-contained feature set) and maintain your delta as a 'layer' that gets
13composed with its base to produce a new charm.
14
15This process should be runnable repeatedly allowing charms to be regenerated.
16
17
18This work is currently feature-incomplete but does allow the generation of
19simple charms and useful basic composition. It is my hope that this will
20encourage discussion of the feature set needed to one day have charm
21composition supported natively in juju-core.
22
23
24Today the system can be run as follows:
25
26 ./juju_compose.py -o <output_repo> <charm to build from>
27
28So you might use the included (very unrealistic) test case like so:
29
30 ./juju_compose -o out -n foo tests/trusty/tester
31
32Running this should produce a charm in out/trusty/foo which is composed
33according to the composer.yaml file in tests/trusty/tester. While this isn't
34documented yet, it shows some of the basic features of diverting hooks (for
35pre/post hooks support), replacing files, merging metadata.yaml changes, etc.
36
37It should be enough to give you an idea of how it works. In order for this example
38to run you'll need to pip install bundletester, as it shares some code with that
39project.
40
41Theory
42======
43
44A generated charm is composed of layers. The generator acts almost like a
45compiler taking the input from each layer and producing an output file in the
46resultant charm.
47
48The generator keeps track of which layer owns each file and allows layers to
49update files they own should the charm be refreshed later.
50
51The generated charm itself should be treated as immutable. The top layer that
52was used to generate it is where user level modifications should live.
53
54
55Setting Up your Repo
56====================
57This currently allows for two new environment variables when run:
58 COMPOSER_PATH: a ':' separated list of JUJU_REPOSITORY paths that should be searched for includes
59 INTERFACE_PATH: a ':' separated list of paths to resolve interface:_name_ includes from.
60
61JUJU_REPOSITORY entries take the usual format *series*/*charm*.
62INTERFACE repos take the format *interface_name*, where interface_name is
63the name as it appears in metadata.yaml.
64
65Composition Types
66=================
67
68Each file in each layer gets matched by a single Tactic. Tactics implement how
69the data in a file moves from one layer to the next (and finally to the target
70charm). By default this will be a simple copy, but in the case of certain files
71(mostly known YAML files like metadata.yaml and config.yaml) each layer is
72combined with the previous layers before being written.
73
74Normally the default tactics are fine, but you have the ability in
75composer.yaml to list a set of Tactic classes that will be checked before the
76defaults and control how data moves from one layer to the next.
77
78
79composer.yaml
80=============
81Each layer used to build a charm can have a composer.yaml file. The top layer
82(the one actually invoked from the command line) must. These tell the generator what to do,
83ranging from which base layers to include, to which interfaces. They also allow for
84the inclusion of specialized directives for processing some types of files.
85
86 Keys:
87 includes: ["trusty/mysql", "interface:mysql"]
88 tactics: [ dottedpath.toTacticClass, ]
89 config:
90 deletes:
91 - key names
92 metadata:
93 deletes:
94 - key names
95
96
97Includes is a list of one or more layers and interfaces that should be
98composited. Those layers may themselves have other includes and/or
99interfaces.
100
101Tactics is a list of Tactics to be loaded. See juju_compose.tactics.Tactics for
102the default interface. You'll typically need to implement at least a trigger() method
103and a __call__() method.
104
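For example, a minimal custom tactic might look like the following sketch.
This is illustrative only; it assumes the Tactic API from
charmtools/compose/tactics.py shown earlier in this diff, where instances
expose self.entity, self.relpath and self.target_file:

    from charmtools.compose.tactics import Tactic

    class UpperCaseTactic(Tactic):
        """Copy *.txt files from a layer, upper-casing their content."""

        @classmethod
        def trigger(cls, relpath):
            # Claim only .txt files; everything else falls through
            # to the default tactics (CopyTactic, etc.).
            return relpath.splitext()[1] == ".txt"

        def __call__(self):
            target = self.target_file
            target.dirname().makedirs_p()
            target.write_text(self.entity.text().upper())
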
105config and metadata take optional lists of keys to remove from config.yaml and
106metadata.yaml when generating their data. This allows for charms to, for
107example, narrow what they expose to clients.
108
109
110Inspect
111=======
112
113If you've already generated a charm you can see which layers own which files by
114using the included **juju inspect [charmdir]** command. This should render a
115tree of the files in the color of each layer. Each layer's assigned color is
116presented in a legend at the top of the output.
117
118TODO:
119- lint about methods in base layer not provided/extended in lower
120layers
121
122
123
0124
=== removed file 'ez_setup.py'
--- ez_setup.py 2013-02-25 21:54:26 +0000
+++ ez_setup.py 1970-01-01 00:00:00 +0000
@@ -1,272 +0,0 @@
1#!python
2
3# NOTE TO LAUNCHPAD DEVELOPERS: This is a bootstrapping file from the
4# setuptools project. It is imported by our setup.py.
5
6"""Bootstrap setuptools installation
7
8If you want to use setuptools in your package's setup.py, just include this
9file in the same directory with it, and add this to the top of your setup.py::
10
11 from ez_setup import use_setuptools
12 use_setuptools()
13
14If you want to require a specific version of setuptools, set a download
15mirror, or use an alternate download directory, you can do so by supplying
16the appropriate options to ``use_setuptools()``.
17
18This file can also be run as a script to install or upgrade setuptools.
19"""
20import sys
21DEFAULT_VERSION = "0.6c11"
22DEFAULT_URL = "http://pypi.python.org/packages/%s/s/setuptools/" \
23 % sys.version[:3]
24
25md5_data = {
26 'setuptools-0.6b1-py2.3.egg': '8822caf901250d848b996b7f25c6e6ca',
27 'setuptools-0.6b1-py2.4.egg': 'b79a8a403e4502fbb85ee3f1941735cb',
28 'setuptools-0.6b2-py2.3.egg': '5657759d8a6d8fc44070a9d07272d99b',
29 'setuptools-0.6b2-py2.4.egg': '4996a8d169d2be661fa32a6e52e4f82a',
30 'setuptools-0.6b3-py2.3.egg': 'bb31c0fc7399a63579975cad9f5a0618',
31 'setuptools-0.6b3-py2.4.egg': '38a8c6b3d6ecd22247f179f7da669fac',
32 'setuptools-0.6b4-py2.3.egg': '62045a24ed4e1ebc77fe039aa4e6f7e5',
33 'setuptools-0.6b4-py2.4.egg': '4cb2a185d228dacffb2d17f103b3b1c4',
34 'setuptools-0.6c1-py2.3.egg': 'b3f2b5539d65cb7f74ad79127f1a908c',
35 'setuptools-0.6c1-py2.4.egg': 'b45adeda0667d2d2ffe14009364f2a4b',
36 'setuptools-0.6c10-py2.3.egg': 'ce1e2ab5d3a0256456d9fc13800a7090',
37 'setuptools-0.6c10-py2.4.egg': '57d6d9d6e9b80772c59a53a8433a5dd4',
38 'setuptools-0.6c10-py2.5.egg': 'de46ac8b1c97c895572e5e8596aeb8c7',
39 'setuptools-0.6c10-py2.6.egg': '58ea40aef06da02ce641495523a0b7f5',
40 'setuptools-0.6c11-py2.3.egg': '2baeac6e13d414a9d28e7ba5b5a596de',
41 'setuptools-0.6c11-py2.4.egg': 'bd639f9b0eac4c42497034dec2ec0c2b',
42 'setuptools-0.6c11-py2.5.egg': '64c94f3bf7a72a13ec83e0b24f2749b2',
43 'setuptools-0.6c11-py2.6.egg': 'bfa92100bd772d5a213eedd356d64086',
44 'setuptools-0.6c2-py2.3.egg': 'f0064bf6aa2b7d0f3ba0b43f20817c27',
45 'setuptools-0.6c2-py2.4.egg': '616192eec35f47e8ea16cd6a122b7277',
46 'setuptools-0.6c3-py2.3.egg': 'f181fa125dfe85a259c9cd6f1d7b78fa',
47 'setuptools-0.6c3-py2.4.egg': 'e0ed74682c998bfb73bf803a50e7b71e',
48 'setuptools-0.6c3-py2.5.egg': 'abef16fdd61955514841c7c6bd98965e',
49 'setuptools-0.6c4-py2.3.egg': 'b0b9131acab32022bfac7f44c5d7971f',
50 'setuptools-0.6c4-py2.4.egg': '2a1f9656d4fbf3c97bf946c0a124e6e2',
51 'setuptools-0.6c4-py2.5.egg': '8f5a052e32cdb9c72bcf4b5526f28afc',
52 'setuptools-0.6c5-py2.3.egg': 'ee9fd80965da04f2f3e6b3576e9d8167',
53 'setuptools-0.6c5-py2.4.egg': 'afe2adf1c01701ee841761f5bcd8aa64',
54 'setuptools-0.6c5-py2.5.egg': 'a8d3f61494ccaa8714dfed37bccd3d5d',
55 'setuptools-0.6c6-py2.3.egg': '35686b78116a668847237b69d549ec20',
56 'setuptools-0.6c6-py2.4.egg': '3c56af57be3225019260a644430065ab',
57 'setuptools-0.6c6-py2.5.egg': 'b2f8a7520709a5b34f80946de5f02f53',
58 'setuptools-0.6c7-py2.3.egg': '209fdf9adc3a615e5115b725658e13e2',
59 'setuptools-0.6c7-py2.4.egg': '5a8f954807d46a0fb67cf1f26c55a82e',
60 'setuptools-0.6c7-py2.5.egg': '45d2ad28f9750e7434111fde831e8372',
61 'setuptools-0.6c8-py2.3.egg': '50759d29b349db8cfd807ba8303f1902',
62 'setuptools-0.6c8-py2.4.egg': 'cba38d74f7d483c06e9daa6070cce6de',
63 'setuptools-0.6c8-py2.5.egg': '1721747ee329dc150590a58b3e1ac95b',
64 'setuptools-0.6c9-py2.3.egg': 'a83c4020414807b496e4cfbe08507c03',
65 'setuptools-0.6c9-py2.4.egg': '260a2be2e5388d66bdaee06abec6342a',
66 'setuptools-0.6c9-py2.5.egg': 'fe67c3e5a17b12c0e7c541b7ea43a8e6',
67 'setuptools-0.6c9-py2.6.egg': 'ca37b1ff16fa2ede6e19383e7b59245a',
68}
69
70import os
71import sys
72
73try:
74 from hashlib import md5
75except ImportError:
76 from md5 import md5
77
78
79def _validate_md5(egg_name, data):
80 if egg_name in md5_data:
81 digest = md5(data).hexdigest()
82 if digest != md5_data[egg_name]:
83 print >>sys.stderr, (
84 "md5 validation of %s failed! (Possible download problem?)"
85 % egg_name
86 )
87 sys.exit(2)
88 return data
89
90
91def use_setuptools(
92 version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir,
93 download_delay=15
94):
95 """Automatically find/download setuptools and make it available on sys.path
96
97 `version` should be a valid setuptools version number that is available
98 as an egg for download under the `download_base` URL (which should end with
99 a '/'). `to_dir` is the directory where setuptools will be downloaded, if
100 it is not already available. If `download_delay` is specified, it should
101 be the number of seconds that will be paused before initiating a download,
102 should one be required. If an older version of setuptools is installed,
103 this routine will print a message to ``sys.stderr`` and raise SystemExit in
104 an attempt to abort the calling script.
105 """
106 was_imported = 'pkg_resources' in sys.modules \
107 or 'setuptools' in sys.modules
108
109 def do_download():
110 egg = download_setuptools(version, download_base, to_dir,
111 download_delay)
112 sys.path.insert(0, egg)
113 import setuptools
114 setuptools.bootstrap_install_from = egg
115 try:
116 import pkg_resources
117 except ImportError:
118 return do_download()
119 try:
120 pkg_resources.require("setuptools>=" + version)
121 return
122 except pkg_resources.VersionConflict, e:
123 if was_imported:
124 print >>sys.stderr, (
125 "The required version of setuptools (>=%s) is not available, and\n"
126 "can't be installed while this script is running. Please install\n"
127 " a more recent version first, using 'easy_install -U setuptools'."
128 "\n\n(Currently using %r)"
129 ) % (version, e.args[0])
130 sys.exit(2)
131 else:
132 del pkg_resources, sys.modules['pkg_resources'] # reload ok
133 return do_download()
134 except pkg_resources.DistributionNotFound:
135 return do_download()
136
137
138def download_setuptools(
139 version=DEFAULT_VERSION, download_base=DEFAULT_URL, to_dir=os.curdir,
140 delay=15
141):
142 """Download setuptools from a specified location and return its filename
143
144 `version` should be a valid setuptools version number that is available
145 as an egg for download under the `download_base` URL (which should end
146 with a '/'). `to_dir` is the directory where the egg will be downloaded.
147 `delay` is the number of seconds to pause before an actual download
148 attempt.
149 """
150 import urllib2
151 import shutil
152 egg_name = "setuptools-%s-py%s.egg" % (version, sys.version[:3])
153 url = download_base + egg_name
154 saveto = os.path.join(to_dir, egg_name)
155 src = dst = None
156 if not os.path.exists(saveto): # Avoid repeated downloads
157 try:
158 from distutils import log
159 if delay:
160 log.warn("""
161---------------------------------------------------------------------------
162This script requires setuptools version %s to run (even to display
163help). I will attempt to download it for you (from
164%s), but
165you may need to enable firewall access for this script first.
166I will start the download in %d seconds.
167
168(Note: if this machine does not have network access, please obtain the file
169
170 %s
171
172and place it in this directory before rerunning this script.)
173---------------------------------------------------------------------------""",
174 version, download_base, delay, url
175 )
176 from time import sleep
177 sleep(delay)
178 log.warn("Downloading %s", url)
179 src = urllib2.urlopen(url)
180 # Read/write all in one block, so we don't create a corrupt file
181 # if the download is interrupted.
182 data = _validate_md5(egg_name, src.read())
183 dst = open(saveto, "wb")
184 dst.write(data)
185 finally:
186 if src:
187 src.close()
188 if dst:
189 dst.close()
190 return os.path.realpath(saveto)
191
192
193def main(argv, version=DEFAULT_VERSION):
194 """Install or upgrade setuptools and EasyInstall"""
195 try:
196 import setuptools
197 except ImportError:
198 egg = None
199 try:
200 egg = download_setuptools(version, delay=0)
201 sys.path.insert(0, egg)
202 from setuptools.command.easy_install import main
203 return main(list(argv) + [egg]) # we're done here
204 finally:
205 if egg and os.path.exists(egg):
206 os.unlink(egg)
207 else:
208 if setuptools.__version__ == '0.0.1':
209 print >>sys.stderr, (
210 "You have an obsolete version of setuptools installed. Please\n"
211 "remove it from your system entirely before rerunning this script."
212 )
213 sys.exit(2)
214
215 req = "setuptools>=" + version
216 import pkg_resources
217 try:
218 pkg_resources.require(req)
219 except pkg_resources.VersionConflict:
220 try:
221 from setuptools.command.easy_install import main
222 except ImportError:
223 from easy_install import main
224 main(list(argv) + [download_setuptools(delay=0)])
225 sys.exit(0) # try to force an exit
226 else:
227 if argv:
228 from setuptools.command.easy_install import main
229 main(argv)
230 else:
231 print "Setuptools version", version, "or greater has been " \
232 + "installed."
233 print '(Run "ez_setup.py -U setuptools" to reinstall or upgrade.)'
234
235
236def update_md5(filenames):
237 """Update our built-in md5 registry"""
238
239 import re
240
241 for name in filenames:
242 base = os.path.basename(name)
243 f = open(name, 'rb')
244 md5_data[base] = md5(f.read()).hexdigest()
245 f.close()
246
247 data = [" %r: %r,\n" % it for it in md5_data.items()]
248 data.sort()
249 repl = "".join(data)
250
251 import inspect
252 srcfile = inspect.getsourcefile(sys.modules[__name__])
253 f = open(srcfile, 'rb')
254 src = f.read()
255 f.close()
256
257 match = re.search("\nmd5_data = {\n([^}]+)}", src)
258 if not match:
259 print >>sys.stderr, "Internal error!"
260 sys.exit(2)
261
262 src = src[:match.start(1)] + repl + src[match.end(1):]
263 f = open(srcfile, 'w')
264 f.write(src)
265 f.close()
266
267
268if __name__ == '__main__':
269 if len(sys.argv) > 2 and sys.argv[1] == '--md5update':
270 update_md5(sys.argv[2:])
271 else:
272 main(sys.argv[1:])
2730
=== modified file 'helpers/python/charmhelpers/tests/test_charmhelpers.py'
--- helpers/python/charmhelpers/tests/test_charmhelpers.py 2013-08-21 04:04:39 +0000
+++ helpers/python/charmhelpers/tests/test_charmhelpers.py 2015-08-31 19:32:56 +0000
@@ -3,7 +3,7 @@
 import unittest
 import yaml
 
-from simplejson import dumps
+from json import dumps
 from StringIO import StringIO
 from testtools import TestCase
 
 
=== modified file 'requirements.txt'
--- requirements.txt 2015-08-24 16:35:22 +0000
+++ requirements.txt 2015-08-31 19:32:56 +0000
@@ -1,17 +1,22 @@
 PyYAML==3.11
+blessings==1.6
+bundletester==0.5.2
+bzr>=2.6.0
+charmworldlib>=0.4.2
+coverage==3.7.1
+flake8==1.6.2
+httplib2==0.7.7
+juju-deployer==0.4.3
+jujubundlelib>=0.1.9
+jujuclient==0.50.1
 launchpadlib==1.10.2
+mock==1.0.1
 nose==1.2.1
-requests==1.1.0
-lazr.authentication==0.1.2
-lazr.restfulclient==0.13.1
-lazr.uri==1.0.3
-simplejson==2.2.1
-wadllib==1.3.1
-httplib2==0.7.7
 oauth==1.0.1
-flake8==1.6.2
+otherstuf==1.1.0
-mock==1.0.1
+path.py==7.4
-coverage==3.7.1
+pathspec==0.3.3
-charmworldlib>=0.3.0
+pip>=7.1.2
-bzr>=2.6.0
+requests==2.7.0
-jujubundlelib>=0.1.9
+responses==0.4.0
+ruamel.yaml==0.10.2
 
=== added directory 'scripts'
=== removed directory 'scripts'
=== added file 'scripts/packages.sh'
--- scripts/packages.sh 1970-01-01 00:00:00 +0000
+++ scripts/packages.sh 2015-08-31 19:32:56 +0000
@@ -0,0 +1,19 @@
1#!/bin/bash
2
3function apt_install() {
4 packages=$@
5 missing=()
6 for p in $packages; do
7 if ! dpkg-query -s $p &> /dev/null; then
8 missing+=($p)
9 fi
10 done
11 if [ ${#missing[@]} -gt 0 ]; then
12 sudo apt-get update
13 sudo apt-get install -y "${missing[@]}"
14 return 1
15 fi
16 return 0
17}
18apt_install "$@"
19exit $?
020
=== removed file 'scripts/test'
--- scripts/test 2014-05-27 21:23:41 +0000
+++ scripts/test 1970-01-01 00:00:00 +0000
@@ -1,7 +0,0 @@
1#!/bin/bash
2# If the --pdb switch is passed, inject --pdb-failures too.
3if [[ $* =~ --pdb( .*|$) ]]
4then
5 extra_args="--pdb-failures"
6fi
7bin/nosetests --exe --with-id $extra_args $@
80
=== added file 'setup.cfg'
--- setup.cfg 1970-01-01 00:00:00 +0000
+++ setup.cfg 2015-08-31 19:32:56 +0000
@@ -0,0 +1,8 @@
1[nosetests]
2verbosity=1
3detailed-errors=1
4#pdb=1
5#pdb-failures=1
6logging-level=INFO
7
8
09
=== modified file 'setup.py'
--- setup.py 2015-08-24 16:35:22 +0000
+++ setup.py 2015-08-31 19:32:56 +0000
@@ -3,11 +3,6 @@
 # Copyright 2012 Canonical Ltd. This software is licensed under the
 # GNU General Public License version 3 (see the file LICENSE).
 
-import ez_setup
-
-
-ez_setup.use_setuptools()
-
 from setuptools import setup, find_packages
 
 
@@ -18,7 +13,9 @@
         exclude=["*.tests", "*.tests.*", "tests.*", "tests"]),
     install_requires=['launchpadlib', 'argparse', 'cheetah', 'pyyaml',
                       'pycrypto', 'paramiko', 'bzr', 'requests',
-                      'charmworldlib', 'jujubundlelib'],
+                      'charmworldlib', 'blessings', 'ruamel.yaml',
+                      'pathspec', 'bundletester', 'otherstuf', "path.py",
+                      "jujubundlelib"],
     include_package_data=True,
     maintainer='Marco Ceppi',
     maintainer_email='marco@ceppi.net',
@@ -33,27 +30,30 @@
     entry_points={
         'console_scripts': [
             'charm = charmtools:charm',
-            'juju-charm = charmtools:charm',
-            'juju-bundle = charmtools:bundle',
-            'juju-test = charmtools.test:main',
+            'charm-add = charmtools.generate:main',
+            'charm-compose = charmtools.compose:main',
+            'charm-create = charmtools.create:main',
+            'charm-generate = charmtools.generate:main',
             'charm-get = charmtools.get:main',
             'charm-getall = charmtools.getall:main',
-            'charm-proof = charmtools.proof:main',
-            'charm-create = charmtools.create:main',
+            'charm-help = charmtools.cli:usage',
+            'charm-info = charmtools.info:main',
+            'charm-inspect = charmtools.compose:inspect',
             'charm-list = charmtools.list:main',
             'charm-promulgate = charmtools.promulgate:main',
+            'charm-proof = charmtools.proof:main',
+            'charm-refresh = charmtools.compose:main',
             'charm-review = charmtools.review:main',
             'charm-review-queue = charmtools.review_queue:main',
             'charm-search = charmtools.search:main',
             'charm-subscribers = charmtools.subscribers:main',
+            'charm-test = charmtools.test:main',
             'charm-unpromulgate = charmtools.unpromulgate:main',
             'charm-update = charmtools.update:main',
             'charm-version = charmtools.version:main',
-            'charm-help = charmtools.cli:usage',
-            'charm-test = charmtools.test:main',
-            'charm-info = charmtools.info:main',
-            'charm-generate = charmtools.generate:main',
-            'charm-add = charmtools.generate:main',
+            'juju-bundle = charmtools:bundle',
+            'juju-charm = charmtools:charm',
+            'juju-test = charmtools.test:main',
         ],
         'charmtools.templates': [
             'bash = charmtools.templates.bash:BashCharmTemplate',
 
=== added directory 'tests/interfaces'
=== added directory 'tests/interfaces/mysql'
=== added file 'tests/interfaces/mysql/interface.yaml'
--- tests/interfaces/mysql/interface.yaml 1970-01-01 00:00:00 +0000
+++ tests/interfaces/mysql/interface.yaml 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1name: mysql
02
=== added file 'tests/interfaces/mysql/provides.py'
--- tests/interfaces/mysql/provides.py 1970-01-01 00:00:00 +0000
+++ tests/interfaces/mysql/provides.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1"provides"
02
=== added file 'tests/interfaces/mysql/requires.py'
--- tests/interfaces/mysql/requires.py 1970-01-01 00:00:00 +0000
+++ tests/interfaces/mysql/requires.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1"requires"
02
=== modified file 'tests/test_charm_generate.py'
--- tests/test_charm_generate.py 2014-11-05 22:09:11 +0000
+++ tests/test_charm_generate.py 2015-08-31 19:32:56 +0000
@@ -24,13 +24,6 @@
                          'provides': 'nointerface'})
 
     @patch('charmtools.generate.Charm')
-    @patch('charmtools.generate.shutil')
-    def test_copy_file(self, msh, mcharm):
-        m = mcharm.return_value.is_charm.return_value = True
-        copy_file('1.ex', '/tmp')
-        msh.copy.assert_called()
-
-    @patch('charmtools.generate.Charm')
     def test_not_charm(self, mcharm):
         mcharm.return_value.is_charm.return_value = False
         self.assertRaises(Exception, copy_file, '1.ex', '/no-charm')
 
=== added file 'tests/test_compose.py'
--- tests/test_compose.py 1970-01-01 00:00:00 +0000
+++ tests/test_compose.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,239 @@
1from charmtools import compose
2from charmtools import utils
3from path import path
4from ruamel import yaml
5import json
6import logging
7import mock
8import os
9import pkg_resources
10import responses
11import unittest
12
13
14class TestCompose(unittest.TestCase):
15 def setUp(self):
16 dirname = pkg_resources.resource_filename(__name__, "")
17 os.environ["COMPOSER_PATH"] = path(dirname)
18 os.environ["INTERFACE_PATH"] = path(dirname) / "interfaces"
19 path("out").rmtree_p()
20
21 def tearDown(self):
22 path("out").rmtree_p()
23
24 def test_tester_compose(self):
25 composer = compose.Composer()
26 composer.log_level = "WARNING"
27 composer.output_dir = "out"
28 composer.series = "trusty"
29 composer.name = "foo"
30 composer.charm = "trusty/tester"
31 composer()
32 base = path('out/trusty/foo')
33 self.assertTrue(base.exists())
34
35 # Verify ignore rules applied
36 self.assertFalse((base / ".bzr").exists())
37
38 # Metadata should have combined provides fields
39 metadata = base / "metadata.yaml"
40 self.assertTrue(metadata.exists())
41 metadata_data = yaml.load(metadata.open())
42 self.assertIn("shared-db", metadata_data['provides'])
43 self.assertIn("storage", metadata_data['provides'])
44
45 # Config should have keys but not the ones in deletes
46 config = base / "config.yaml"
47 self.assertTrue(config.exists())
48 config_data = yaml.load(config.open())['options']
49 self.assertIn("bind-address", config_data)
50 self.assertNotIn("vip", config_data)
51
52 cyaml = base / "composer.yaml"
53 self.assertTrue(cyaml.exists())
54 cyaml_data = yaml.load(cyaml.open())
55 self.assertEquals(cyaml_data['includes'], ['trusty/mysql'])
56 self.assertEquals(cyaml_data['is'], 'foo')
57
58 self.assertTrue((base / "hooks/config-changed").exists())
59
60 # Files from the top layer as overrides
61 start = base / "hooks/start"
62 self.assertTrue(start.exists())
63 self.assertIn("Overridden", start.text())
64
65 self.assertTrue((base / "README.md").exists())
66 self.assertEqual("dynamic tactics", (base / "README.md").text())
67
68 sigs = base / ".composer.manifest"
69 self.assertTrue(sigs.exists())
70 data = json.load(sigs.open())
71 self.assertEquals(data['signatures']["README.md"], [
72 u'foo',
73 "static",
74 u'cfac20374288c097975e9f25a0d7c81783acdbc81'
75 '24302ff4a731a4aea10de99'])
76
77 self.assertEquals(data["signatures"]['metadata.yaml'], [
78 u'foo',
79 "dynamic",
80 u'8dd9059eae849c61a1bd3d8de7f96a418e'
81 u'f8b4bf5d9c058c413b5169e2783815',
82 ])
83
84 def test_regenerate_inplace(self):
85 # take a generated example where a base layer has changed
86 # regenerate in place
87 # make some assertions
88 composer = compose.Composer()
89 composer.log_level = "WARNING"
90 composer.output_dir = "out"
91 composer.series = "trusty"
92 composer.name = "foo"
93 composer.charm = "trusty/b"
94 composer()
95 base = path('out/trusty/foo')
96 self.assertTrue(base.exists())
97
98 # verify the 1st gen worked
99 self.assertTrue((base / "a").exists())
100 self.assertTrue((base / "README.md").exists())
101
102 # now regenerate from the target
103 with utils.cd("out/trusty/foo"):
104 composer = compose.Composer()
105 composer.log_level = "WARNING"
106 composer.output_dir = path(os.getcwd())
107 composer.series = "trusty"
108 # The generate target and source are now the same
109 composer.name = "foo"
110 composer.charm = "."
111 composer()
112 base = composer.output_dir
113 self.assertTrue(base.exists())
114
115 # Check that the generated composer makes sense
116 cy = base / "composer.yaml"
117 config = yaml.load(cy.open())
118 self.assertEquals(config["includes"], ["trusty/a", "interface:mysql"])
119 self.assertEquals(config["is"], "foo")
120
121 # We can even run it more than once
122 composer()
123 cy = base / "composer.yaml"
124 config = yaml.load(cy.open())
125 self.assertEquals(config["includes"], ["trusty/a", "interface:mysql"])
126 self.assertEquals(config["is"], "foo")
127
128 # We included an interface, we should be able to assert things about it
129 # in its final form as well
130 provides = base / "hooks/relations/mysql/provides.py"
131 requires = base / "hooks/relations/mysql/requires.py"
132 self.assertTrue(provides.exists())
133 self.assertTrue(requires.exists())
134
135 # and that we generated the hooks themselves
136 for kind in ["joined", "changed", "broken", "departed"]:
137 self.assertTrue((base / "hooks" /
138 "mysql-relation-{}".format(kind)).exists())
139
140 # and ensure we have an init file (the interface doesn't provide one, so it's added)
141 init = base / "hooks/relations/mysql/__init__.py"
142 self.assertTrue(init.exists())
143
144 @responses.activate
145 def test_remote_interface(self):
146 # XXX: this test does pull the git repo in the response
147 responses.add(responses.GET,
148 "http://interfaces.juju.solutions/api/v1/interface/pgsql/",
149 body='''{
150 "id": "pgsql",
151 "name": "pgsql4",
152 "repo":
153 "https://github.com/bcsaller/juju-relation-pgsql.git",
154 "_id": {
155 "$oid": "55a471959c1d246feae487e5"
156 },
157 "version": 1
158 }''',
159 content_type="application/json")
160 composer = compose.Composer()
161 composer.log_level = "WARNING"
162 composer.output_dir = "out"
163 composer.series = "trusty"
164 composer.name = "foo"
165 composer.charm = "trusty/c-reactive"
166 composer()
167 base = path('out/trusty/foo')
168 self.assertTrue(base.exists())
169
170 # basics
171 self.assertTrue((base / "a").exists())
172 self.assertTrue((base / "README.md").exists())
173 # show that we pulled the interface from github
174 init = base / "hooks/relations/pgsql/__init__.py"
175 self.assertTrue(init.exists())
176 main = base / "hooks/reactive/main.py"
177 self.assertTrue(main.exists())
178
179 @mock.patch("charmtools.utils.Process")
180 @responses.activate
181 def test_remote_layer(self, mcall):
182 # XXX: this test does pull the git repo in the response
183 responses.add(responses.GET,
184 "http://interfaces.juju.solutions/api/v1/layer/basic/",
185 body='''{
186 "id": "basic",
187 "name": "basic",
188 "repo":
189 "https://git.launchpad.net/~bcsaller/charms/+source/basic",
190 "_id": {
191 "$oid": "55a471959c1d246feae487e5"
192 },
193 "version": 1
194 }''',
195 content_type="application/json")
196 composer = compose.Composer()
197 composer.log_level = "WARNING"
198 composer.output_dir = "out"
199 composer.series = "trusty"
200 composer.name = "foo"
201 composer.charm = "trusty/use-layers"
202 # remove the sign phase
203 composer.PHASES = composer.PHASES[:-2]
204
205 composer()
206 base = path('out/trusty/foo')
207 self.assertTrue(base.exists())
208
209 # basics
210 self.assertTrue((base / "README.md").exists())
211
212 # show that we pulled charmhelpers from the basic layer as well
213 mcall.assert_called_with(("pip", "install", "-U",
214 '--exists-action', 'i',
215 "-t", mock.ANY,
216 mock.ANY))
217
218
219 @mock.patch("charmtools.utils.Process")
220 def test_pypi_installer(self, mcall):
221 composer = compose.Composer()
222 composer.log_level = "WARN"
223 composer.output_dir = "out"
224 composer.series = "trusty"
225 composer.name = "foo"
226 composer.charm = "trusty/chlayer"
227
228 # remove the sign phase
229 composer.PHASES = composer.PHASES[:-2]
230 composer()
231 mcall.assert_called_with(("pip", "install", "-U",
232 '--exists-action', 'i',
233 "-t", mock.ANY,
234 "charmhelpers"))
235
236
237if __name__ == '__main__':
238 logging.basicConfig()
239 unittest.main()
0240
=== added file 'tests/test_config.py'
--- tests/test_config.py 1970-01-01 00:00:00 +0000
+++ tests/test_config.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,29 @@
1import logging
2import unittest
3
4from charmtools.compose.config import ComposerConfig
5
6
7class TestConfig(unittest.TestCase):
8 def test_rget(self):
9 c = ComposerConfig()
10 c['a'] = 1
11 c = c.new_child()
12 c['a'] = 99
13 c['b'] = "alpha"
14 self.assertEqual(c.get('a'), 99)
15 self.assertEqual(c.get('b'), "alpha")
16 self.assertEqual(c.rget('a'), [99, 1])
17
18 def test_tactics(self):
19 # configure from empty and a layer with tactics
20 c = ComposerConfig()
21 c._tactics = ['a', 'b', 'c']
22 c = c.new_child()
23 c._tactics = ['d', 'c']
24 self.assertEqual(c.tactics()[:5], ['d', 'c', 'a', 'b', 'c'])
25
26
27if __name__ == '__main__':
28 logging.basicConfig()
29 unittest.main()
030
=== modified file 'tests/test_juju_test.py'
--- tests/test_juju_test.py 2014-06-10 21:26:52 +0000
+++ tests/test_juju_test.py 2015-08-31 19:32:56 +0000
@@ -714,15 +714,6 @@
                           call('Failed to grab logs for dummy/0')]
         o.log.warn.assert_has_calls(expected_warns)
 
-    @patch('subprocess.check_output')
-    @patch.object(juju_test.Orchestra, 'print_status')
-    def test_orchestra_perform(self, mprint_status, mcheck_output):
-        args = Arguments(tests='dummy', juju_env='testing', timeout=1)
-        c = juju_test.Conductor(args)
-        o = juju_test.Orchestra(c, 'test/dummy')
-        o.perform()
-        mprint_status.assert_called_once()
-
 
 class TestCfgTest(unittest.TestCase):
     test_config = '''\
 
=== added file 'tests/test_utils.py'
--- tests/test_utils.py 1970-01-01 00:00:00 +0000
+++ tests/test_utils.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,43 @@
1from unittest import TestCase
2from charmtools import utils
3from StringIO import StringIO
4
5
6class TestUtils(TestCase):
7
8 def test_delta_python(self):
9 a = StringIO("""
10 def foo(n):
11 return n * 2
12
13
14 @when('db.ready')
15 def react(db):
16 print db
17 """)
18
19 b = StringIO("""
20 def foo(n):
21 return n * 2
22
23
24 @when('db.ready', 'bar')
25 def react(db):
26 print db
27 """)
28
29 result = StringIO()
30 t = utils.TermWriter(fp=result)
31 rc = utils.delta_python_dump(a, b, utils.REACTIVE_PATTERNS,
32 context=3,
33 term=t,
34 from_name="Alpha",
35 to_name="Beta")
36 # return code here indicates that there was a diff
37 self.assertFalse(rc)
38 result.seek(0)
39 output = result.read()
40 self.assertIn("Alpha", output)
41 self.assertIn("Beta", output)
42 self.assertIn("@when('db.ready'", output)
43 self.assertIn("bar", output)
044
=== added directory 'tests/trusty'
=== added directory 'tests/trusty/a'
=== added file 'tests/trusty/a/README.md'
--- tests/trusty/a/README.md 1970-01-01 00:00:00 +0000
+++ tests/trusty/a/README.md 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1From A
02
=== added file 'tests/trusty/a/a'
--- tests/trusty/a/a 1970-01-01 00:00:00 +0000
+++ tests/trusty/a/a 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1from a
02
=== added directory 'tests/trusty/b'
=== added file 'tests/trusty/b/README.md'
--- tests/trusty/b/README.md 1970-01-01 00:00:00 +0000
+++ tests/trusty/b/README.md 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1This is an overridden readme file
02
=== added file 'tests/trusty/b/composer.yaml'
--- tests/trusty/b/composer.yaml 1970-01-01 00:00:00 +0000
+++ tests/trusty/b/composer.yaml 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1includes: ["trusty/a", "interface:mysql"]
02
=== added file 'tests/trusty/b/metadata.yaml'
--- tests/trusty/b/metadata.yaml 1970-01-01 00:00:00 +0000
+++ tests/trusty/b/metadata.yaml 2015-08-31 19:32:56 +0000
@@ -0,0 +1,11 @@
1name: b
2summary: An imagined extension to the a charm
3maintainer: None
4description: |
5 Test layer b
6categories:
7 - app
8requires:
9 mysql:
10 interface: mysql
11
012
=== added directory 'tests/trusty/c'
=== added directory 'tests/trusty/c-reactive'
=== added file 'tests/trusty/c-reactive/README.md'
--- tests/trusty/c-reactive/README.md 1970-01-01 00:00:00 +0000
+++ tests/trusty/c-reactive/README.md 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1This is an overridden readme file
02
=== added file 'tests/trusty/c-reactive/composer.yaml'
--- tests/trusty/c-reactive/composer.yaml 1970-01-01 00:00:00 +0000
+++ tests/trusty/c-reactive/composer.yaml 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1includes: ["trusty/c"]
02
=== added directory 'tests/trusty/c-reactive/hooks'
=== added directory 'tests/trusty/c-reactive/hooks/reactive'
=== added file 'tests/trusty/c-reactive/hooks/reactive/main.py'
--- tests/trusty/c-reactive/hooks/reactive/main.py 1970-01-01 00:00:00 +0000
+++ tests/trusty/c-reactive/hooks/reactive/main.py 2015-08-31 19:32:56 +0000
@@ -0,0 +1,6 @@
1from charmhelpers.core.reactive import when
2from charmhelpers.core import hookenv
3
4@when('db.database.available')
5def pretend_we_have_db(pgsql):
6 hookenv.log("Got db: %s:%s" % (pgsql.host(), pgsql.database()))
07
=== added file 'tests/trusty/c/README.md'
--- tests/trusty/c/README.md 1970-01-01 00:00:00 +0000
+++ tests/trusty/c/README.md 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1This is an overridden readme file
02
=== added file 'tests/trusty/c/composer.yaml'
--- tests/trusty/c/composer.yaml 1970-01-01 00:00:00 +0000
+++ tests/trusty/c/composer.yaml 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1includes: ["trusty/a", "interface:pgsql"]
02
=== added file 'tests/trusty/c/metadata.yaml'
--- tests/trusty/c/metadata.yaml 1970-01-01 00:00:00 +0000
+++ tests/trusty/c/metadata.yaml 2015-08-31 19:32:56 +0000
@@ -0,0 +1,11 @@
1name: c
2summary: An imagined extension to the a charm
3maintainer: None
4description: |
5 Test layer c
6categories:
7 - app
8requires:
9 db:
10 interface: pgsql
11
012
=== added directory 'tests/trusty/chlayer'
=== added directory 'tests/trusty/chlayer/hooks'
=== added file 'tests/trusty/chlayer/hooks/charmhelpers.pypi'
--- tests/trusty/chlayer/hooks/charmhelpers.pypi 1970-01-01 00:00:00 +0000
+++ tests/trusty/chlayer/hooks/charmhelpers.pypi 2015-08-31 19:32:56 +0000
@@ -0,0 +1,1 @@
1charmhelpers
02
=== added directory 'tests/trusty/mysql'
=== added file 'tests/trusty/mysql/.bzrignore'
--- tests/trusty/mysql/.bzrignore 1970-01-01 00:00:00 +0000
+++ tests/trusty/mysql/.bzrignore 2015-08-31 19:32:56 +0000
@@ -0,0 +1,2 @@
1bin/
2.venv
03
=== added file 'tests/trusty/mysql/Makefile'
--- tests/trusty/mysql/Makefile 1970-01-01 00:00:00 +0000
+++ tests/trusty/mysql/Makefile 2015-08-31 19:32:56 +0000
@@ -0,0 +1,24 @@
1#!/usr/bin/make
2PYTHON := /usr/bin/env python
3export PYTHONPATH := hooks
4
5virtualenv:
6 virtualenv .venv
7 .venv/bin/pip install flake8 nose mock six
8
9lint: virtualenv
10 .venv/bin/flake8 --exclude hooks/charmhelpers hooks
11 @charm proof
12
13test: virtualenv
14 @echo Starting tests...
15 @sudo apt-get install python-six
16 @.venv/bin/nosetests --nologcapture unit_tests
17
18bin/charm_helpers_sync.py:
19 @mkdir -p bin
20 @bzr cat lp:charm-helpers/tools/charm_helpers_sync/charm_helpers_sync.py \
21 > bin/charm_helpers_sync.py
22
23sync: bin/charm_helpers_sync.py
24 $(PYTHON) bin/charm_helpers_sync.py -c charm-helpers.yaml
025
=== added file 'tests/trusty/mysql/README.md'
--- tests/trusty/mysql/README.md 1970-01-01 00:00:00 +0000
+++ tests/trusty/mysql/README.md 2015-08-31 19:32:56 +0000
@@ -0,0 +1,133 @@
1# Overview
2
3[MySQL](http://www.mysql.com) is a fast, stable and true multi-user, multi-threaded SQL database server. SQL (Structured Query Language) is the most popular database query language in the world. The main goals of MySQL are speed, robustness and ease of use.
4
5This charm can also deploy [Percona Server](http://www.percona.com/software/percona-server), a fork of MySQL by Percona Inc. that focuses on maximizing performance, particularly for heavy workloads. It is a drop-in replacement for MySQL and features XtraDB, a drop-in replacement for the InnoDB storage engine.
6
7# Usage
8
9## General Usage
10
11To deploy a MySQL service:
12
13 juju deploy mysql
14
15Once deployed, you can retrieve the MySQL root user password by logging in to the machine via `juju ssh` and reading the `/var/lib/mysql/mysql.passwd` file. To log in as the MySQL root user at the MySQL console you can issue the following:
16
17 juju ssh mysql/0
18 mysql -u root -p`sudo cat /var/lib/mysql/mysql.passwd`
19
20## Backups
21
22The charm supports simple backups. To enable them, set the `backup_schedule` option. Optionally you can override the default `backup_dir` and/or `backup_retention_count`:
23
24 juju set mysql backup_schedule="45 5 * * *" # cron formatted schedule
25 juju set mysql backup_dir="/mnt/backup"
26 juju set mysql backup_retention_count=28
27
28# Scale Out Usage
29
30## Replication
31
32MySQL supports the ability to replicate databases to slave instances. This
33allows you, for example, to load balance read queries across multiple slaves or
34use a slave to perform backups, all whilst not impeding the master's
35performance.
36
37To deploy a slave:
38
39 # deploy second service
40 juju deploy mysql mysql-slave
41
The diff has been truncated for viewing.
