Merge ~chad.smith/cloud-init:analyze into cloud-init:master
Status: Merged
Merge reported by: Chad Smith
Merged at revision: 1de597f3d5341eddb4af7ec555e15897e7579e83
Proposed branch: ~chad.smith/cloud-init:analyze
Merge into: cloud-init:master
Diff against target: 1141 lines (+995/-33), 10 files modified
  Makefile (+1/-1)
  cloudinit/analyze/__init__.py (+0/-0)
  cloudinit/analyze/__main__.py (+155/-0)
  cloudinit/analyze/dump.py (+176/-0)
  cloudinit/analyze/show.py (+207/-0)
  cloudinit/analyze/tests/test_dump.py (+210/-0)
  cloudinit/cmd/main.py (+15/-29)
  doc/rtd/index.rst (+1/-0)
  doc/rtd/topics/debugging.rst (+146/-0)
  tests/unittests/test_cli.py (+84/-3)
Related bugs:
Reviewer: Server Team CI bot (continuous-integration) -- Approve
Reviewer: Scott Moser -- Approve
Reviewer: Chad Smith -- Approve
Reviewer: Ryan Harper -- Needs Fixing
Review via email: mp+328819@code.launchpad.net
Commit message
Description of the change
tools: Add tooling for basic cloud-init performance analysis.
This branch adds cloudinit-analyze into cloud-init proper. It adds an "analyze" subcommand to the cloud-init command line utility for quick performance assessment of cloud-init stages and events.
On a cloud-init configured instance, running "cloud-init analyze blame" will now report which cloud-init events cost the most wall time. This allows for quick assessment of the most costly stages of cloud-init.
This functionality is pulled from Ryan Harper's analyze work.
The cloudinit-analyze main script itself has been refactored a bit for inclusion as a subcommand of the cloud-init CLI. There will be a followup branch at some point which will optionally instrument detailed strace profiling, but that approach needs a bit more discussion first.
This branch also adds:
* an additional debugging topic in the sphinx-generated docs describing cloud-init analyze, dump and show, as well as cloud-init single usage.
* an update to the Makefile unittest target to include the cloudinit directory, because we now have unittests within that package.
LP: #1709761
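To illustrate the idea behind the blame subcommand, here is a small sketch. The event names and deltas below are hypothetical; the real subcommand builds its records by parsing /var/log/cloud-init.log.

```python
# Sketch of the "blame" idea: sort parsed cloud-init events by wall time,
# longest first. Event names and deltas here are made up for illustration.
events = [
    {"name": "init-local/search-NoCloud", "delta": 30.210},
    {"name": "init-network/consume-user-data", "delta": 8.706},
    {"name": "modules-config/config-ssh", "delta": 0.166},
]

def blame(events):
    """Return '  <delta>s (<name>)' lines, most costly event first."""
    ordered = sorted(events, key=lambda e: e["delta"], reverse=True)
    return ["  %08.5fs (%s)" % (e["delta"], e["name"]) for e in ordered]

for line in blame(events):
    print(line)
```

The actual subcommand reports per-event records grouped by boot record, but the core of it is this descending sort on each record's 'delta'.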
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:cf8e6825a2d
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Ryan Harper (raharper) wrote:
Thanks for getting this started.
Ryan Harper (raharper) wrote:
Took the integrated version for a test drive; here are the results:
http://
tl;dr: dump_events() needs fixing so it stops raising an exception when both sources are passed; it's OK to accept both, and the code prefers the rawdata over reading from the input file handle.
Also noted that the initial experience should include a quick one-line summary, like systemd-analyze does.
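The fix Ryan describes can be sketched as follows. This is a simplified stand-in for dump_events(), not the branch's actual code: accept either source and, when both are given, prefer rawdata rather than raising.

```python
import io

def dump_events(cisource=None, rawdata=None):
    # Simplified stand-in: accept either source; when both are passed,
    # rawdata wins instead of raising an exception.
    if not any([cisource, rawdata]):
        raise ValueError('Either cisource or rawdata is required')
    if rawdata:
        data = rawdata.splitlines()
    else:
        data = cisource.readlines()
    return data

# Passing both sources is accepted; the raw string is preferred.
lines = dump_events(cisource=io.StringIO("from-file\n"), rawdata="from-raw")
```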
Ryan Harper (raharper):
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:19b08e84e2f
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:4c65f2c3003
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Chad Smith (chad.smith) wrote:
All comments addressed.
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:4c60df58ed7
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:7228b6bdae0
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:f5f7ebeb13c
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:2fdfab74878
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:fa27fddb663
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:053bae69d83
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:869a360232b
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
FAILED: CentOS 6 & 7: Build & Test
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:202a1eacc67
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
FAILED: CentOS 6 & 7: Build & Test
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:4054cc8a282
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
FAILED: MAAS Compatability Testing
Click here to trigger a rebuild:
https:/
Ryan Harper (raharper):
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:f7504a2a684
https:/
Executed test runs:
SUCCESS: Checkout
FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
FAILED: Continuous integration, rev:f31e269c993
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
FAILED: Ubuntu LTS: Build
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:c810c18c63b
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
SUCCESS: MAAS Compatability Testing
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Ryan Harper (raharper):
Ryan Harper (raharper) wrote:
couple more tweaks to docs
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:ff8b5aabeda
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
SUCCESS: MAAS Compatability Testing
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Scott Moser (smoser) wrote:
please spell check and line-wrap your commit message.
some comments inline.
generally we really don't want to make the default cloud-init run on boot be *slower* because we want to profile it.
Chad Smith (chad.smith) wrote:
> please spell check and line-wrap your commit message.
> some comments inline.
Just reworded a bit of the description
>
> generally we really dont want to make the default cloud-init run on boot be
> *slower* because we're we want to profile it.
Thankfully we don't do any of that; this branch only adds a CLI tool we can call to parse cloud-init.log files.
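As a rough illustration of what that parsing involves — a much-simplified stand-in, since the branch's parse_ci_logline() handles several timestamp and separator formats:

```python
import re

# One real-looking cloud-init.log line (format borrowed from the examples
# in this branch's dump.py).
LINE = ("2017-05-22 18:02:01,088 - util.py[DEBUG]: Cloud-init v. 0.7.9 "
        "running 'init-local' at Mon, 22 May 2017 18:02:01 +0000. "
        "Up 2.0 seconds.")

def parse_start_event(line):
    # Pull out the log timestamp and the stage name from a "running" banner.
    m = re.search(r"^([\d-]+ [\d:,]+) - .*running '([^']+)'", line)
    if not m:
        return None
    return {"timestamp": m.group(1), "name": m.group(2),
            "event_type": "start"}

event = parse_start_event(LINE)
```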
Chad Smith (chad.smith):
Chad Smith (chad.smith):
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:d4ae03007ae
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
SUCCESS: MAAS Compatability Testing
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:6750733930b
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
SUCCESS: MAAS Compatability Testing
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Chad Smith (chad.smith):
Scott Moser (smoser) wrote:
Some responses. Thanks for the conditional loading.
I suggested we drop the docs on the profiling sections as they're not actually present in this merge proposal.
In the future, if those come back, I'd like the prompt to change between systems. I guess I wasn't clear before; your profiling sections show:
# IO tracing on the cloud-image disk and seed device
$ ssh -p 2222 ubuntu@localhost
$ cd /var/lib/
$ sudo zcat vda1.blktrace.* | blkparse -i - -s
I think that it is more clear as:
$ ssh -p 2222 ubuntu@localhost
% cd /var/lib ...
or even
$ ssh -p 2222 ubuntu@localhost
ubuntu@guest$ cd /var/lib/ ...
Ryan Harper (raharper) wrote:
On Fri, Aug 18, 2017 at 4:11 PM, Scott Moser <email address hidden> wrote:
> Review: Needs Information
>
> please spell check and line-wrap your commit message.
> some comments inline.
>
> generally we really dont want to make the default cloud-init run on boot
> be *slower* because we're we want to profile it.
>
>
> Diff comments:
>
> > diff --git a/Makefile b/Makefile
> > index f280911..9e7f4ee 100644
> > --- a/Makefile
> > +++ b/Makefile
> > @@ -48,7 +48,7 @@ pyflakes3:
> > @$(CWD)
> >
> > unittest: clean_pyc
> > - nosetests $(noseopts) tests/unittests
> > + nosetests $(noseopts) tests/unittests cloudinit
>
> mention that in your changelog message.
>
> >
> > unittest3: clean_pyc
> > nosetests3 $(noseopts) tests/unittests
> > diff --git a/cloudinit/
> > index 139e03b..a977f0d 100644
> > --- a/cloudinit/
> > +++ b/cloudinit/
> > @@ -22,6 +22,7 @@ import traceback
> > from cloudinit import patcher
> > patcher.patch() # noqa
> >
> > +from cloudinit.
>
> This import means the boot time run of cloud-init incurs the cost of
> loading those files off disk and generally parsing them.
>
> We had discussed at least at one point hiding this behind 'devel'
> argument. If that was the case, then we could look at the first argument
> and import devel things if necessary.
>
> Ie 'cloud-init init' would not ever import the analyze, but 'cloud-init
> devel' would. As well as ideally, 'cloud-init help' would.
>
While I agree in general with the concept, and I'm not objecting to the
conditional loading, analyze as a module does not
touch any new imports that cloudinit/
(except for the 3 files in the analyze subdir).
If we're worried about import/stat/load cost, then we should attack the rather
large util.py, which is imported in many places,
and that's just one of the modules that main.py imports. Adding
analyze will have no significant impact.
Here's main.py's imports[1]:
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
'cloudinit/
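Ryan's point about import cost is easy to check empirically. One rough way to do so — timing a fresh import of an arbitrary stdlib module here, not cloud-init itself:

```python
import importlib
import sys
import time

def time_import(name):
    # Drop any cached copy so the import actually does the load work,
    # then time a fresh import of the module.
    sys.modules.pop(name, None)
    start = time.perf_counter()
    importlib.import_module(name)
    return time.perf_counter() - start

elapsed = time_import("json")
print("importing json took %.6f seconds" % elapsed)
```

Numbers vary by machine and by what is already cached; the point is that one can measure a module's import cost before deciding it needs conditional loading.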
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:6c5ff9b2ca5
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
SUCCESS: MAAS Compatability Testing
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Scott Moser (smoser) wrote:
My 2 comments/requests:
a.) un-change your 'iff' change (if and only if).
b.) in your commit message drop the link to the rharper branch that
might not exist at some point in the future.
Instead just say "from Ryan Harper's work" or something.
Scott Moser (smoser) wrote:
With those 2 changes I approve.
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:f39496a87db
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
SUCCESS: MAAS Compatability Testing
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote:
PASSED: Continuous integration, rev:1de597f3d53
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Unit & Style Tests
SUCCESS: Ubuntu LTS: Build
SUCCESS: Ubuntu LTS: Integration
SUCCESS: MAAS Compatability Testing
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Preview Diff
diff --git a/Makefile b/Makefile
index f280911..9e7f4ee 100644
--- a/Makefile
+++ b/Makefile
@@ -48,7 +48,7 @@ pyflakes3:
@$(CWD)/tools/run-pyflakes3

unittest: clean_pyc
- nosetests $(noseopts) tests/unittests
+ nosetests $(noseopts) tests/unittests cloudinit

unittest3: clean_pyc
nosetests3 $(noseopts) tests/unittests
diff --git a/cloudinit/analyze/__init__.py b/cloudinit/analyze/__init__.py
new file mode 100644
index 0000000..e69de29
--- /dev/null
+++ b/cloudinit/analyze/__init__.py
diff --git a/cloudinit/analyze/__main__.py b/cloudinit/analyze/__main__.py
new file mode 100644
index 0000000..71cba4f
--- /dev/null
+++ b/cloudinit/analyze/__main__.py
@@ -0,0 +1,155 @@
+# Copyright (C) 2017 Canonical Ltd.
+#
+# This file is part of cloud-init. See LICENSE file for license information.
+
+import argparse
+import re
+import sys
+
+from . import dump
+from . import show
+
+
+def get_parser(parser=None):
+ if not parser:
+ parser = argparse.ArgumentParser(
+ prog='cloudinit-analyze',
+ description='Devel tool: Analyze cloud-init logs and data')
+ subparsers = parser.add_subparsers(title='Subcommands', dest='subcommand')
+ subparsers.required = True
+
+ parser_blame = subparsers.add_parser(
+ 'blame', help='Print list of executed stages ordered by time to init')
+ parser_blame.add_argument(
+ '-i', '--infile', action='store', dest='infile',
+ default='/var/log/cloud-init.log',
+ help='specify where to read input.')
+ parser_blame.add_argument(
+ '-o', '--outfile', action='store', dest='outfile', default='-',
+ help='specify where to write output. ')
+ parser_blame.set_defaults(action=('blame', analyze_blame))
+
+ parser_show = subparsers.add_parser(
+ 'show', help='Print list of in-order events during execution')
+ parser_show.add_argument('-f', '--format', action='store',
+ dest='print_format', default='%I%D @%Es +%ds',
+ help='specify formatting of output.')
+ parser_show.add_argument('-i', '--infile', action='store',
+ dest='infile', default='/var/log/cloud-init.log',
+ help='specify where to read input.')
+ parser_show.add_argument('-o', '--outfile', action='store',
+ dest='outfile', default='-',
+ help='specify where to write output.')
+ parser_show.set_defaults(action=('show', analyze_show))
+ parser_dump = subparsers.add_parser(
+ 'dump', help='Dump cloud-init events in JSON format')
+ parser_dump.add_argument('-i', '--infile', action='store',
+ dest='infile', default='/var/log/cloud-init.log',
+ help='specify where to read input. ')
+ parser_dump.add_argument('-o', '--outfile', action='store',
+ dest='outfile', default='-',
+ help='specify where to write output. ')
+ parser_dump.set_defaults(action=('dump', analyze_dump))
+ return parser
+
+
+def analyze_blame(name, args):
+ """Report a list of records sorted by largest time delta.
+
+ For example:
+ 30.210s (init-local) searching for datasource
+ 8.706s (init-network) reading and applying user-data
+ 166ms (modules-config) ....
+ 807us (modules-final) ...
+
+ We generate event records parsing cloud-init logs, formatting the output
+ and sorting by record data ('delta')
+ """
+ (infh, outfh) = configure_io(args)
+ blame_format = ' %ds (%n)'
+ r = re.compile('(^\s+\d+\.\d+)', re.MULTILINE)
+ for idx, record in enumerate(show.show_events(_get_events(infh),
+ blame_format)):
+ srecs = sorted(filter(r.match, record), reverse=True)
+ outfh.write('-- Boot Record %02d --\n' % (idx + 1))
+ outfh.write('\n'.join(srecs) + '\n')
+ outfh.write('\n')
+ outfh.write('%d boot records analyzed\n' % (idx + 1))
+
+
+def analyze_show(name, args):
+ """Generate output records using the 'standard' format to printing events.
+
+ Example output follows:
+ Starting stage: (init-local)
+ ...
+ Finished stage: (init-local) 0.105195 seconds
+
+ Starting stage: (init-network)
+ ...
+ Finished stage: (init-network) 0.339024 seconds
+
+ Starting stage: (modules-config)
+ ...
+ Finished stage: (modules-config) 0.NNN seconds
+
+ Starting stage: (modules-final)
+ ...
+ Finished stage: (modules-final) 0.NNN seconds
+ """
+ (infh, outfh) = configure_io(args)
+ for idx, record in enumerate(show.show_events(_get_events(infh),
+ args.print_format)):
+ outfh.write('-- Boot Record %02d --\n' % (idx + 1))
+ outfh.write('The total time elapsed since completing an event is'
+ ' printed after the "@" character.\n')
+ outfh.write('The time the event takes is printed after the "+" '
+ 'character.\n\n')
+ outfh.write('\n'.join(record) + '\n')
+ outfh.write('%d boot records analyzed\n' % (idx + 1))
+
+
+def analyze_dump(name, args):
+ """Dump cloud-init events in json format"""
+ (infh, outfh) = configure_io(args)
+ outfh.write(dump.json_dumps(_get_events(infh)) + '\n')
+
+
+def _get_events(infile):
+ rawdata = None
+ events, rawdata = show.load_events(infile, None)
+ if not events:
+ events, _ = dump.dump_events(rawdata=rawdata)
+ return events
+
+
+def configure_io(args):
+ """Common parsing and setup of input/output files"""
+ if args.infile == '-':
+ infh = sys.stdin
+ else:
+ try:
+ infh = open(args.infile, 'r')
+ except (FileNotFoundError, PermissionError):
+ sys.stderr.write('Cannot open file %s\n' % args.infile)
+ sys.exit(1)
+
+ if args.outfile == '-':
+ outfh = sys.stdout
+ else:
+ try:
+ outfh = open(args.outfile, 'w')
+ except PermissionError:
+ sys.stderr.write('Cannot open file %s\n' % args.outfile)
+ sys.exit(1)
+
+ return (infh, outfh)
+
+
+if __name__ == '__main__':
+ parser = get_parser()
+ args = parser.parse_args()
+ (name, action_functor) = args.action
+ action_functor(name, args)
+
+# vi: ts=4 expandtab
diff --git a/cloudinit/analyze/dump.py b/cloudinit/analyze/dump.py
new file mode 100644
index 0000000..ca4da49
--- /dev/null
+++ b/cloudinit/analyze/dump.py
@@ -0,0 +1,176 @@
+# This file is part of cloud-init. See LICENSE file for license information.
+
+import calendar
+from datetime import datetime
+import json
+import sys
+
+from cloudinit import util
+
+stage_to_description = {
+ 'finished': 'finished running cloud-init',
+ 'init-local': 'starting search for local datasources',
+ 'init-network': 'searching for network datasources',
+ 'init': 'searching for network datasources',
+ 'modules-config': 'running config modules',
+ 'modules-final': 'finalizing modules',
+ 'modules': 'running modules for',
+ 'single': 'running single module ',
+}
+
+# logger's asctime format
+CLOUD_INIT_ASCTIME_FMT = "%Y-%m-%d %H:%M:%S,%f"
+
+# journctl -o short-precise
+CLOUD_INIT_JOURNALCTL_FMT = "%b %d %H:%M:%S.%f %Y"
+
+# other
+DEFAULT_FMT = "%b %d %H:%M:%S %Y"
+
+
+def parse_timestamp(timestampstr):
+ # default syslog time does not include the current year
+ months = [calendar.month_abbr[m] for m in range(1, 13)]
+ if timestampstr.split()[0] in months:
+ # Aug 29 22:55:26
+ FMT = DEFAULT_FMT
+ if '.' in timestampstr:
+ FMT = CLOUD_INIT_JOURNALCTL_FMT
+ dt = datetime.strptime(timestampstr + " " +
+ str(datetime.now().year),
+ FMT)
+ timestamp = dt.strftime("%s.%f")
+ elif "," in timestampstr:
+ # 2016-09-12 14:39:20,839
+ dt = datetime.strptime(timestampstr, CLOUD_INIT_ASCTIME_FMT)
+ timestamp = dt.strftime("%s.%f")
+ else:
+ # allow date(1) to handle other formats we don't expect
+ timestamp = parse_timestamp_from_date(timestampstr)
+
+ return float(timestamp)
+
+
+def parse_timestamp_from_date(timestampstr):
+ out, _ = util.subp(['date', '+%s.%3N', '-d', timestampstr])
+ timestamp = out.strip()
+ return float(timestamp)
+
+
+def parse_ci_logline(line):
+ # Stage Starts:
+ # Cloud-init v. 0.7.7 running 'init-local' at \
+ # Fri, 02 Sep 2016 19:28:07 +0000. Up 1.0 seconds.
+ # Cloud-init v. 0.7.7 running 'init' at \
+ # Fri, 02 Sep 2016 19:28:08 +0000. Up 2.0 seconds.
+ # Cloud-init v. 0.7.7 finished at
+ # Aug 29 22:55:26 test1 [CLOUDINIT] handlers.py[DEBUG]: \
+ # finish: modules-final: SUCCESS: running modules for final
+ # 2016-08-30T21:53:25.972325+00:00 y1 [CLOUDINIT] handlers.py[DEBUG]: \
+ # finish: modules-final: SUCCESS: running modules for final
+ #
+ # Nov 03 06:51:06.074410 x2 cloud-init[106]: [CLOUDINIT] util.py[DEBUG]: \
+ # Cloud-init v. 0.7.8 running 'init-local' at \
+ # Thu, 03 Nov 2016 06:51:06 +0000. Up 1.0 seconds.
+ #
+ # 2017-05-22 18:02:01,088 - util.py[DEBUG]: Cloud-init v. 0.7.9 running \
+ # 'init-local' at Mon, 22 May 2017 18:02:01 +0000. Up 2.0 seconds.
+
+ separators = [' - ', ' [CLOUDINIT] ']
+ found = False
+ for sep in separators:
+ if sep in line:
+ found = True
+ break
+
+ if not found:
+ return None
+
+ (timehost, eventstr) = line.split(sep)
+
+ # journalctl -o short-precise
+ if timehost.endswith(":"):
+ timehost = " ".join(timehost.split()[0:-1])
+
+ if "," in timehost:
+ timestampstr, extra = timehost.split(",")
+ timestampstr += ",%s" % extra.split()[0]
+ if ' ' in extra:
+ hostname = extra.split()[-1]
+ else:
+ hostname = timehost.split()[-1]
+ timestampstr = timehost.split(hostname)[0].strip()
+ if 'Cloud-init v.' in eventstr:
+ event_type = 'start'
+ if 'running' in eventstr:
+ stage_and_timestamp = eventstr.split('running')[1].lstrip()
+ event_name, _ = stage_and_timestamp.split(' at ')
+ event_name = event_name.replace("'", "").replace(":", "-")
+ if event_name == "init":
+ event_name = "init-network"
+ else:
+ # don't generate a start for the 'finished at' banner
+ return None
+ event_description = stage_to_description[event_name]
+ else:
+ (pymodloglvl, event_type, event_name) = eventstr.split()[0:3]
+ event_description = eventstr.split(event_name)[1].strip()
+
+ event = {
+ 'name': event_name.rstrip(":"),
+ 'description': event_description,
+ 'timestamp': parse_timestamp(timestampstr),
+ 'origin': 'cloudinit',
+ 'event_type': event_type.rstrip(":"),
+ }
+ if event['event_type'] == "finish":
+ result = event_description.split(":")[0]
+ desc = event_description.split(result)[1].lstrip(':').strip()
+ event['result'] = result
+ event['description'] = desc.strip()
+
+ return event
+
+
+def json_dumps(data):
+ return json.dumps(data, indent=1, sort_keys=True,
+ separators=(',', ': '))
+
+
+def dump_events(cisource=None, rawdata=None):
+ events = []
+ event = None
+ CI_EVENT_MATCHES = ['start:', 'finish:', 'Cloud-init v.']
+
+ if not any([cisource, rawdata]):
+ raise ValueError('Either cisource or rawdata parameters are required')
+
+ if rawdata:
+ data = rawdata.splitlines()
+ else:
+ data = cisource.readlines()
+
+ for line in data:
+ for match in CI_EVENT_MATCHES:
+ if match in line:
+ try:
+ event = parse_ci_logline(line)
+ except ValueError:
+ sys.stderr.write('Skipping invalid entry\n')
+ if event:
+ events.append(event)
+
+ return events, data
+
+
+def main():
+ if len(sys.argv) > 1:
+ cisource = open(sys.argv[1])
+ else:
+ cisource = sys.stdin
+
+ return json_dumps(dump_events(cisource))
+
+
+if __name__ == "__main__":
+ print(main())
diff --git a/cloudinit/analyze/show.py b/cloudinit/analyze/show.py
new file mode 100644
index 0000000..3b356bb
--- /dev/null
+++ b/cloudinit/analyze/show.py
@@ -0,0 +1,207 @@
+# Copyright (C) 2016 Canonical Ltd.
+#
+# Author: Ryan Harper <ryan.harper@canonical.com>
+#
+# This file is part of cloud-init. See LICENSE file for license information.
+
+import base64
+import datetime
+import json
+import os
+
+from cloudinit import util
+
+# An event:
+'''
+{
+ "description": "executing late commands",
+ "event_type": "start",
+ "level": "INFO",
+ "name": "cmd-install/stage-late"
+ "origin": "cloudinit",
+ "timestamp": 1461164249.1590767,
+},
+
+ {
+ "description": "executing late commands",
+ "event_type": "finish",
+ "level": "INFO",
+ "name": "cmd-install/stage-late",
+ "origin": "cloudinit",
+ "result": "SUCCESS",
+ "timestamp": 1461164249.1590767
+ }
+
+'''
+format_key = {
+ '%d': 'delta',
+ '%D': 'description',
+ '%E': 'elapsed',
+ '%e': 'event_type',
+ '%I': 'indent',
+ '%l': 'level',
+ '%n': 'name',
+ '%o': 'origin',
+ '%r': 'result',
+ '%t': 'timestamp',
+ '%T': 'total_time',
+}
+
+formatting_help = " ".join(["{0}: {1}".format(k.replace('%', '%%'), v)
+ for k, v in format_key.items()])
+
+
+def format_record(msg, event):
+ for i, j in format_key.items():
+ if i in msg:
+ # ensure consistent formatting of time values
+ if j in ['delta', 'elapsed', 'timestamp']:
+ msg = msg.replace(i, "{%s:08.5f}" % j)
+ else:
+ msg = msg.replace(i, "{%s}" % j)
+ return msg.format(**event)
+
+
+def dump_event_files(event):
+ content = dict((k, v) for k, v in event.items() if k not in ['content'])
+ files = content['files']
+ saved = []
+ for f in files:
+ fname = f['path']
+ fn_local = os.path.basename(fname)
+ fcontent = base64.b64decode(f['content']).decode('ascii')
+ util.write_file(fn_local, fcontent)
+ saved.append(fn_local)
+
+ return saved
+
+
+def event_name(event):
+ if event:
+ return event.get('name')
+ return None
+
+
+def event_type(event):
+ if event:
+ return event.get('event_type')
+ return None
+
+
+def event_parent(event):
+ if event:
+ return event_name(event).split("/")[0]
+ return None
+
+
+def event_timestamp(event):
+ return float(event.get('timestamp'))
+
+
+def event_datetime(event):
+ return datetime.datetime.utcfromtimestamp(event_timestamp(event))
+
+
+def delta_seconds(t1, t2):
+ return (t2 - t1).total_seconds()
+
+
+def event_duration(start, finish):
+ return delta_seconds(event_datetime(start), event_datetime(finish))
+
+
+def event_record(start_time, start, finish):
+ record = finish.copy()
+ record.update({
+ 'delta': event_duration(start, finish),
+ 'elapsed': delta_seconds(start_time, event_datetime(start)),
+ 'indent': '|' + ' ' * (event_name(start).count('/') - 1) + '`->',
+ })
+
+ return record
+
+
+def total_time_record(total_time):
+ return 'Total Time: %3.5f seconds\n' % total_time
+
+
+def generate_records(events, blame_sort=False,
+ print_format="(%n) %d seconds in %I%D",
+ dump_files=False, log_datafiles=False):
+
+ sorted_events = sorted(events, key=lambda x: x['timestamp'])
+ records = []
501 | + start_time = None |
502 | + total_time = 0.0 |
503 | + stage_start_time = {} |
504 | + stages_seen = [] |
505 | + boot_records = [] |
506 | + |
507 | + unprocessed = [] |
508 | + for e in range(0, len(sorted_events)): |
509 | + event = events[e] |
510 | + try: |
511 | + next_evt = events[e + 1] |
512 | + except IndexError: |
513 | + next_evt = None |
514 | + |
515 | + if event_type(event) == 'start': |
516 | + if event.get('name') in stages_seen: |
517 | + records.append(total_time_record(total_time)) |
518 | + boot_records.append(records) |
519 | + records = [] |
520 | + start_time = None |
521 | + total_time = 0.0 |
522 | + |
523 | + if start_time is None: |
524 | + stages_seen = [] |
525 | + start_time = event_datetime(event) |
526 | + stage_start_time[event_parent(event)] = start_time |
527 | + |
528 | + # see if we have a pair |
529 | + if event_name(event) == event_name(next_evt): |
530 | + if event_type(next_evt) == 'finish': |
531 | + records.append(format_record(print_format, |
532 | + event_record(start_time, |
533 | + event, |
534 | + next_evt))) |
535 | + else: |
536 | + # This is a parent event |
537 | + records.append("Starting stage: %s" % event.get('name')) |
538 | + unprocessed.append(event) |
539 | + stages_seen.append(event.get('name')) |
540 | + continue |
541 | + else: |
542 | + prev_evt = unprocessed.pop() |
543 | + if event_name(event) == event_name(prev_evt): |
544 | + record = event_record(start_time, prev_evt, event) |
545 | + records.append(format_record("Finished stage: " |
546 | + "(%n) %d seconds ", |
547 | + record) + "\n") |
548 | + total_time += record.get('delta') |
549 | + else: |
550 | + # not a match, put it back |
551 | + unprocessed.append(prev_evt) |
552 | + |
553 | + records.append(total_time_record(total_time)) |
554 | + boot_records.append(records) |
555 | + return boot_records |
556 | + |
557 | + |
558 | +def show_events(events, print_format): |
559 | + return generate_records(events, print_format=print_format) |
560 | + |
561 | + |
562 | +def load_events(infile, rawdata=None): |
563 | + if rawdata: |
564 | + data = rawdata.read() |
565 | + else: |
566 | + data = infile.read() |
567 | + |
568 | + j = None |
569 | + try: |
570 | + j = json.loads(data) |
571 | + except ValueError: # JSONDecodeError is py3.5+ only; it subclasses ValueError |
572 | + pass |
573 | + |
574 | + return j, data |
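The stage bookkeeping in generate_records above is intricate; the core idea is a stack-based pairing of 'start' and 'finish' events by name. A minimal sketch of that pairing idea (simplified, with a hypothetical `pair_events` helper — not the module's actual API):

```python
# Minimal sketch: pair 'start'/'finish' events by name using a stack of
# unprocessed starts, emitting (name, delta) records. Simplified from the
# generate_records logic above; edge cases (multiple boots, stages) omitted.
def pair_events(events):
    unprocessed = []  # stack of start events awaiting a matching finish
    records = []
    for event in sorted(events, key=lambda e: e['timestamp']):
        if event['event_type'] == 'start':
            unprocessed.append(event)
        else:  # 'finish': try to match the most recent unmatched start
            start = unprocessed.pop()
            if start['name'] == event['name']:
                records.append(
                    (event['name'], event['timestamp'] - start['timestamp']))
            else:
                unprocessed.append(start)  # not a match, put it back
    return records


events = [
    {'name': 'init-local', 'event_type': 'start', 'timestamp': 1.0},
    {'name': 'init-local', 'event_type': 'finish', 'timestamp': 3.5},
]
print(pair_events(events))  # → [('init-local', 2.5)]
```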
575 | diff --git a/cloudinit/analyze/tests/test_dump.py b/cloudinit/analyze/tests/test_dump.py |
576 | new file mode 100644 |
577 | index 0000000..2c0885d |
578 | --- /dev/null |
579 | +++ b/cloudinit/analyze/tests/test_dump.py |
580 | @@ -0,0 +1,210 @@ |
581 | +# This file is part of cloud-init. See LICENSE file for license information. |
582 | + |
583 | +from datetime import datetime |
584 | +from textwrap import dedent |
585 | + |
586 | +from cloudinit.analyze.dump import ( |
587 | + dump_events, parse_ci_logline, parse_timestamp) |
588 | +from cloudinit.util import subp, write_file |
589 | +from tests.unittests.helpers import CiTestCase |
590 | + |
591 | + |
592 | +class TestParseTimestamp(CiTestCase): |
593 | + |
594 | + def test_parse_timestamp_handles_cloud_init_default_format(self): |
595 | + """Logs with cloud-init detailed formats will be properly parsed.""" |
596 | + trusty_fmt = '%Y-%m-%d %H:%M:%S,%f' |
597 | + trusty_stamp = '2016-09-12 14:39:20,839' |
598 | + |
599 | + parsed = parse_timestamp(trusty_stamp) |
600 | + |
601 | + # convert ourselves |
602 | + dt = datetime.strptime(trusty_stamp, trusty_fmt) |
603 | + expected = float(dt.strftime('%s.%f')) |
604 | + |
605 | + # use date(1) |
606 | + out, _err = subp(['date', '+%s.%3N', '-d', trusty_stamp]) |
607 | + timestamp = out.strip() |
608 | + date_ts = float(timestamp) |
609 | + |
610 | + self.assertEqual(expected, parsed) |
611 | + self.assertEqual(expected, date_ts) |
612 | + self.assertEqual(date_ts, parsed) |
613 | + |
614 | + def test_parse_timestamp_handles_syslog_adding_year(self): |
615 | + """Syslog timestamps lack a year. Add year and properly parse.""" |
616 | + syslog_fmt = '%b %d %H:%M:%S %Y' |
617 | + syslog_stamp = 'Aug 08 15:12:51' |
618 | + |
619 | + # convert stamp ourselves by adding the missing year value |
620 | + year = datetime.now().year |
621 | + dt = datetime.strptime(syslog_stamp + " " + str(year), syslog_fmt) |
622 | + expected = float(dt.strftime('%s.%f')) |
623 | + parsed = parse_timestamp(syslog_stamp) |
624 | + |
625 | + # use date(1) |
626 | + out, _ = subp(['date', '+%s.%3N', '-d', syslog_stamp]) |
627 | + timestamp = out.strip() |
628 | + date_ts = float(timestamp) |
629 | + |
630 | + self.assertEqual(expected, parsed) |
631 | + self.assertEqual(expected, date_ts) |
632 | + self.assertEqual(date_ts, parsed) |
633 | + |
634 | + def test_parse_timestamp_handles_journalctl_format_adding_year(self): |
635 | + """Journalctl precise timestamps lack a year. Add year and parse.""" |
636 | + journal_fmt = '%b %d %H:%M:%S.%f %Y' |
637 | + journal_stamp = 'Aug 08 17:15:50.606811' |
638 | + |
639 | + # convert stamp ourselves by adding the missing year value |
640 | + year = datetime.now().year |
641 | + dt = datetime.strptime(journal_stamp + " " + str(year), journal_fmt) |
642 | + expected = float(dt.strftime('%s.%f')) |
643 | + parsed = parse_timestamp(journal_stamp) |
644 | + |
645 | + # use date(1) |
646 | + out, _ = subp(['date', '+%s.%6N', '-d', journal_stamp]) |
647 | + timestamp = out.strip() |
648 | + date_ts = float(timestamp) |
649 | + |
650 | + self.assertEqual(expected, parsed) |
651 | + self.assertEqual(expected, date_ts) |
652 | + self.assertEqual(date_ts, parsed) |
653 | + |
654 | + def test_parse_unexpected_timestamp_format_with_date_command(self): |
655 | + """Dump sends unexpected timestamp formats to date(1) for processing.""" |
656 | + new_fmt = '%H:%M %m/%d %Y' |
657 | + new_stamp = '17:15 08/08' |
658 | + |
659 | + # convert stamp ourselves by adding the missing year value |
660 | + year = datetime.now().year |
661 | + dt = datetime.strptime(new_stamp + " " + str(year), new_fmt) |
662 | + expected = float(dt.strftime('%s.%f')) |
663 | + parsed = parse_timestamp(new_stamp) |
664 | + |
665 | + # use date(1) |
666 | + out, _ = subp(['date', '+%s.%6N', '-d', new_stamp]) |
667 | + timestamp = out.strip() |
668 | + date_ts = float(timestamp) |
669 | + |
670 | + self.assertEqual(expected, parsed) |
671 | + self.assertEqual(expected, date_ts) |
672 | + self.assertEqual(date_ts, parsed) |
673 | + |
674 | + |
675 | +class TestParseCILogLine(CiTestCase): |
676 | + |
677 | + def test_parse_logline_returns_none_without_separators(self): |
678 | + """When no separators are found, parse_ci_logline returns None.""" |
679 | + expected_parse_ignores = [ |
680 | + '', '-', 'adsf-asdf', '2017-05-22 18:02:01,088', 'CLOUDINIT'] |
681 | + for parse_ignores in expected_parse_ignores: |
682 | + self.assertIsNone(parse_ci_logline(parse_ignores)) |
683 | + |
684 | + def test_parse_logline_returns_event_for_cloud_init_logs(self): |
685 | + """parse_ci_logline returns an event parsed from cloud-init log format.""" |
686 | + line = ( |
687 | + "2017-08-08 20:05:07,147 - util.py[DEBUG]: Cloud-init v. 0.7.9" |
688 | + " running 'init-local' at Tue, 08 Aug 2017 20:05:07 +0000. Up" |
689 | + " 6.26 seconds.") |
690 | + dt = datetime.strptime( |
691 | + '2017-08-08 20:05:07,147', '%Y-%m-%d %H:%M:%S,%f') |
692 | + timestamp = float(dt.strftime('%s.%f')) |
693 | + expected = { |
694 | + 'description': 'starting search for local datasources', |
695 | + 'event_type': 'start', |
696 | + 'name': 'init-local', |
697 | + 'origin': 'cloudinit', |
698 | + 'timestamp': timestamp} |
699 | + self.assertEqual(expected, parse_ci_logline(line)) |
700 | + |
701 | + def test_parse_logline_returns_event_for_journalctl_logs(self): |
702 | + """parse_ci_logline returns an event parsed from journalctl format.""" |
703 | + line = ("Nov 03 06:51:06.074410 x2 cloud-init[106]: [CLOUDINIT]" |
704 | + " util.py[DEBUG]: Cloud-init v. 0.7.8 running 'init-local' at" |
705 | + " Thu, 03 Nov 2016 06:51:06 +0000. Up 1.0 seconds.") |
706 | + year = datetime.now().year |
707 | + dt = datetime.strptime( |
708 | + 'Nov 03 06:51:06.074410 %d' % year, '%b %d %H:%M:%S.%f %Y') |
709 | + timestamp = float(dt.strftime('%s.%f')) |
710 | + expected = { |
711 | + 'description': 'starting search for local datasources', |
712 | + 'event_type': 'start', |
713 | + 'name': 'init-local', |
714 | + 'origin': 'cloudinit', |
715 | + 'timestamp': timestamp} |
716 | + self.assertEqual(expected, parse_ci_logline(line)) |
717 | + |
718 | + def test_parse_logline_returns_event_for_finish_events(self): |
719 | + """parse_ci_logline returns a finish event for a parsed log line.""" |
720 | + line = ('2016-08-30 21:53:25.972325+00:00 y1 [CLOUDINIT]' |
721 | + ' handlers.py[DEBUG]: finish: modules-final: SUCCESS: running' |
722 | + ' modules for final') |
723 | + expected = { |
724 | + 'description': 'running modules for final', |
725 | + 'event_type': 'finish', |
726 | + 'name': 'modules-final', |
727 | + 'origin': 'cloudinit', |
728 | + 'result': 'SUCCESS', |
729 | + 'timestamp': 1472594005.972} |
730 | + self.assertEqual(expected, parse_ci_logline(line)) |
731 | + |
732 | + |
733 | +SAMPLE_LOGS = dedent("""\ |
734 | +Nov 03 06:51:06.074410 x2 cloud-init[106]: [CLOUDINIT] util.py[DEBUG]:\ |
735 | + Cloud-init v. 0.7.8 running 'init-local' at Thu, 03 Nov 2016\ |
736 | + 06:51:06 +0000. Up 1.0 seconds. |
737 | +2016-08-30 21:53:25.972325+00:00 y1 [CLOUDINIT] handlers.py[DEBUG]: finish:\ |
738 | + modules-final: SUCCESS: running modules for final |
739 | +""") |
740 | + |
741 | + |
742 | +class TestDumpEvents(CiTestCase): |
743 | + maxDiff = None |
744 | + |
745 | + def test_dump_events_with_rawdata(self): |
746 | + """Rawdata is split and parsed into a tuple of events and data""" |
747 | + events, data = dump_events(rawdata=SAMPLE_LOGS) |
748 | + expected_data = SAMPLE_LOGS.splitlines() |
749 | + year = datetime.now().year |
750 | + dt1 = datetime.strptime( |
751 | + 'Nov 03 06:51:06.074410 %d' % year, '%b %d %H:%M:%S.%f %Y') |
752 | + timestamp1 = float(dt1.strftime('%s.%f')) |
753 | + expected_events = [{ |
754 | + 'description': 'starting search for local datasources', |
755 | + 'event_type': 'start', |
756 | + 'name': 'init-local', |
757 | + 'origin': 'cloudinit', |
758 | + 'timestamp': timestamp1}, { |
759 | + 'description': 'running modules for final', |
760 | + 'event_type': 'finish', |
761 | + 'name': 'modules-final', |
762 | + 'origin': 'cloudinit', |
763 | + 'result': 'SUCCESS', |
764 | + 'timestamp': 1472594005.972}] |
765 | + self.assertEqual(expected_events, events) |
766 | + self.assertEqual(expected_data, data) |
767 | + |
768 | + def test_dump_events_with_cisource(self): |
769 | + """Cisource file is read and parsed into a tuple of events and data.""" |
770 | + tmpfile = self.tmp_path('logfile') |
771 | + write_file(tmpfile, SAMPLE_LOGS) |
772 | + events, data = dump_events(cisource=open(tmpfile)) |
773 | + year = datetime.now().year |
774 | + dt1 = datetime.strptime( |
775 | + 'Nov 03 06:51:06.074410 %d' % year, '%b %d %H:%M:%S.%f %Y') |
776 | + timestamp1 = float(dt1.strftime('%s.%f')) |
777 | + expected_events = [{ |
778 | + 'description': 'starting search for local datasources', |
779 | + 'event_type': 'start', |
780 | + 'name': 'init-local', |
781 | + 'origin': 'cloudinit', |
782 | + 'timestamp': timestamp1}, { |
783 | + 'description': 'running modules for final', |
784 | + 'event_type': 'finish', |
785 | + 'name': 'modules-final', |
786 | + 'origin': 'cloudinit', |
787 | + 'result': 'SUCCESS', |
788 | + 'timestamp': 1472594005.972}] |
789 | + self.assertEqual(expected_events, events) |
790 | + self.assertEqual(SAMPLE_LOGS.splitlines(), [d.strip() for d in data]) |
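The tests above exercise parse_timestamp against cloud-init's default log format, syslog, and journalctl stamps, with date(1) handling anything unexpected. A hedged sketch of that multi-format strategy (simplified; not the exact implementation in cloudinit/analyze/dump.py):

```python
# Sketch: try known strptime formats, appending the current year where the
# log format omits it, then fall back to date(1) for unexpected formats.
# Note: strftime('%s') is a glibc extension (works on Linux).
import subprocess
from datetime import datetime

KNOWN_FORMATS = [
    ('%Y-%m-%d %H:%M:%S,%f', False),  # cloud-init.log default
    ('%b %d %H:%M:%S.%f %Y', True),   # journalctl, year appended
    ('%b %d %H:%M:%S %Y', True),      # syslog, year appended
]


def to_epoch(stamp):
    for fmt, needs_year in KNOWN_FORMATS:
        candidate = (
            '%s %d' % (stamp, datetime.now().year) if needs_year else stamp)
        try:
            return float(datetime.strptime(candidate, fmt).strftime('%s.%f'))
        except ValueError:
            continue
    # Last resort: let date(1) guess the format
    out = subprocess.check_output(['date', '+%s.%6N', '-d', stamp])
    return float(out.strip())


print(to_epoch('2016-09-12 14:39:20,839'))
```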
791 | diff --git a/cloudinit/cmd/main.py b/cloudinit/cmd/main.py |
792 | index 139e03b..9c0ac86 100644 |
793 | --- a/cloudinit/cmd/main.py |
794 | +++ b/cloudinit/cmd/main.py |
795 | @@ -50,13 +50,6 @@ WELCOME_MSG_TPL = ("Cloud-init v. {version} running '{action}' at " |
796 | # Module section template |
797 | MOD_SECTION_TPL = "cloud_%s_modules" |
798 | |
799 | -# Things u can query on |
800 | -QUERY_DATA_TYPES = [ |
801 | - 'data', |
802 | - 'data_raw', |
803 | - 'instance_id', |
804 | -] |
805 | - |
806 | # Frequency shortname to full name |
807 | # (so users don't have to remember the full name...) |
808 | FREQ_SHORT_NAMES = { |
809 | @@ -510,11 +503,6 @@ def main_modules(action_name, args): |
810 | return run_module_section(mods, name, name) |
811 | |
812 | |
813 | -def main_query(name, _args): |
814 | - raise NotImplementedError(("Action '%s' is not" |
815 | - " currently implemented") % (name)) |
816 | - |
817 | - |
818 | def main_single(name, args): |
819 | # Cloud-init single stage is broken up into the following sub-stages |
820 | # 1. Ensure that the init object fetches its config without errors |
821 | @@ -713,9 +701,11 @@ def main(sysv_args=None): |
822 | default=False) |
823 | |
824 | parser.set_defaults(reporter=None) |
825 | - subparsers = parser.add_subparsers() |
826 | + subparsers = parser.add_subparsers(title='Subcommands', dest='subcommand') |
827 | + subparsers.required = True |
828 | |
829 | # Each action and its sub-options (if any) |
830 | + |
831 | parser_init = subparsers.add_parser('init', |
832 | help=('initializes cloud-init and' |
833 | ' performs initial modules')) |
834 | @@ -737,17 +727,6 @@ def main(sysv_args=None): |
835 | choices=('init', 'config', 'final')) |
836 | parser_mod.set_defaults(action=('modules', main_modules)) |
837 | |
838 | - # These settings are used when you want to query information |
839 | - # stored in the cloud-init data objects/directories/files |
840 | - parser_query = subparsers.add_parser('query', |
841 | - help=('query information stored ' |
842 | - 'in cloud-init')) |
843 | - parser_query.add_argument("--name", '-n', action="store", |
844 | - help="item name to query on", |
845 | - required=True, |
846 | - choices=QUERY_DATA_TYPES) |
847 | - parser_query.set_defaults(action=('query', main_query)) |
848 | - |
849 | # This subcommand allows you to run a single module |
850 | parser_single = subparsers.add_parser('single', |
851 | help=('run a single module ')) |
852 | @@ -781,15 +760,22 @@ def main(sysv_args=None): |
853 | help=('list defined features')) |
854 | parser_features.set_defaults(action=('features', main_features)) |
855 | |
856 | + parser_analyze = subparsers.add_parser( |
857 | + 'analyze', help='Devel tool: Analyze cloud-init logs and data') |
858 | + if sysv_args and sysv_args[0] == 'analyze': |
859 | + # Only load this parser if analyze is specified to avoid file load cost |
860 | + # FIXME put this under 'devel' subcommand (coming in next branch) |
861 | + from cloudinit.analyze.__main__ import get_parser as analyze_parser |
862 | + # Construct analyze subcommand parser |
863 | + analyze_parser(parser_analyze) |
864 | + |
865 | args = parser.parse_args(args=sysv_args) |
866 | |
867 | - try: |
868 | - (name, functor) = args.action |
869 | - except AttributeError: |
870 | - parser.error('too few arguments') |
871 | + # Subparsers.required = True and each subparser sets action=(name, functor) |
872 | + (name, functor) = args.action |
873 | |
874 | # Setup basic logging to start (until reinitialized) |
875 | - # iff in debug mode... |
876 | + # iff in debug mode. |
877 | if args.debug: |
878 | logging.setupBasicLogging() |
879 | |
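The change above only imports cloudinit.analyze when 'analyze' is the first CLI argument, keeping startup cheap for every other subcommand. A standalone sketch of that lazy-subparser pattern (the `analyze_get_parser` body here is a hypothetical stand-in, not the real cloudinit.analyze.__main__.get_parser):

```python
# Sketch: always register a stub subcommand (so it appears in --help), but
# only build its full, import-heavy parser when it is actually invoked.
import argparse


def analyze_get_parser(parser):
    # Hypothetical stand-in for cloudinit.analyze.__main__.get_parser
    parser.add_argument('report', choices=['blame', 'dump', 'show'])
    return parser


def build_parser(sysv_args):
    parser = argparse.ArgumentParser(prog='cloud-init')
    subparsers = parser.add_subparsers(title='Subcommands', dest='subcommand')
    subparsers.required = True
    subparsers.add_parser('init', help='initializes cloud-init')
    parser_analyze = subparsers.add_parser(
        'analyze', help='analyze cloud-init logs and data')
    if sysv_args and sysv_args[0] == 'analyze':
        # Only pay the parser-construction (and import) cost when needed
        analyze_get_parser(parser_analyze)
    return parser


args = build_parser(['analyze', 'blame']).parse_args(['analyze', 'blame'])
print(args.subcommand, args.report)  # → analyze blame
```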
880 | diff --git a/doc/rtd/index.rst b/doc/rtd/index.rst |
881 | index a691103..de67f36 100644 |
882 | --- a/doc/rtd/index.rst |
883 | +++ b/doc/rtd/index.rst |
884 | @@ -40,6 +40,7 @@ initialization of a cloud instance. |
885 | topics/merging.rst |
886 | topics/network-config.rst |
887 | topics/vendordata.rst |
888 | + topics/debugging.rst |
889 | topics/moreinfo.rst |
890 | topics/hacking.rst |
891 | topics/tests.rst |
892 | diff --git a/doc/rtd/topics/debugging.rst b/doc/rtd/topics/debugging.rst |
893 | new file mode 100644 |
894 | index 0000000..4e43dd5 |
895 | --- /dev/null |
896 | +++ b/doc/rtd/topics/debugging.rst |
897 | @@ -0,0 +1,146 @@ |
 898 | +******************************** |
 899 | +Testing and debugging cloud-init |
 900 | +******************************** |
901 | + |
902 | +Overview |
903 | +======== |
 904 | +This topic describes general approaches for testing and debugging cloud-init |
 905 | +on deployed instances. |
906 | + |
907 | + |
908 | +Boot Time Analysis - cloud-init analyze |
 909 | +======================================= |
 910 | +Occasionally instances don't boot or configure themselves as quickly as |
 911 | +expected, and cloud-init packages a simple facility to inspect which |
 912 | +operations took cloud-init the longest during boot and setup. |
 913 | + |
 914 | +The script **/usr/bin/cloud-init** has an **analyze** sub-command which |
 915 | +parses any cloud-init.log file into formatted and sorted events. It allows |
 916 | +detailed analysis of which cloud-init operations are most costly, to |
 917 | +determine the long-pole in cloud-init configuration and setup. These |
 918 | +subcommands default to reading /var/log/cloud-init.log. |
919 | + |
 920 | +* ``analyze show`` Parse and organize cloud-init.log events by stage and |
 921 | +  report each sub-stage with time deltas. |
922 | + |
923 | +.. code-block:: bash |
924 | + |
925 | + $ cloud-init analyze show -i my-cloud-init.log |
926 | + -- Boot Record 01 -- |
927 | + The total time elapsed since completing an event is printed after the "@" |
928 | + character. |
929 | + The time the event takes is printed after the "+" character. |
930 | + |
931 | + Starting stage: modules-config |
932 | + |`->config-emit_upstart ran successfully @05.47600s +00.00100s |
933 | + |`->config-snap_config ran successfully @05.47700s +00.00100s |
934 | + |`->config-ssh-import-id ran successfully @05.47800s +00.00200s |
935 | + |`->config-locale ran successfully @05.48000s +00.00100s |
936 | + ... |
937 | + |
938 | + |
 939 | +* ``analyze dump`` Parse cloud-init.log into event records and return a list |
 940 | +  of dictionaries that can be consumed for other reporting needs. |
941 | + |
942 | +.. code-block:: bash |
943 | + |
 944 | + $ cloud-init analyze dump -i my-cloud-init.log |
945 | + [ |
946 | + { |
947 | + "description": "running config modules", |
948 | + "event_type": "start", |
949 | + "name": "modules-config", |
950 | + "origin": "cloudinit", |
951 | + "timestamp": 1510807493.0 |
952 | + },... |
953 | + |
 954 | +* ``analyze blame`` Parse cloud-init.log into event records and sort them |
 955 | +  based on highest time cost for quick assessment of areas of cloud-init |
 956 | +  that may need improvement. |
957 | + |
958 | +.. code-block:: bash |
959 | + |
960 | + $ cloud-init analyze blame -i my-cloud-init.log |
961 | + -- Boot Record 11 -- |
962 | + 00.01300s (modules-final/config-scripts-per-boot) |
963 | + 00.00400s (modules-final/config-final-message) |
964 | + 00.00100s (modules-final/config-rightscale_userdata) |
965 | + ... |
966 | + |
967 | + |
968 | +Analyze quickstart - LXC |
969 | +--------------------------- |
 970 | +To quickly obtain a cloud-init log, try using LXC on any Ubuntu system: |
971 | + |
972 | +.. code-block:: bash |
973 | + |
974 | + $ lxc init ubuntu-daily:xenial x1 |
975 | + $ lxc start x1 |
976 | + # Take lxc's cloud-init.log and pipe it to the analyzer |
977 | + $ lxc file pull x1/var/log/cloud-init.log - | cloud-init analyze dump -i - |
978 | + $ lxc file pull x1/var/log/cloud-init.log - | \ |
979 | + python3 -m cloudinit.analyze dump -i - |
980 | + |
981 | +Analyze quickstart - KVM |
982 | +--------------------------- |
 983 | +To quickly analyze a cloud-init log from a KVM instance: |
984 | + |
 985 | +1. Download the current cloud image |
 986 | + |
 987 | +.. code-block:: bash |
 988 | + |
 989 | + $ wget https://cloud-images.ubuntu.com/daily/server/xenial/current/xenial-server-cloudimg-amd64.img |
 990 | + |
 991 | +2. Create a snapshot image to preserve the original cloud-image |
988 | + |
989 | +.. code-block:: bash |
990 | + |
991 | + $ qemu-img create -b xenial-server-cloudimg-amd64.img -f qcow2 \ |
992 | + test-cloudinit.qcow2 |
993 | + |
 994 | +3. Create a seed image with metadata using ``cloud-localds`` |
995 | + |
996 | +.. code-block:: bash |
997 | + |
998 | + $ cat > user-data <<EOF |
999 | + #cloud-config |
1000 | + password: passw0rd |
1001 | + chpasswd: { expire: False } |
1002 | + EOF |
1003 | + $ cloud-localds my-seed.img user-data |
1004 | + |
1005 | +4. Launch your modified VM |
1006 | + |
1007 | +.. code-block:: bash |
1008 | + |
1009 | + $ kvm -m 512 -net nic -net user -redir tcp:2222::22 \ |
1010 | + -drive file=test-cloudinit.qcow2,if=virtio,format=qcow2 \ |
1011 | + -drive file=my-seed.img,if=virtio,format=raw |
1012 | + |
1013 | +5. Analyze the boot (blame, dump, show) |
1014 | + |
1015 | +.. code-block:: bash |
1016 | + |
1017 | + $ ssh -p 2222 ubuntu@localhost 'cat /var/log/cloud-init.log' | \ |
1018 | + cloud-init analyze blame -i - |
1019 | + |
1020 | + |
1021 | +Running single cloud config modules |
1022 | +=================================== |
 1023 | +The ``single`` subcommand is not called by the init system. It can be called |
1024 | +load the configured datasource and run a single cloud-config module once using |
1025 | +the cached userdata and metadata after the instance has booted. Each |
1026 | +cloud-config module has a module FREQUENCY configured: PER_INSTANCE, PER_BOOT, |
1027 | +PER_ONCE or PER_ALWAYS. When a module is run by cloud-init, it stores a |
1028 | +semaphore file in |
1029 | +``/var/lib/cloud/instance/sem/config_<module_name>.<frequency>`` which marks |
1030 | +when the module last successfully ran. Presence of this semaphore file |
1031 | +prevents a module from running again if it has already been run. To ensure that |
1032 | +a module is run again, the desired frequency can be overridden on the |
1033 | +commandline: |
1034 | + |
1035 | +.. code-block:: bash |
1036 | + |
1037 | + $ sudo cloud-init single --name cc_ssh --frequency always |
1038 | + ... |
1039 | + Generating public/private ed25519 key pair |
1040 | + ... |
1041 | + |
1042 | +Inspect cloud-init.log for output of what operations were performed as a |
1043 | +result. |
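The semaphore mechanism described above can be sketched as a per-module marker file whose presence skips a re-run unless the frequency is "always". This is illustrative only; the paths and function names are hypothetical, not cloud-init's actual API:

```python
# Sketch of the semaphore idea: a module runs only if no marker file exists
# for its configured frequency; 'always' ignores any existing semaphore.
# (Hypothetical names; real semaphores live under /var/lib/cloud/instance/sem.)
import os
import tempfile


def sem_path(sem_dir, module, frequency):
    # e.g. <sem_dir>/config_cc_ssh.instance (illustrative naming)
    return os.path.join(sem_dir, 'config_%s.%s' % (module, frequency))


def should_run(sem_dir, module, frequency):
    if frequency == 'always':
        return True  # overridden frequency: run regardless of semaphore
    return not os.path.exists(sem_path(sem_dir, module, frequency))


def mark_ran(sem_dir, module, frequency):
    # Record a successful run by touching the semaphore file
    open(sem_path(sem_dir, module, frequency), 'w').close()


sem = tempfile.mkdtemp()
print(should_run(sem, 'cc_ssh', 'instance'))  # → True  (never ran)
mark_ran(sem, 'cc_ssh', 'instance')
print(should_run(sem, 'cc_ssh', 'instance'))  # → False (semaphore present)
print(should_run(sem, 'cc_ssh', 'always'))    # → True  ('always' overrides)
```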
1044 | diff --git a/tests/unittests/test_cli.py b/tests/unittests/test_cli.py |
1045 | index 06f366b..7780f16 100644 |
1046 | --- a/tests/unittests/test_cli.py |
1047 | +++ b/tests/unittests/test_cli.py |
1048 | @@ -31,9 +31,90 @@ class TestCLI(test_helpers.FilesystemMockingTestCase): |
1049 | |
1050 | def test_no_arguments_shows_error_message(self): |
1051 | exit_code = self._call_main() |
1052 | - self.assertIn('cloud-init: error: too few arguments', |
1053 | - self.stderr.getvalue()) |
1054 | + missing_subcommand_message = [ |
1055 | + 'too few arguments', # python2.7 msg |
1056 | + 'the following arguments are required: subcommand' # python3 msg |
1057 | + ] |
1058 | + error = self.stderr.getvalue() |
1059 | + matches = ([msg in error for msg in missing_subcommand_message]) |
1060 | + self.assertTrue( |
1061 | + any(matches), 'Did not find error message for missing subcommand') |
1062 | self.assertEqual(2, exit_code) |
1063 | |
1064 | + def test_all_subcommands_represented_in_help(self): |
 1065 | + """All known subparsers are represented in the cloud-init help doc.""" |
1066 | + self._call_main() |
1067 | + error = self.stderr.getvalue() |
1068 | + expected_subcommands = ['analyze', 'init', 'modules', 'single', |
1069 | + 'dhclient-hook', 'features'] |
1070 | + for subcommand in expected_subcommands: |
1071 | + self.assertIn(subcommand, error) |
1072 | |
1073 | -# vi: ts=4 expandtab |
1074 | + @mock.patch('cloudinit.cmd.main.status_wrapper') |
1075 | + def test_init_subcommand_parser(self, m_status_wrapper): |
1076 | + """The subcommand 'init' calls status_wrapper passing init.""" |
1077 | + self._call_main(['cloud-init', 'init']) |
1078 | + (name, parseargs) = m_status_wrapper.call_args_list[0][0] |
1079 | + self.assertEqual('init', name) |
1080 | + self.assertEqual('init', parseargs.subcommand) |
1081 | + self.assertEqual('init', parseargs.action[0]) |
1082 | + self.assertEqual('main_init', parseargs.action[1].__name__) |
1083 | + |
1084 | + @mock.patch('cloudinit.cmd.main.status_wrapper') |
1085 | + def test_modules_subcommand_parser(self, m_status_wrapper): |
1086 | + """The subcommand 'modules' calls status_wrapper passing modules.""" |
1087 | + self._call_main(['cloud-init', 'modules']) |
1088 | + (name, parseargs) = m_status_wrapper.call_args_list[0][0] |
1089 | + self.assertEqual('modules', name) |
1090 | + self.assertEqual('modules', parseargs.subcommand) |
1091 | + self.assertEqual('modules', parseargs.action[0]) |
1092 | + self.assertEqual('main_modules', parseargs.action[1].__name__) |
1093 | + |
1094 | + def test_analyze_subcommand_parser(self): |
1095 | + """The subcommand cloud-init analyze calls the correct subparser.""" |
1096 | + self._call_main(['cloud-init', 'analyze']) |
1097 | + # These subcommands only valid for cloud-init analyze script |
1098 | + expected_subcommands = ['blame', 'show', 'dump'] |
1099 | + error = self.stderr.getvalue() |
1100 | + for subcommand in expected_subcommands: |
1101 | + self.assertIn(subcommand, error) |
1102 | + |
1103 | + @mock.patch('cloudinit.cmd.main.main_single') |
1104 | + def test_single_subcommand(self, m_main_single): |
1105 | + """The subcommand 'single' calls main_single with valid args.""" |
1106 | + self._call_main(['cloud-init', 'single', '--name', 'cc_ntp']) |
1107 | + (name, parseargs) = m_main_single.call_args_list[0][0] |
1108 | + self.assertEqual('single', name) |
1109 | + self.assertEqual('single', parseargs.subcommand) |
1110 | + self.assertEqual('single', parseargs.action[0]) |
1111 | + self.assertFalse(parseargs.debug) |
1112 | + self.assertFalse(parseargs.force) |
1113 | + self.assertIsNone(parseargs.frequency) |
1114 | + self.assertEqual('cc_ntp', parseargs.name) |
1115 | + self.assertFalse(parseargs.report) |
1116 | + |
1117 | + @mock.patch('cloudinit.cmd.main.dhclient_hook') |
1118 | + def test_dhclient_hook_subcommand(self, m_dhclient_hook): |
1119 | + """The subcommand 'dhclient-hook' calls dhclient_hook with args.""" |
1120 | + self._call_main(['cloud-init', 'dhclient-hook', 'net_action', 'eth0']) |
1121 | + (name, parseargs) = m_dhclient_hook.call_args_list[0][0] |
1122 | + self.assertEqual('dhclient_hook', name) |
1123 | + self.assertEqual('dhclient-hook', parseargs.subcommand) |
1124 | + self.assertEqual('dhclient_hook', parseargs.action[0]) |
1125 | + self.assertFalse(parseargs.debug) |
1126 | + self.assertFalse(parseargs.force) |
1127 | + self.assertEqual('net_action', parseargs.net_action) |
1128 | + self.assertEqual('eth0', parseargs.net_interface) |
1129 | + |
1130 | + @mock.patch('cloudinit.cmd.main.main_features') |
1131 | + def test_features_hook_subcommand(self, m_features): |
1132 | + """The subcommand 'features' calls main_features with args.""" |
1133 | + self._call_main(['cloud-init', 'features']) |
1134 | + (name, parseargs) = m_features.call_args_list[0][0] |
1135 | + self.assertEqual('features', name) |
1136 | + self.assertEqual('features', parseargs.subcommand) |
1137 | + self.assertEqual('features', parseargs.action[0]) |
1138 | + self.assertFalse(parseargs.debug) |
1139 | + self.assertFalse(parseargs.force) |
1140 | + |
 1141 | +# vi: ts=4 expandtab |
FAILED: Continuous integration, rev:ff4396d6a45791bf31b78cdfe7457e90d00f02ed
https://jenkins.ubuntu.com/server/job/cloud-init-ci/132/
Executed test runs:
    SUCCESS: Checkout
    FAILED: Unit & Style Tests
Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/132/rebuild