Merge lp:~allenap/maas/normalise-logging--bug-1639182 into lp:~maas-committers/maas/trunk
Status: Merged
Approved by: Andres Rodriguez
Approved revision: no longer in the source branch
Merged at revision: 5541
Proposed branch: lp:~allenap/maas/normalise-logging--bug-1639182
Merge into: lp:~maas-committers/maas/trunk
Diff against target: 410 lines (+148/-113), 4 files modified:
  src/provisioningserver/logger/_common.py (+1/-1)
  src/provisioningserver/logger/_twisted.py (+55/-24)
  src/provisioningserver/logger/tests/test__twisted.py (+87/-83)
  src/provisioningserver/logger/tests/test_logger.py (+5/-5)
To merge this branch: bzr merge lp:~allenap/maas/normalise-logging--bug-1639182
Reviewer | Review Type | Status
---|---|---
Andres Rodriguez | community | Approve
Blake Rouse | community | Needs Information

Review via email: mp+310188@code.launchpad.net
Commit message
Change the logging format to be similar between regiond.log, rackd.log, and maas.log.
Description of the change
Gavin Panella (allenap) wrote:

Blake Rouse (blake-rouse) wrote:
Looks good. I tried to get this to work on Friday and was lost. I really don't think Twisted could have made logging any more complicated.

This is good enough for the SRU to Xenial, but we need to get these fixes into Yakkety as well. What's the ETA on that? It may be possible to do in this branch, so we can get it done in one pass. We need to release 2.1.1 today.
Gavin Panella (allenap) wrote:
> Looks good. I tried to get this to work on Friday and was lost. I
> really don't think Twisted could have made logging any more
> complicated.
I'm right with you on this. My only hope for this is that it'll get a
lot simpler once t.p.log is gone.
> This is good enough for the SRU to Xenial, but we need to get these
> fixes into Yakkety as well. What's the ETA on that? It may be possible
> to do in this branch, so we can get it done in one pass. We need to
> release 2.1.1 today.
I've been hitting my head against the wall trying to do this in Yakkety.
I'll land this, but we're just going to have to live with funny logs for
a bit longer in Yakkety.
On the other hand, do we need to do this at all? The message format has
not changed through the recent logging changes. Do we actually need to
change it now before we can release?
Blake Rouse (blake-rouse) wrote:
Yes, we need to get this correct before the release of 2.1.1, as we are SRU'ing 2.1.1 to Xenial. Mark wants the SRU finished by the end of this week, so we need to release ASAP. 2.1.1 will also require an SRU to Yakkety, so we should make this work in Yakkety as well.
Andres Rodriguez (andreserl) wrote:
The format has changed between 2.1.0 and what will be 2.1.1, so I'm fine landing this for Xenial; we will address it for Yakkety afterwards.
We need to have a serious discussion about how we can improve the logging system overall.
Gavin Panella (allenap) wrote:
> We need to have a serious discussion about how we can improve the
> logging system overall.
Let me kick it off.
There has been pain in all parts of the stack: Django, Twisted, and the
standard library all do stupid things that we need to work around. I've
been able to contain the ugliness to the p.logger package.
The logs that Django and `logging` produce and those that Twisted
produce are integrated only when they hit the log file. Logs from one
subsystem are not shipped from `logging` to Twisted or vice-versa. There
is an adapter that can pipe Twisted's logs over to the standard library
but it's too lossy right now, and it's not clear that it'll help much
anyway.
Twisted has just one idea of what a log should look like. It doesn't
care about customising the textual output, though it does provide the
means to write your own log event[1] observer and so emit rot-13 logs to
a thermal printer on a pigeon. It's neither right nor wrong, just a
different approach to that of the stdlib's `logging` package.
This bug in particular has been painful for a couple of reasons:
1. I'm trying to do it the "wrong" way with Twisted. When invoking a
plugin via twistd, the "right" way to customise logging output is to
specify an observer via `--logger=…`.
2. As I've mentioned before, Twisted is in a transitional state between
its "legacy" and "modern" logging subsystems. There's a lot of
complexity behind the scenes to make this work, and significant
change between Twisted in Xenial and Yakkety.
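The "right" way from point 1 above can be modelled without importing Twisted at all: `twistd --logger=<fqpn>` imports a named zero-argument factory and uses the observer it returns for all log output. The sketch below is a rough plain-Python model of that shape; the names `logObserverFactory` and `maas_observer` are hypothetical, and a real factory would return a Twisted log observer rather than a bare function.

```python
# Hypothetical model of a twistd `--logger` factory: a zero-argument
# callable that returns a log observer (a callable taking one event dict).

def logObserverFactory():
    """Return a log observer that formats events the MAAS way."""
    def maas_observer(event):
        # A real observer would write to a log file; here we just
        # collect the formatted lines for inspection.
        line = "%s %s: [%s] %s" % (
            event.get("asctime", "-"),
            event.get("log_namespace", "-"),
            event.get("log_level", "-"),
            event.get("message", ""),
        )
        maas_observer.lines.append(line)
    maas_observer.lines = []
    return maas_observer

observer = logObserverFactory()
observer({
    "asctime": "2016-11-07 14:38:23",
    "log_namespace": "maas.rpc",
    "log_level": "info",
    "message": "Connected.",
})
print(observer.lines[0])
# → 2016-11-07 14:38:23 maas.rpc: [info] Connected.
```

With twistd, the factory would be named by its fully-qualified Python name on the command line, so the observer is installed before any plugin code runs.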
I've been trying to avoid #1 so far, to keep the details hidden, but no
longer. The changes between Twisted 16.0 (Xenial) and 16.4 (Yakkety)
mean this is likely to be the only sane way to achieve what we want, and
this is what I'll work on now.
[1] A log *event* is a dict containing some standard keys and any number
of other pieces of information, passed between log observers. One such
observer might filter events, one might forward events to multiple
observers, one might format the event and write it out to a log file.
Preview Diff
=== modified file 'src/provisioningserver/logger/_common.py'
--- src/provisioningserver/logger/_common.py	2016-10-28 15:58:32 +0000
+++ src/provisioningserver/logger/_common.py	2016-11-07 14:38:23 +0000
@@ -21,7 +21,7 @@
 
 # This format roughly matches Twisted's default, so that combined Twisted and
 # Django logs are consistent with one another.
-DEFAULT_LOG_FORMAT = "%(asctime)s [%(name)s#%(levelname)s] %(message)s"
+DEFAULT_LOG_FORMAT = "%(asctime)s %(name)s: [%(levelname)s] %(message)s"
 DEFAULT_LOG_FORMAT_DATE = "%Y-%m-%d %H:%M:%S"
 DEFAULT_LOG_VERBOSITY_LEVELS = {0, 1, 2, 3}
 DEFAULT_LOG_VERBOSITY = 2
 
=== modified file 'src/provisioningserver/logger/_twisted.py'
--- src/provisioningserver/logger/_twisted.py	2016-11-03 21:11:05 +0000
+++ src/provisioningserver/logger/_twisted.py	2016-11-07 14:38:23 +0000
@@ -15,6 +15,7 @@
 
 import crochet
 from provisioningserver.logger._common import (
+    DEFAULT_LOG_FORMAT,
     DEFAULT_LOG_FORMAT_DATE,
     DEFAULT_LOG_VERBOSITY,
     DEFAULT_LOG_VERBOSITY_LEVELS,
@@ -28,6 +29,7 @@
     log as twistedLegacy,
     usage,
 )
+from twisted.python.util import untilConcludes
 
 # Map verbosity numbers to `twisted.logger` levels.
 DEFAULT_TWISTED_VERBOSITY_LEVELS = {
@@ -117,12 +119,10 @@
     twistedModern.globalLogBeginner.showwarning = warnings.showwarning
     twistedLegacy.theLogPublisher.showwarning = warnings.showwarning
 
-    # Globally override Twisted's log date format. It's tricky to get to the
-    # FileLogObserver that twistd installs so that we can modify its config
-    # alone, but we actually do want to make a global change anyway.
-    warn_unless(hasattr(twistedLegacy.FileLogObserver, "timeFormat"), (
-        "No FileLogObserver.timeFormat attribute found; please investigate!"))
-    twistedLegacy.FileLogObserver.timeFormat = DEFAULT_LOG_FORMAT_DATE
+    # Globally override Twisted's legacy logging format.
+    warn_unless(hasattr(twistedLegacy, "FileLogObserver"), (
+        "No t.p.log.FileLogObserver attribute found; please investigate!"))
+    LegacyFileLogObserver.install()
 
     # Install a wrapper so that log events from `t.logger` are logged with a
     # namespace and level by the legacy logger in `t.python.log`. This needs
@@ -157,18 +157,55 @@
     twistedLegacy.startLogging(sys.__stdout__, setStdout=False)
 
 
+class LegacyFileLogObserver(twistedLegacy.FileLogObserver):
+    """Log legacy/mixed events to a file, formatting the MAAS way."""
+
+    timeFormat = DEFAULT_LOG_FORMAT_DATE
+    lineFormat = DEFAULT_LOG_FORMAT + "\n"
+
+    @classmethod
+    def install(cls):
+        """Install this wrapper in place of `log.FileLogObserver`."""
+        twistedLegacy.FileLogObserver = cls
+
+    def emit(self, event):
+        """Format and write out the given `event`."""
+        text = twistedLegacy.textFromEventDict(event)
+        if text is None:
+            return
+
+        system = event["system"] if "system" in event else None
+        if system is None and "log_namespace" in event:
+            system = event["log_namespace"]
+        # Logs written directly to `t.p.log.logfile` and `.logerr` get a
+        # namespace of "twisted.python.log". This is not interesting.
+        if system == twistedLegacy.__name__:
+            system = None
+
+        level = event["log_level"] if "log_level" in event else None
+        if level is None:
+            if "isError" in event and event["isError"]:
+                level = twistedModern.LogLevel.critical
+            else:
+                level = twistedModern.LogLevel.info
+
+        line = twistedLegacy._safeFormat(self.lineFormat, {
+            "asctime": self.formatTime(event["time"]),
+            "levelname": "-" if level is None else level.name,
+            "message": text.replace("\n", "\n\t"),
+            "name": "-" if system is None else system,
+        })
+
+        untilConcludes(self.write, line)
+        untilConcludes(self.flush)
+
+
 class LegacyLogObserverWrapper(twistedModern.LegacyLogObserverWrapper):
     """Ensure that `log_system` is set in the event.
 
-    This mimics what `twisted.logger.formatEventAsClassicLogText` does when
-    `log_system` is not set, and constructs it from `log_namespace` and
-    `log_level`.
-
-    This `log_system` value is then seen by `LegacyLogObserverWrapper` and
-    copied into the `system` key and then printed out in the logs by Twisted's
-    legacy logging (`t.python.log`) machinery. This still used by `twistd`, so
-    the net effect is that the logger's namespace and level are printed to the
-    `twistd` log.
+    A newly populated `log_system` value is seen by `LegacyLogObserverWrapper`
+    (the superclass) and copied into the `system` key, later used when emitting
+    formatted log lines.
     """
 
     @classmethod
@@ -185,17 +222,11 @@
 
     def __call__(self, event):
         # Be defensive: `system` could be missing or could have a value of
-        # None. Same goes for `log_system`, `log_namespace`, and `log_level`.
+        # None. Same goes for `log_system` and `log_namespace`.
         if event.get("system") is None and event.get("log_system") is None:
             namespace = event.get("log_namespace")
-            # Logs written directly to `t.p.log.logfile` and `.logerr` get a
-            # namespace of "twisted.python.log". This is not interesting.
-            if namespace == twistedLegacy.__name__:
-                namespace = None
-            level = event.get("log_level")
-            event["log_system"] = "{namespace}#{level}".format(
-                namespace=("-" if namespace is None else namespace),
-                level=("-" if level is None else level.name))
+            if namespace is not None:
+                event["log_system"] = namespace
         # Up-call, which will apply some more transformations.
         return super().__call__(event)
 
 
=== modified file 'src/provisioningserver/logger/tests/test__twisted.py'
--- src/provisioningserver/logger/tests/test__twisted.py	2016-11-02 10:01:16 +0000
+++ src/provisioningserver/logger/tests/test__twisted.py	2016-11-07 14:38:23 +0000
@@ -17,6 +17,7 @@
 from maastesting.testcase import MAASTestCase
 from maastesting.twisted import TwistedLoggerFixture
 from provisioningserver.logger._twisted import (
+    LegacyFileLogObserver,
     LegacyLogger,
     LegacyLogObserverWrapper,
     observe_twisted_internet_tcp,
@@ -57,6 +58,80 @@
     log.theLogPublisher.addObserver(observer)
 
 
+class TestLegacyFileLogObserver(MAASTestCase):
+    """Scenario tests for `LegacyFileLogObserver`."""
+
+    scenarios = tuple(
+        (log_level.name, dict(log_level=log_level))
+        for log_level in logger.LogLevel.iterconstants()
+    )
+
+    def test__namespace_and_level_is_printed_in_legacy_log(self):
+        # Restore existing observers at the end. This must be careful with
+        # ordering of clean-ups, hence the use of unittest.mock.patch.object
+        # as a context manager.
+        self.addCleanup(setLegacyObservers, log.theLogPublisher.observers)
+        # The global non-legacy `LogBeginner` emits critical messages straight
+        # to stderr, so temporarily put aside its observer to avoid seeing the
+        # critical log messages we're going to generate.
+        self.patch(logger.globalLogPublisher, "_observers", [])
+
+        logbuffer = io.StringIO()
+        observer = LegacyFileLogObserver(logbuffer)
+        observer.formatTime = lambda when: "<timestamp>"
+
+        oldlog = log.msg
+        # Deliberately use the default global observer in the new logger
+        # because we want to see how it behaves in a typical environment where
+        # logs are being emitted by the legacy logging infrastructure, for
+        # example running under `twistd`.
+        newlog = partial(logger.Logger().emit, self.log_level)
+
+        with patch.object(
+                log, "LegacyLogObserverWrapper",
+                log.LegacyLogObserverWrapper):
+            setLegacyObservers([observer.emit])
+            oldlog("Message from legacy", system="legacy")
+            newlog("Message from modern", log_system="modern")
+
+        self.assertThat(
+            logbuffer.getvalue(), DocTestMatches("""\
+                <timestamp> legacy: [info] Message from legacy
+                <timestamp> modern: [%s] Message from modern
+                """ % self.log_level.name))
+
+
+class TestLegacyFileLogObserver_Other(MAASTestCase):
+    """Other tests for `LegacyFileLogObserver`."""
+
+    def test__namespace_is_not_emitted_via_logfile_logerr(self):
+        # Restore existing observers at the end. This must be careful with
+        # ordering of clean-ups, hence the use of unittest.mock.patch.object
+        # as a context manager.
+        self.addCleanup(setLegacyObservers, log.theLogPublisher.observers)
+        # The global non-legacy `LogBeginner` emits critical messages straight
+        # to stderr, so temporarily put aside its observer to avoid seeing the
+        # critical log messages we're going to generate.
+        self.patch(logger.globalLogPublisher, "_observers", [])
+
+        logbuffer = io.StringIO()
+        observer = LegacyFileLogObserver(logbuffer)
+        observer.formatTime = lambda when: "<timestamp>"
+
+        with patch.object(
+                log, "LegacyLogObserverWrapper",
+                log.LegacyLogObserverWrapper):
+            setLegacyObservers([observer.emit])
+            log.logfile.write("Via log.logfile\n")
+            log.logerr.write("Via log.logerr\n")
+
+        self.assertThat(
+            logbuffer.getvalue(), DocTestMatches("""\
+                <timestamp> -: [info] Via log.logfile
+                <timestamp> -: [error] Via log.logerr
+                """))
+
+
 class TestLegacyLogObserverWrapper(MAASTestCase):
     """Scenario tests for `LegacyLogObserverWrapper`."""
 
@@ -72,7 +147,7 @@
         self.assertThat(events, HasLength(1))
         return events[0]
 
-    def test__adds_log_system_and_system_to_event(self):
+    def test__adds_system_to_event(self):
         self.assertThat(
             # This is a `twisted.logger` event, not legacy, and requires
             # values for `log_time` and `log_level` at a minimum.
@@ -80,10 +155,10 @@
                 "log_time": sentinel.log_time,
                 "log_level": self.log_level,
             }),
-            ContainsDictByEquality({
-                "log_system": "-#" + self.log_level.name,
-                "system": "-#" + self.log_level.name,
-            }),
+            MatchesAll(
+                Not(Contains("log_system")),
+                ContainsDictByEquality({"system": "-"}),
+            ),
         )
 
     def test__adds_log_system_and_system_to_event_with_namespace(self):
@@ -95,12 +170,12 @@
                 "log_namespace": log_namespace,
             }),
             ContainsDictByEquality({
-                "log_system": log_namespace + "#" + self.log_level.name,
-                "system": log_namespace + "#" + self.log_level.name,
+                "log_system": log_namespace,
+                "system": log_namespace,
             }),
         )
 
-    def test__adds_log_system_and_system_to_legacy_event(self):
+    def test__adds_system_to_legacy_event(self):
         self.assertThat(
             # This is a `twisted.python.log` event, i.e. legacy, and requires
             # values for `time` and `isError` at a minimum.
@@ -108,10 +183,10 @@
                 "time": sentinel.time,
                 "isError": factory.pick_bool(),
             }),
-            ContainsDictByEquality({
-                "log_system": "-#-",
-                "system": "-#-",
-            }),
+            MatchesAll(
+                Not(Contains("log_system")),
+                ContainsDictByEquality({"system": "-"}),
+            ),
         )
 
     def test__preserves_log_system_in_event(self):
@@ -146,77 +221,6 @@
             ),
         )
 
-    def test__namespace_and_level_is_printed_in_legacy_log(self):
-        # Restore existing observers at the end. This must be careful with
-        # ordering of clean-ups, hence the use of unittest.mock.patch.object
-        # as a context manager.
-        self.addCleanup(setLegacyObservers, log.theLogPublisher.observers)
-        # The global non-legacy `LogBeginner` emits critical messages straight
-        # to stderr, so temporarily put aside its observer to avoid seeing the
-        # critical log messages we're going to generate.
-        self.patch(logger.globalLogPublisher, "_observers", [])
-
-        logbuffer = io.StringIO()
-        observer = log.FileLogObserver(logbuffer)
-        observer.formatTime = lambda when: "<timestamp>"
-
-        oldlog = log.msg
-        # Deliberately use the default global observer in the new logger
-        # because we want to see how it behaves in a typical environment where
-        # logs are being emitted by the legacy logging infrastructure, for
-        # example running under `twistd`.
-        newlog = partial(logger.Logger().emit, self.log_level)
-
-        with patch.object(
-                log, "LegacyLogObserverWrapper",
-                log.LegacyLogObserverWrapper):
-            setLegacyObservers([observer.emit])
-            oldlog("Before (legacy)")
-            newlog("Before (new)")
-            LegacyLogObserverWrapper.install()
-            oldlog("After (legacy)")
-            newlog("After (new)")
-
-        self.assertThat(
-            logbuffer.getvalue(), DocTestMatches("""\
-                <timestamp> [-] Before (legacy)
-                <timestamp> [-] Before (new)
-                <timestamp> [-] After (legacy)
-                <timestamp> [%s#%s] After (new)
-                """ % (__name__, self.log_level.name)))
-
-    def test__namespace_and_level_are_elided_via_logfile_logerr(self):
-        # Restore existing observers at the end. This must be careful with
-        # ordering of clean-ups, hence the use of unittest.mock.patch.object
-        # as a context manager.
-        self.addCleanup(setLegacyObservers, log.theLogPublisher.observers)
-        # The global non-legacy `LogBeginner` emits critical messages straight
-        # to stderr, so temporarily put aside its observer to avoid seeing the
-        # critical log messages we're going to generate.
-        self.patch(logger.globalLogPublisher, "_observers", [])
-
-        logbuffer = io.StringIO()
-        observer = log.FileLogObserver(logbuffer)
-        observer.formatTime = lambda when: "<timestamp>"
-
-        with patch.object(
-                log, "LegacyLogObserverWrapper",
-                log.LegacyLogObserverWrapper):
-            setLegacyObservers([observer.emit])
-            log.logfile.write("Before (via log.logfile)\n")
-            log.logerr.write("Before (via log.logerr)\n")
-            LegacyLogObserverWrapper.install()
-            log.logfile.write("After (via log.logfile)\n")
-            log.logerr.write("After (via log.logerr)\n")
-
-        self.assertThat(
-            logbuffer.getvalue(), DocTestMatches("""\
-                <timestamp> [-] Before (via log.logfile)
-                <timestamp> [-] Before (via log.logerr)
-                <timestamp> [-#info] After (via log.logfile)
-                <timestamp> [-#error] After (via log.logerr)
-                """))
-
 
 class TestLegacyLogObserverWrapper_Installation(MAASTestCase):
     """Tests for `LegacyLogObserverWrapper`."""
 
=== modified file 'src/provisioningserver/logger/tests/test_logger.py'
--- src/provisioningserver/logger/tests/test_logger.py	2016-11-03 21:39:45 +0000
+++ src/provisioningserver/logger/tests/test_logger.py	2016-11-07 14:38:23 +0000
@@ -42,7 +42,7 @@
 
 # Matches lines like: 2016-10-18 14:23:55 [namespace#level] message
 find_log_lines_re = re.compile(
-    r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) [[](.*?)(?:#(.*))?[]] (.*)$",
+    r"^(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}) (.*?): [[](.*)[]] (.*)$",
    re.MULTILINE)
 
 
@@ -106,7 +106,7 @@
         (name, 'info', 'From `twisted.logger`.'),
         (name, 'warn', 'From `twisted.logger`.'),
         (name, 'error', 'From `twisted.logger`.'),
-        (name, '', 'From `twisted.python.log`.'),
+        (name, 'info', 'From `twisted.python.log`.'),
         (name, 'info', 'From `logging`.'),
         (name, 'warn', 'From `logging`.'),
         (name, 'error', 'From `logging`.'),
@@ -132,7 +132,7 @@
         (name, 'info', 'From `twisted.logger`.'),
         (name, 'warn', 'From `twisted.logger`.'),
         (name, 'error', 'From `twisted.logger`.'),
-        (name, '', 'From `twisted.python.log`.'),
+        (name, 'info', 'From `twisted.python.log`.'),
         (name, 'debug', 'From `logging`.'),
         (name, 'info', 'From `logging`.'),
         (name, 'warn', 'From `logging`.'),
@@ -195,7 +195,7 @@
         (name, 'info', 'From `twisted.logger`.'),
         (name, 'warn', 'From `twisted.logger`.'),
         (name, 'error', 'From `twisted.logger`.'),
-        (name, '', 'From `twisted.python.log`.'),
+        (name, 'info', 'From `twisted.python.log`.'),
         (name, 'info', 'From `logging`.'),
         (name, 'warn', 'From `logging`.'),
         (name, 'error', 'From `logging`.'),
@@ -224,7 +224,7 @@
         (name, 'info', 'From `twisted.logger`.'),
         (name, 'warn', 'From `twisted.logger`.'),
         (name, 'error', 'From `twisted.logger`.'),
-        (name, '', 'From `twisted.python.log`.'),
+        (name, 'info', 'From `twisted.python.log`.'),
         (name, 'debug', 'From `logging`.'),
         (name, 'info', 'From `logging`.'),
         (name, 'warn', 'From `logging`.'),
This monkey-patches the legacy log emitter to use DEFAULT_LOG_FORMAT, so future changes to the logging format will be easier.
Having said that, because we're in a transitional state in Twisted between the legacy and modern logging systems, this code does not have the intended effect at all in Twisted 16.4 (which is found in Yakkety).
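For reference, the new DEFAULT_LOG_FORMAT from the diff can be exercised by hand with ordinary %-style interpolation, the same mechanism both `logging` and the patched legacy observer use. The field values below (`maas.service_monitor`, the message text) are made up for illustration.

```python
# The normalised format string introduced by this branch, applied to a
# hypothetical log record to show the resulting line shape.
DEFAULT_LOG_FORMAT = "%(asctime)s %(name)s: [%(levelname)s] %(message)s"

line = DEFAULT_LOG_FORMAT % {
    "asctime": "2016-11-07 14:38:23",
    "name": "maas.service_monitor",
    "levelname": "info",
    "message": "Service monitor started.",
}
print(line)
# → 2016-11-07 14:38:23 maas.service_monitor: [info] Service monitor started.
```

This is the line shape that regiond.log, rackd.log, and maas.log are being normalised towards.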