Merge lp:~allenap/maas/cluster-worker-logging into lp:~maas-committers/maas/trunk

Proposed by Gavin Panella
Status: Merged
Approved by: Gavin Panella
Approved revision: no longer in the source branch.
Merged at revision: 1293
Proposed branch: lp:~allenap/maas/cluster-worker-logging
Merge into: lp:~maas-committers/maas/trunk
Diff against target: 228 lines (+32/-55)
9 files modified
etc/celeryconfig_common.py (+4/-3)
etc/democeleryconfig.py (+1/-1)
etc/democeleryconfig_cluster.py (+2/-2)
src/provisioningserver/boot_images.py (+4/-1)
src/provisioningserver/dhcp/leases.py (+5/-1)
src/provisioningserver/logging.py (+0/-22)
src/provisioningserver/start_cluster_controller.py (+9/-20)
src/provisioningserver/tags.py (+6/-5)
src/provisioningserver/tests/test_start_cluster_controller.py (+1/-0)
To merge this branch: bzr merge lp:~allenap/maas/cluster-worker-logging
Reviewer Review Type Date Requested Status
John A Meinel (community) Approve
Raphaël Badin (community) Approve
Review via email: mp+130576@code.launchpad.net

Commit message

Capture log messages emitted during cluster worker start-up.

Previously these messages were being lost, so debugging the start-up of a cluster worker was difficult.

Revision history for this message
Raphaël Badin (rvb) wrote :

Looks good.

[0]

101 + setup_logging_subsystem(loglevel="INFO", logfile=get_maas_celery_log())

I think you should make the log level a configuration option in the celery config file.

review: Approve
Revision history for this message
Gavin Panella (allenap) wrote :

> Looks good.
>
> [0]
>
> 101     +    setup_logging_subsystem(loglevel="INFO",
> logfile=get_maas_celery_log())
>
> I think you should make the log level a configuration option in the celery
> config file.

Yeah, I've discovered that this is already supported. I can remove the
arguments to setup_logging_subsystem and use CELERYD_LOG_LEVEL and
CELERYD_LOG_FILE instead. The latter means I can also get rid of
MAAS_CELERY_LOG.

Likewise, MAAS_CLUSTER_CELERY_DB can also be replaced with
CELERYBEAT_SCHEDULE_FILENAME.
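
Roughly, the common celeryconfig then boils down to something like
this (a sketch matching the diff below; the values are the existing
defaults):

  # Logging, handled by Celery's own settings.
  CELERYD_LOG_FILE = '/var/log/maas/celery.log'
  CELERYD_LOG_LEVEL = 'INFO'

  # Location for the cluster worker schedule file.
  CELERYBEAT_SCHEDULE_FILENAME = '/var/lib/maas/celerybeat-cluster-schedule'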

Thanks for the review. I'll make these other changes next week before
I land this.

Revision history for this message
Gavin Panella (allenap) wrote :

Hi Raphael, or anyone else who reads this, please would you mind taking a look at rev 1294? Thanks!

Revision history for this message
John A Meinel (jameinel) wrote :

So I think the specific change of 1294 is reasonable, but does it actually handle the case of 'start a new cluster-controller, point it at the region controller, and get some error logging while it has not yet been accepted'?

I guess the biggest change is calling 'setup_logging_subsystem()' earlier, which should handle that.

The other bit is getting rid of logging.task_logger and instead using a separate task_logger for each module. Why is that actually necessary?

Breaking it down:
1) Using CELERYD_LOG_FILE et al: Definitely good
2) Calling setup_logging_subsystem() earlier: Definitely good
3) Using separate task_loggers for each module: What is the purpose behind this? If you have a reason, good enough.

review: Approve
Revision history for this message
Gavin Panella (allenap) wrote :

Thanks for the review!

> 3) Using separate task_loggers for each module: What is the purpose
> behind this? If you have a reason, good enough.

This started out as another approach to the problem. I realised it
wasn't the right approach, but decided to leave the change because I
think it adheres more closely to Celery's recommended practice (and,
indeed, standard Python use of the logging module).
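
Concretely, each task module now sets up its own logger, mirroring the
standard logging.getLogger(__name__) idiom (the task below is just a
hypothetical placeholder for illustration):

  from celery.log import get_task_logger

  # One logger per module, named after the module.
  task_logger = get_task_logger(name=__name__)

  # Hypothetical task body, for illustration only.
  def report_something():
      task_logger.info("Reporting...")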

Revision history for this message
Gavin Panella (allenap) wrote :

> So I think the specific change of 1294 is reasonable, but does it
> actually handle the case of 'start a new cluster-controller, point
> it at the region controller, and get some error logging while it
> has not yet been accepted'?

I forgot to answer this. Yes, it does. You can try it out with:

  CELERY_CONFIG_MODULE=democeleryconfig_cluster \
    bin/maas-provision start-cluster-controller http://localhost/

Preview Diff

1=== modified file 'etc/celeryconfig_common.py'
2--- etc/celeryconfig_common.py 2012-10-16 10:10:28 +0000
3+++ etc/celeryconfig_common.py 2012-10-23 12:54:22 +0000
4@@ -46,11 +46,12 @@
5 # this setting.
6 BROKER_URL = 'amqp://guest:guest@localhost:5672//'
7
8-# Location for log file.
9-MAAS_CELERY_LOG = '/var/log/maas/celery.log'
10+# Logging.
11+CELERYD_LOG_FILE = '/var/log/maas/celery.log'
12+CELERYD_LOG_LEVEL = 'INFO'
13
14 # Location for the cluster worker schedule file.
15-MAAS_CLUSTER_CELERY_DB = '/var/lib/maas/celerybeat-cluster-schedule'
16+CELERYBEAT_SCHEDULE_FILENAME = '/var/lib/maas/celerybeat-cluster-schedule'
17
18 WORKER_QUEUE_DNS = 'celery'
19 WORKER_QUEUE_BOOT_IMAGES = 'celery'
20
21=== modified file 'etc/democeleryconfig.py'
22--- etc/democeleryconfig.py 2012-10-03 14:18:17 +0000
23+++ etc/democeleryconfig.py 2012-10-23 12:54:22 +0000
24@@ -26,5 +26,5 @@
25
26 import_settings(democeleryconfig_common)
27
28-MAAS_CELERY_LOG = os.path.join(
29+CELERYD_LOG_FILE = os.path.join(
30 DEV_ROOT_DIRECTORY, 'logs/region-worker/current')
31
32=== modified file 'etc/democeleryconfig_cluster.py'
33--- etc/democeleryconfig_cluster.py 2012-10-11 12:42:06 +0000
34+++ etc/democeleryconfig_cluster.py 2012-10-23 12:54:22 +0000
35@@ -30,8 +30,8 @@
36 # maas_local_celeryconfig.
37 CLUSTER_UUID = "adfd3977-f251-4f2c-8d61-745dbd690bfc"
38
39-MAAS_CELERY_LOG = os.path.join(
40+CELERYD_LOG_FILE = os.path.join(
41 DEV_ROOT_DIRECTORY, 'logs/cluster-worker/current')
42
43-MAAS_CLUSTER_CELERY_DB = os.path.join(
44+CELERYBEAT_SCHEDULE_FILENAME = os.path.join(
45 DEV_ROOT_DIRECTORY, 'run/celerybeat-cluster-schedule')
46
47=== modified file 'src/provisioningserver/boot_images.py'
48--- src/provisioningserver/boot_images.py 2012-09-14 11:04:33 +0000
49+++ src/provisioningserver/boot_images.py 2012-10-23 12:54:22 +0000
50@@ -25,15 +25,18 @@
51 MAASDispatcher,
52 MAASOAuth,
53 )
54+from celery.log import get_task_logger
55 from provisioningserver.auth import (
56 get_recorded_api_credentials,
57 get_recorded_maas_url,
58 )
59 from provisioningserver.config import Config
60-from provisioningserver.logging import task_logger
61 from provisioningserver.pxe import tftppath
62
63
64+task_logger = get_task_logger(name=__name__)
65+
66+
67 def get_cached_knowledge():
68 """Return cached items required to report to the server.
69
70
71=== modified file 'src/provisioningserver/dhcp/leases.py'
72--- src/provisioningserver/dhcp/leases.py 2012-10-04 12:24:31 +0000
73+++ src/provisioningserver/dhcp/leases.py 2012-10-23 12:54:22 +0000
74@@ -45,6 +45,7 @@
75 MAASOAuth,
76 )
77 from celery.app import app_or_default
78+from celery.log import get_task_logger
79 from provisioningserver import cache
80 from provisioningserver.auth import (
81 get_recorded_api_credentials,
82@@ -52,7 +53,10 @@
83 get_recorded_nodegroup_uuid,
84 )
85 from provisioningserver.dhcp.leases_parser import parse_leases
86-from provisioningserver.logging import task_logger
87+
88+
89+task_logger = get_task_logger(name=__name__)
90+
91
92 # Cache key for the modification time on last-processed leases file.
93 LEASES_TIME_CACHE_KEY = 'leases_time'
94
95=== removed file 'src/provisioningserver/logging.py'
96--- src/provisioningserver/logging.py 2012-08-10 14:15:23 +0000
97+++ src/provisioningserver/logging.py 1970-01-01 00:00:00 +0000
98@@ -1,22 +0,0 @@
99-# Copyright 2012 Canonical Ltd. This software is licensed under the
100-# GNU Affero General Public License version 3 (see the file LICENSE).
101-
102-"""Celery logging."""
103-
104-from __future__ import (
105- absolute_import,
106- print_function,
107- unicode_literals,
108- )
109-
110-__metaclass__ = type
111-__all__ = [
112- 'task_logger',
113- ]
114-
115-
116-from celery.log import get_task_logger
117-
118-# Celery task logger. Shared between tasks, as per Celery's recommended
119-# practice.
120-task_logger = get_task_logger()
121
122=== modified file 'src/provisioningserver/start_cluster_controller.py'
123--- src/provisioningserver/start_cluster_controller.py 2012-10-03 18:16:22 +0000
124+++ src/provisioningserver/start_cluster_controller.py 2012-10-23 12:54:22 +0000
125@@ -32,10 +32,16 @@
126 NoAuth,
127 )
128 from celery.app import app_or_default
129-from provisioningserver.logging import task_logger
130+from celery.log import (
131+ get_task_logger,
132+ setup_logging_subsystem,
133+ )
134 from provisioningserver.network import discover_networks
135
136
137+task_logger = get_task_logger(name=__name__)
138+
139+
140 class ClusterControllerRejected(Exception):
141 """Request to become a cluster controller has been rejected."""
142
143@@ -68,16 +74,6 @@
144 return app_or_default().conf.CLUSTER_UUID
145
146
147-def get_maas_celery_log():
148- """Read location for MAAS Celery log file from the config."""
149- return app_or_default().conf.MAAS_CELERY_LOG
150-
151-
152-def get_maas_celerybeat_db():
153- """Read location for MAAS Celery schedule file from the config."""
154- return app_or_default().conf.MAAS_CLUSTER_CELERY_DB
155-
156-
157 def register(server_url):
158 """Request Rabbit connection details from the domain controller.
159
160@@ -134,15 +130,7 @@
161
162 # Copy environment, but also tell celeryd what broker to listen to.
163 env = dict(os.environ, CELERY_BROKER_URL=broker_url)
164-
165- command = [
166- 'celeryd',
167- '--logfile=%s' % get_maas_celery_log(),
168- '--schedule=%s' % get_maas_celerybeat_db(),
169- '--loglevel=INFO',
170- '--beat',
171- '-Q', get_cluster_uuid(),
172- ]
173+ command = 'celeryd', '--beat', '--queues', get_cluster_uuid()
174
175 # Change gid first, just in case changing the uid might deprive
176 # us of the privileges required to setgid.
177@@ -182,6 +170,7 @@
178 If this system is still awaiting approval as a cluster controller, this
179 command will keep looping until it gets a definite answer.
180 """
181+ setup_logging_subsystem()
182 connection_details = register(args.server_url)
183 while connection_details is None:
184 sleep(60)
185
186=== modified file 'src/provisioningserver/tags.py'
187--- src/provisioningserver/tags.py 2012-10-11 10:41:00 +0000
188+++ src/provisioningserver/tags.py 2012-10-23 12:54:22 +0000
189@@ -19,22 +19,23 @@
190
191
192 import httplib
193-import simplejson as json
194-from lxml import etree
195
196 from apiclient.maas_client import (
197 MAASClient,
198 MAASDispatcher,
199 MAASOAuth,
200 )
201-
202+from celery.log import get_task_logger
203+from lxml import etree
204 from provisioningserver.auth import (
205 get_recorded_api_credentials,
206 get_recorded_maas_url,
207 get_recorded_nodegroup_uuid,
208 )
209-
210-from provisioningserver.logging import task_logger
211+import simplejson as json
212+
213+
214+task_logger = get_task_logger(name=__name__)
215
216
217 class MissingCredentials(Exception):
218
219=== modified file 'src/provisioningserver/tests/test_start_cluster_controller.py'
220--- src/provisioningserver/tests/test_start_cluster_controller.py 2012-10-05 17:29:05 +0000
221+++ src/provisioningserver/tests/test_start_cluster_controller.py 2012-10-23 12:54:22 +0000
222@@ -93,6 +93,7 @@
223 self.patch(os, 'setuid')
224 self.patch(os, 'setgid')
225 self.patch(os, 'execvpe').side_effect = Executing()
226+ self.patch(start_cluster_controller, 'setup_logging_subsystem')
227 get_uuid = self.patch(start_cluster_controller, 'get_cluster_uuid')
228 get_uuid.return_value = factory.getRandomUUID()
229