Merge lp:~javier.collado/utah/bug1095669-2 into lp:utah

Proposed by Javier Collado
Status: Merged
Approved by: Max Brustkern
Approved revision: 814
Merged at revision: 803
Proposed branch: lp:~javier.collado/utah/bug1095669-2
Merge into: lp:utah
Diff against target: 600 lines (+161/-79) (has conflicts)
12 files modified
examples/run_install_test.py (+3/-0)
examples/run_test_bamboo_feeder.py (+5/-1)
examples/run_test_cobbler.py (+5/-1)
examples/run_test_vm.py (+3/-1)
examples/run_utah_tests.py (+3/-0)
utah/config.py (+5/-0)
utah/provisioning/baremetal/inventory.py (+4/-5)
utah/provisioning/inventory/sqlite.py (+32/-0)
utah/provisioning/provisioning.py (+1/-29)
utah/provisioning/ssh.py (+27/-26)
utah/provisioning/vm/vm.py (+5/-5)
utah/run.py (+68/-11)
Text conflict in utah/provisioning/inventory/sqlite.py
To merge this branch: bzr merge lp:~javier.collado/utah/bug1095669-2
Reviewer Review Type Date Requested Status
Max Brustkern (community) Approve
Review via email: mp+144365@code.launchpad.net

Description of the change

This branch adds a wrapper method that logs all the SQL statements executed
during the provisioning step.
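
For illustration, here is a minimal sketch of that wrapper approach (the class
and names are hypothetical; the real implementation is the execute() method
added to utah/provisioning/inventory/sqlite.py in the diff below):

    import logging
    import sqlite3


    class LoggingInventory(object):
        """Toy stand-in for the SQLite inventory that logs every statement."""

        def __init__(self, db_path=':memory:'):
            self._connection = sqlite3.connect(db_path)

        def execute(self, sql, parameters=None):
            # Single choke point: every statement is logged before it reaches
            # the database, so the debug log shows exactly what was run.
            if parameters is None:
                parameters = []
            logging.debug('Executing SQL statement: %s, %s', sql, parameters)
            return self._connection.execute(sql, parameters)


    # Callers switch from self.connection.execute(...) to self.execute(...)
    inventory = LoggingInventory()
    inventory.execute('CREATE TABLE machines(machineid INTEGER PRIMARY KEY, '
                      'state TEXT)')
    inventory.execute("INSERT INTO machines (state) VALUES (?)", ['available'])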

In addition to this, since the inventory is created before the machine object,
there are a lot of changes regarding logging:
- root logger object is configured in all run_* scripts
- machine logger propagates messages and doesn't contain any handler by itself
- logging messages outside the machine object now use the root logger
- the log format has been updated to include the python module name
- paramiko logs are only written to debug log file regardless of their level

The big change is that, since the root logger is configured using pretty much
the same code that was used for the machine logger, it's now possible to use
logging anywhere, not only in the machine object. This should be useful for
adding logging messages that previously were not printed at all because the
root logger had no handlers.

lp:~javier.collado/utah/bug1095669-2 updated
810. By Javier Collado

Added separate file to log ssh related messages

The new handler writes the whole set of ssh logs, including the ones from
paramiko, to a separate file. Note that while the paramiko logger doesn't
propagate, the ssh one does, so the ssh related messages that were written to
the log files before this change will still be written there.
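
A minimal sketch of that handler layout (the log file name is illustrative;
the actual setup is the configure_logging() function added to utah/run.py in
the diff below):

    import logging

    # Dedicated handler for everything ssh related, including paramiko.
    ssh_file_handler = logging.FileHandler('example-ssh.log')

    # The 'ssh' logger keeps propagating, so its records still reach the
    # root logger's handlers (console, main log, debug log) as before.
    ssh_logger = logging.getLogger('ssh')
    ssh_logger.addHandler(ssh_file_handler)

    # The 'paramiko' logger doesn't propagate, so its records end up only
    # in the dedicated ssh log file.
    paramiko_logger = logging.getLogger('paramiko')
    paramiko_logger.propagate = False
    paramiko_logger.addHandler(ssh_file_handler)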

811. By Javier Collado

Removed module name from logs

812. By Javier Collado

Fixed activecheck shouldn't use ssh logger

813. By Javier Collado

Set console log level back to WARNING

814. By Javier Collado

Set console log level to debug if -d/--debug option is passed

Revision history for this message
Javier Collado (javier.collado) wrote :

I've added a few changes to the logging part that don't really change anything
with regard to sql.

The biggest change is that the ssh log is now written to a separate file in
case detailed information is needed to troubleshoot a problem. In particular,
the logs from the paramiko library are also included in that file.

Aside from this, I think that the -d/--debug flag in the command line is kind
of confusing. In this branch, the console and log file handlers use the levels
set in the configuration file (the handler for the debug file is always set to
debug). The -d/--debug flag is used just to set the console logging to debug,
but not the file log. I'm not sure if this is the intended behavior, but I
don't think setting the file log to debug is a great benefit if there's
already a debug file. Maybe the -d/--debug flag should be removed, since it's
not very likely that we'd troubleshoot something looking just at the console
instead of checking the debug file. Please let me know what you think.

Revision history for this message
Max Brustkern (nuclearbob) wrote :

When -d was originally implemented, we didn't have a separate debug log file. I think enabling or disabling debug affects some code paths for cleanup routines as well. I think removing that option in favor of newer ones would be good. We can have explicit options for cleaning things up or not, and leave the log level stuff to the configuration files. I think that would be worth doing as a separate merge, since it'll affect some cleanup routines and need more testing.

Revision history for this message
Max Brustkern (nuclearbob) wrote :

Lines 214, 219, and 247 of the merge look like leftover bzr merge lines. Those are all in utah/provisioning/inventory/sqlite.py. I wonder if lines 215-218 are still needed as well, as they would seem to be superseded by the lines after that.

Other than that, it looks like it should improve things a lot. If we do decide to start using rotating file handlers instead of external log rotation, this will make that a lot easier as well.
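
As a rough illustration of that idea (not part of this branch; the file name
and sizes are made up), switching would mostly mean swapping the handler class
used in configure_logging():

    import logging
    import logging.handlers

    # Rotate the log in-process instead of relying on external log rotation.
    file_handler = logging.handlers.RotatingFileHandler(
        'utah-example.log',         # illustrative; the real path comes from config
        maxBytes=10 * 1024 * 1024,  # start a new file after ~10 MB
        backupCount=5)              # keep five rotated files around
    logging.getLogger().addHandler(file_handler)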

review: Approve
Revision history for this message
Max Brustkern (nuclearbob) wrote :

I'm not sure if I should have marked approve or not. I feel like the actual changes are good, and I'm confident you'll fix the typos, but I guess technically I wouldn't merge it in the current state. I think anyone who would merge it will be reading enough to know to fix those things and retest.

Revision history for this message
Javier Collado (javier.collado) wrote :

I think it's ok to mark the merge request as approved for now because we don't
have any auto-merging mechanism in place.

However, once tarmac is finally set up, an attempt to merge the code will be
made in that situation, so we should avoid approving until the changes are
completely fine. Anyway, in this particular case, the merge would fail because
of the conflicts that you saw in the diff.

Revision history for this message
Javier Collado (javier.collado) wrote :

The merge conflict is because both lp:bug1095669 and lp:bug1095669-2 were
branched from the same dev branch revision and the latter is already merged.
Anyway, that's not a problem; I'll fix the conflict in the merge itself.

Preview Diff

=== modified file 'examples/run_install_test.py'
--- examples/run_install_test.py 2012-12-19 15:41:25 +0000
+++ examples/run_install_test.py 2013-01-23 12:29:24 +0000
@@ -27,6 +27,7 @@
     file_arguments,
     run_tests,
     virtual_arguments,
+    configure_logging,
 )
 
 
@@ -60,6 +61,8 @@
     exitstatus = 0
     machine = None
 
+    configure_logging(args.debug)
+
     try:
         inventory = TinySQLiteInventory()
         machine = inventory.request(
 
=== modified file 'examples/run_test_bamboo_feeder.py'
--- examples/run_test_bamboo_feeder.py 2012-12-19 15:41:25 +0000
+++ examples/run_test_bamboo_feeder.py 2013-01-23 12:29:24 +0000
@@ -18,6 +18,7 @@
 import argparse
 import os
 import sys
+import logging
 
 from utah.exceptions import UTAHException
 from utah.group import check_user_group, print_group_error_message
@@ -29,6 +30,7 @@
     custom_arguments,
     name_argument,
     run_tests,
+    configure_logging,
 )
 
 
@@ -61,6 +63,8 @@
     exitstatus = 0
     machine = None
 
+    configure_logging(args.debug)
+
     try:
         inventory = ManualBaremetalSQLiteInventory(
             db=os.path.join('~', '.utah-bamboofeeder-inventory'),
@@ -79,7 +83,7 @@
     except UTAHException as error:
         mesg = 'Exception: ' + str(error)
         try:
-            machine.logger.error(mesg)
+            logging.error(mesg)
         except (AttributeError, NameError):
             sys.stderr.write(mesg)
         exitstatus = 2
 
=== modified file 'examples/run_test_cobbler.py'
--- examples/run_test_cobbler.py 2012-12-19 15:41:25 +0000
+++ examples/run_test_cobbler.py 2013-01-23 12:29:24 +0000
@@ -17,6 +17,7 @@
 
 import argparse
 import sys
+import logging
 
 from utah.exceptions import UTAHException
 from utah.group import check_user_group, print_group_error_message
@@ -28,6 +29,7 @@
     custom_arguments,
     name_argument,
     run_tests,
+    configure_logging,
 )
 
 
@@ -60,6 +62,8 @@
     exitstatus = 0
     machine = None
 
+    configure_logging(args.debug)
+
     try:
         inventory = ManualBaremetalSQLiteInventory()
         kw = {}
@@ -84,7 +88,7 @@
     except UTAHException as error:
         mesg = 'Exception: ' + str(error)
         try:
-            machine.logger.error(mesg)
+            logging.error(mesg)
         except (AttributeError, NameError):
             sys.stderr.write(mesg)
         exitstatus = 2
 
=== modified file 'examples/run_test_vm.py'
--- examples/run_test_vm.py 2012-12-19 15:41:25 +0000
+++ examples/run_test_vm.py 2013-01-23 12:29:24 +0000
@@ -21,7 +21,7 @@
 from utah.exceptions import UTAHException
 from utah.group import check_user_group, print_group_error_message
 from utah.provisioning.vm.vm import TinySQLiteInventory
-from utah.run import common_arguments, run_tests
+from utah.run import common_arguments, run_tests, configure_logging
 
 
 def get_parser():
@@ -50,6 +50,8 @@
     exitstatus = 0
     machine = None
 
+    configure_logging(args.debug)
+
     try:
         inventory = TinySQLiteInventory()
         kw = {}
 
=== modified file 'examples/run_utah_tests.py'
--- examples/run_utah_tests.py 2012-12-19 15:41:25 +0000
+++ examples/run_utah_tests.py 2013-01-23 12:29:24 +0000
@@ -26,6 +26,7 @@
     file_arguments,
     name_argument,
     virtual_arguments,
+    configure_logging
 )
 from utah.timeout import timeout, UTAHTimeout
 from run_install_test import run_install_test
@@ -70,6 +71,8 @@
     else:
         machinetype = args.machinetype
 
+    configure_logging(args.debug)
+
     # Default is now CustomVM
     function = run_install_test
 
 
=== modified file 'utah/config.py'
--- utah/config.py 2013-01-22 10:35:02 +0000
+++ utah/config.py 2013-01-23 12:29:24 +0000
@@ -171,6 +171,11 @@
     DEFAULTS['logpath'],
     LOCALDEFAULTS['hostname'] + '-debug.log')
 
+# Log file used to log SSH related information
+LOCALDEFAULTS['ssh_logfile'] = \
+    os.path.join(DEFAULTS['logpath'],
+                 '{}-ssh.log'.format(LOCALDEFAULTS['hostname']))
+
 # This depends on a binary that may not be installed.
 # Also, the null thing may seem obvious, but I found it here:
 # http://stackoverflow.com/questions/699325/
 
=== modified file 'utah/provisioning/baremetal/inventory.py'
--- utah/provisioning/baremetal/inventory.py 2012-12-13 09:07:58 +0000
+++ utah/provisioning/baremetal/inventory.py 2013-01-23 12:29:24 +0000
@@ -41,8 +41,7 @@
         super(ManualBaremetalSQLiteInventory, self).__init__(*args, db=db,
                                                              lockfile=lockfile,
                                                              **kw)
-        machines_count = (self.connection
-                          .execute('SELECT COUNT(*) FROM machines')
+        machines_count = (self.execute('SELECT COUNT(*) FROM machines')
                           .fetchall()[0][0])
         if machines_count == 0:
             raise UTAHProvisioningInventoryException('No machines in database')
@@ -54,7 +53,7 @@
         if name is not None:
             query += ' WHERE name=?'
             queryvars.append(name)
-        result = self.connection.execute(query, queryvars).fetchall()
+        result = self.execute(query, queryvars).fetchall()
         if result is None:
             raise UTAHProvisioningInventoryException(
                 'No machines meet criteria')
@@ -84,7 +83,7 @@
         name = machineinfo.pop('name')
         state = machineinfo.pop('state')
         machineinfo.pop('pid')
-        update = self.connection.execute(
+        update = self.execute(
             "UPDATE machines SET pid=?, state='provisioned' WHERE machineid=?"
             "AND state=?",
             [os.getpid(), machineid, state]).rowcount
@@ -115,7 +114,7 @@
                 'name required to release a machine')
         query = "UPDATE machines SET state='available' WHERE name=?"
         queryvars = [name]
-        update = self.connection.execute(query, queryvars).rowcount
+        update = self.execute(query, queryvars).rowcount
         if update == 1:
             if machine is not None:
                 if machine in self.machines:
 
=== modified file 'utah/provisioning/inventory/sqlite.py'
--- utah/provisioning/inventory/sqlite.py 2013-01-22 10:35:02 +0000
+++ utah/provisioning/inventory/sqlite.py 2013-01-23 12:29:24 +0000
@@ -17,6 +17,8 @@
 
 import sqlite3
 import os
+import logging
+
 from utah.provisioning.inventory.inventory import Inventory
 from utah import config
 
@@ -30,10 +32,40 @@
         db = os.path.expanduser(db)
         super(SQLiteInventory, self).__init__(*args, uniqueid=db, **kw)
         self.db = db
+<<<<<<< TREE
         self.connection = sqlite3.connect(self.db,
                                           config.sqlite3_db_connection_timeout)
         self.connection.isolation_level = None
         self.connection.row_factory = sqlite3.Row
+=======
+        self._connection = sqlite3.connect(self.db)
+        self._connection.isolation_level = None
+        self._connection.row_factory = sqlite3.Row
+
+    def execute(self, sql, parameters=None):
+        """Execute sql statement and return cursor
+
+        This method is expected to be used as a wrapper around all
+        `connection.execute` calls so that all the SQL statements are logged in
+        case this information is needed to troubleshoot problems.
+
+        :param sql: A sql statement to be executed
+        :type sql: str
+        :param parameters:
+            Parameters to use to replace the placeholder
+            values in the sql statement.
+        :type parameters:
+            list (question mark placeholders) |
+            dict (colon named placeholders)
+        """
+        if parameters is None:
+            parameters = []
+
+        logging.debug('Executing SQL statement: {}, {}'
+                      .format(sql, parameters))
+        cursor = self._connection.execute(sql, parameters)
+        return cursor
+>>>>>>> MERGE-SOURCE
 
     def delete(self):
         os.unlink(self.db)
 
=== modified file 'utah/provisioning/provisioning.py'
--- utah/provisioning/provisioning.py 2013-01-16 14:08:17 +0000
+++ utah/provisioning/provisioning.py 2013-01-23 12:29:24 +0000
@@ -265,33 +265,7 @@
         Initialize the logging for the machine.
         Subclasses can override or supplement to customize logging.
         """
-        self.logger = logging.getLogger('utah.provisioning.' + str(self.name))
-        self.logger.propagate = False
-        self.logger.setLevel(logging.INFO)
-        for handler in list(self.logger.handlers):
-            self.logger.removeHandler(handler)
-            del handler
-        self.consolehandler = logging.StreamHandler(stream=sys.stderr)
-        log_format = logging.Formatter('%(asctime)s ' + self.name
-                                       + ' %(levelname)s: %(message)s')
-        console_formatter = log_format
-        self.consolehandler.setFormatter(console_formatter)
-        self.consolehandler.setLevel(config.consoleloglevel)
-        self.logger.addHandler(self.consolehandler)
-        self.filehandler = logging.handlers.WatchedFileHandler(config.logfile)
-        file_formatter = log_format
-        self.filehandler.setFormatter(file_formatter)
-        self.filehandler.setLevel(config.fileloglevel)
-        self.logger.addHandler(self.filehandler)
-        if config.debuglog is not None:
-            self.logger.setLevel(logging.DEBUG)
-            self.debughandler = (
-                logging.handlers.WatchedFileHandler(config.debuglog))
-            self.debughandler.setFormatter(file_formatter)
-            self.debughandler.setLevel(logging.DEBUG)
-            self.logger.addHandler(self.debughandler)
-        if self.debug:
-            self.logger.setLevel(logging.DEBUG)
+        self.logger = logging.getLogger(self.name)
 
     def _loggerunsetup(self):
         """
@@ -299,8 +273,6 @@
         Principally used when a machine changes name after init.
         """
         del self.logger
-        del self.consolehandler
-        del self.filehandler
 
     def provisioncheck(self):
         """
 
=== modified file 'utah/provisioning/ssh.py'
--- utah/provisioning/ssh.py 2012-12-12 18:56:29 +0000
+++ utah/provisioning/ssh.py 2013-01-23 12:29:24 +0000
@@ -43,6 +43,7 @@
         ssh_client = paramiko.SSHClient()
         ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
         self.ssh_client = ssh_client
+        self.ssh_logger = logging.getLogger('ssh')
 
     def run(self, command, _quiet=None, root=False, timeout=None):
         """
@@ -64,15 +65,15 @@
         # With additions from here:
         # http://od-eon.com/blogs/
         # stefan/automating-remote-commands-over-ssh-paramiko/
-        self.logger.debug('Connecting SSH')
+        self.ssh_logger.debug('Connecting SSH')
         self.ssh_client.connect(self.name,
                                 username=user,
                                 key_filename=config.sshprivatekey)
 
-        self.logger.debug('Opening SSH session')
+        self.ssh_logger.debug('Opening SSH session')
         channel = self.ssh_client.get_transport().open_session()
 
-        self.logger.info('Running command through SSH: ' + commandstring)
+        self.ssh_logger.info('Running command through SSH: ' + commandstring)
         stdout = channel.makefile('rb')
         stderr = channel.makefile_stderr('rb')
         if timeout is None:
@@ -81,22 +82,22 @@
             utah.timeout.timeout(timeout, channel.exec_command, commandstring)
         retval = channel.recv_exit_status()
 
-        self.logger.debug('Closing SSH connection')
+        self.ssh_logger.debug('Closing SSH connection')
         self.ssh_client.close()
 
         log_level = logging.DEBUG if retval == 0 else logging.WARNING
         log_message = 'Return code: {}'.format(retval)
-        self.logger.log(log_level, log_message)
+        self.ssh_logger.log(log_level, log_message)
 
-        self.logger.debug('Standard output follows:')
+        self.ssh_logger.debug('Standard output follows:')
         stdout_lines = stdout.readlines()
         for line in stdout_lines:
-            self.logger.debug(line.strip())
+            self.ssh_logger.debug(line.strip())
 
-        self.logger.debug('Standard error follows:')
+        self.ssh_logger.debug('Standard error follows:')
         stderr_lines = stderr.readlines()
         for line in stderr_lines:
-            self.logger.debug(line.strip())
+            self.ssh_logger.debug(line.strip())
 
         return retval, ''.join(stdout_lines), ''.join(stderr_lines)
 
@@ -116,9 +117,9 @@
         try:
             for localpath in files:
                 if os.path.isfile(localpath):
-                    self.logger.info('Uploading ' + localpath
-                                     + ' from the host to ' + target
-                                     + ' on the machine')
+                    self.ssh_logger.info('Uploading {} from the host '
+                                         'to {} on the machine'
+                                         .format(localpath, target))
                     remotepath = os.path.join(target,
                                               os.path.basename(localpath))
                     sftp_client.put(localpath, remotepath)
@@ -155,9 +156,9 @@
         try:
             for remotepath in files:
                 localpath = get_localpath(remotepath)
-                self.logger.info('Downloading ' + remotepath
-                                 + ' from the machine to ' + target
-                                 + ' on the host')
+                self.ssh_logger.info('Downloading {} from the machine '
+                                     'to {} on the host'
+                                     .format(remotepath, target))
                 sftp_client.get(remotepath, localpath)
         finally:
             sftp_client.close()
@@ -179,9 +180,9 @@
         for myfile in files:
             newtarget = os.path.join(target, os.path.basename(myfile))
             if S_ISDIR(sftp_client.stat(myfile).st_mode):
-                self.logger.debug(myfile + ' is a directory, recursing')
+                self.ssh_logger.debug(myfile + ' is a directory, recursing')
                 if not os.path.isdir(newtarget):
-                    self.logger.debug('Attempting to create ' + newtarget)
+                    self.ssh_logger.debug('Attempting to create ' + newtarget)
                     os.makedirs(newtarget)
                 myfiles = [os.path.join(myfile, x)
                            for x in sftp_client.listdir(myfile)]
@@ -194,16 +195,16 @@
         Clean up known hosts for machine.
         """
         # TODO: evaluate value of known_hosts with paramiko
-        self.logger.info('Removing machine addresses '
-                         'from ssh known_hosts file')
+        self.ssh_logger.info('Removing machine addresses '
+                             'from ssh known_hosts file')
         addresses = [self.name]
         try:
             addresses.append(socket.gethostbyname(self.name))
         except socket.gaierror as err:
             if err.errno in [-2, -5]:
-                self.logger.debug(self.name
-                                  + ' is not resolvable, '
-                                  + 'so not removing from known_hosts')
+                self.ssh_logger.debug('{} is not resolvable, '
+                                      'so not removing from known_hosts'
+                                      .format(self.name))
             else:
                 raise err
 
@@ -227,10 +228,10 @@
         Return a retryable exception if it is not.
         Intended for use with retry.
         """
-        self.logger.info('Sleeping {timeout} seconds'
-                         .format(timeout=timeout))
+        self.ssh_logger.info('Sleeping {timeout} seconds'
+                             .format(timeout=timeout))
         time.sleep(timeout)
-        self.logger.info('Checking for ssh availability')
+        self.ssh_logger.info('Checking for ssh availability')
         try:
             self.ssh_client.connect(self.name,
                                     username=config.user,
@@ -246,7 +247,7 @@
         if timeout is None:
             timeout = self.boottimeout
         if logmethod is None:
-            logmethod = self.logger.debug
+            logmethod = self.ssh_logger.debug
         utah.timeout.timeout(timeout, retry, self.sshcheck, checktimeout,
                              logmethod=logmethod)
 
 
=== modified file 'utah/provisioning/vm/vm.py'
--- utah/provisioning/vm/vm.py 2012-12-17 19:55:15 +0000
+++ utah/provisioning/vm/vm.py 2013-01-23 12:29:24 +0000
@@ -589,7 +589,7 @@
         Initialize simple database.
         """
         super(TinySQLiteInventory, self).__init__(*args, **kw)
-        self.connection.execute(
+        self.execute(
             'CREATE TABLE IF NOT EXISTS '
             'machines(machineid INTEGER PRIMARY KEY, state TEXT)')
 
@@ -599,8 +599,8 @@
         machineid along with all other arguments to that class's constructor,
         returning the resulting object.
         """
-        cursor = self.connection.cursor()
-        cursor.execute("INSERT INTO machines (state) VALUES ('provisioned')")
+        cursor = self.execute("INSERT INTO machines (state) "
+                              "VALUES ('provisioned')")
         machineid = cursor.lastrowid
         return machinetype(machineid=machineid, *args, **kw)
 
@@ -608,7 +608,7 @@
         """
         Updates the database to indicate the machine is available.
         """
-        if self.connection.execute(
+        if self.execute(
                 "UPDATE machines SET state='available' WHERE machineid=?",
                 [machineid]):
             return True
@@ -620,7 +620,7 @@
         Updates the database to indicate the machine is destroyed, but does not
         destroy the machine.
         """
-        if self.connection.execute(
+        if self.execute(
                 "UPDATE machines SET state='destroyed' ""WHERE machineid=?",
                 [machineid]):
             return True
 
=== modified file 'utah/run.py'
--- utah/run.py 2013-01-16 11:05:00 +0000
+++ utah/run.py 2013-01-23 12:29:24 +0000
@@ -134,7 +134,7 @@
         utah_command = 'utah {} {}'.format(extraopts, options)
         exitstatus, _stdout, _stderr = machine.run(utah_command, root=True)
     except UTAHException as error:
-        machine.logger.error('Failed to run test: ' + str(error))
+        logging.error('Failed to run test: ' + str(error))
         exitstatus = 1
     else:
         _write(runlist, report_remote_path)
@@ -164,17 +164,12 @@
         report_local_path = os.path.join(config.logpath, log_filename)
         report_local_path = os.path.normpath(report_local_path)
         machine.downloadfiles(report_remote_path, report_local_path)
-        machine.logger.info('Test log copied to ' + report_local_path)
+        logging.info('Test log copied to ' + report_local_path)
         locallogs.append(report_local_path)
         if args.dumplogs:
             with open(report_local_path, 'r') as f:
                 print(f.read())
 
-    # Since we're running from a script, output INFO
-    for logger in [machine.logger] + machine.logger.handlers:
-        if logger.level > logging.INFO:
-            logger.setLevel(logging.INFO)
-
     # Write the machine name to standard out for log gathering
     print('Running on machine: ' + machine.name)
 
@@ -196,14 +191,13 @@
     try:
        locallogs += getfiles(args, machine)
    except Exception as err:
-        machine.logger.warning('Failed to download files: ' + str(err))
+        logging.warning('Failed to download files: ' + str(err))
 
     if args.outputpreseed or config.outputpreseed:
         if args.outputpreseed:
-            machine.logger.debug('Capturing preseed due to '
-                                 'command line option')
+            logging.debug('Capturing preseed due to command line option')
         elif config.outputpreseed:
-            machine.logger.debug('Capturing preseed due to config option')
+            logging.debug('Capturing preseed due to config option')
         preseedfile = os.path.join(config.logpath,
                                    machine.name + '-preseed.cfg')
         shutil.copyfile(machine.finalpreseed, preseedfile)
@@ -230,3 +224,66 @@
             localfiles.append(myfile)
 
     return localfiles
+
+
+def configure_logging(debug=False):
+    """Configure logging.
+
+    Configure root logger using three different handlers with a different log
+    level:
+    - Console: Set according to `consoleloglevel` from configuration
+    - Log file: Set according to `fileloglevel` from configuration
+    - Debug log file: Always set to debug
+
+    Other loggers are expected to propagate their messages up to the logging
+    hierarchy.
+
+    """
+    logger = logging.getLogger()
+    # configure_logging is being called in different scripts that might call
+    # other scripts. This check avoids adding multiple handlers when logging as
+    # already been configured.
+    if logger.handlers:
+        logger.debug('Logging already configured. '
+                     'Skipping logging configuration.')
+        return
+
+    logger.setLevel(logging.NOTSET)
+
+    formatter = logging.Formatter('%(asctime)s %(name)s '
+                                  '%(levelname)s: %(message)s')
+
+    # Console logging
+    console_handler = logging.StreamHandler()
+    console_handler.setFormatter(formatter)
+    log_level = logging.DEBUG if debug else config.consoleloglevel
+    console_handler.setLevel(log_level)
+    logger.addHandler(console_handler)
+
+    # File logging
+    file_handler = logging.handlers.WatchedFileHandler(config.logfile)
+    file_handler.setFormatter(formatter)
+    file_handler.setLevel(config.fileloglevel)
+    logger.addHandler(file_handler)
+
+    # Debug file logging
+    if config.debuglog is not None:
+        debug_file_handler = \
+            logging.handlers.WatchedFileHandler(config.debuglog)
+        debug_file_handler.setFormatter(formatter)
+        debug_file_handler.setLevel(logging.DEBUG)
+        logger.addHandler(debug_file_handler)
+
+    # Configure ssh logging in a separate file
+    ssh_file_handler = logging.handlers.WatchedFileHandler(config.ssh_logfile)
+    ssh_file_handler.setFormatter(formatter)
+    ssh_file_handler.setLevel(logging.DEBUG)
+
+    # ssh logs available in both ssh log file and other files
+    logger = logging.getLogger('ssh')
+    logger.addHandler(ssh_file_handler)
+
+    # paramiko logs only available in ssh log file
+    logger = logging.getLogger('paramiko')
+    logger.propagate = False
+    logger.addHandler(ssh_file_handler)
