Merge ~chad.smith/cloud-init:unify-datasource-get-data into cloud-init:master

Proposed by Chad Smith
Status: Merged
Approved by: Chad Smith
Approved revision: 88aefe1fc382327420b91b71246226e7fc378006
Merged at revision: 0cf6db3617e0cebeb89c4809396f84360827e96c
Proposed branch: ~chad.smith/cloud-init:unify-datasource-get-data
Merge into: cloud-init:master
Prerequisite: ~chad.smith/cloud-init:feature/move-base-testcase
Diff against target: 1726 lines (+517/-153)
47 files modified
cloudinit/analyze/__main__.py (+3/-1)
cloudinit/analyze/dump.py (+1/-7)
cloudinit/net/tests/test_dhcp.py (+0/-4)
cloudinit/sources/DataSourceAliYun.py (+1/-0)
cloudinit/sources/DataSourceAltCloud.py (+4/-1)
cloudinit/sources/DataSourceAzure.py (+3/-1)
cloudinit/sources/DataSourceBigstep.py (+4/-1)
cloudinit/sources/DataSourceCloudSigma.py (+4/-1)
cloudinit/sources/DataSourceCloudStack.py (+4/-1)
cloudinit/sources/DataSourceConfigDrive.py (+4/-1)
cloudinit/sources/DataSourceDigitalOcean.py (+4/-1)
cloudinit/sources/DataSourceEc2.py (+9/-3)
cloudinit/sources/DataSourceGCE.py (+4/-1)
cloudinit/sources/DataSourceMAAS.py (+4/-1)
cloudinit/sources/DataSourceNoCloud.py (+4/-1)
cloudinit/sources/DataSourceNone.py (+4/-1)
cloudinit/sources/DataSourceOVF.py (+4/-1)
cloudinit/sources/DataSourceOpenNebula.py (+4/-1)
cloudinit/sources/DataSourceOpenStack.py (+4/-1)
cloudinit/sources/DataSourceScaleway.py (+3/-1)
cloudinit/sources/DataSourceSmartOS.py (+4/-1)
cloudinit/sources/__init__.py (+116/-13)
cloudinit/sources/tests/__init__.py (+0/-0)
cloudinit/sources/tests/test_init.py (+202/-0)
cloudinit/tests/helpers.py (+0/-10)
cloudinit/util.py (+24/-9)
tests/unittests/test_datasource/test_aliyun.py (+1/-1)
tests/unittests/test_datasource/test_altcloud.py (+13/-9)
tests/unittests/test_datasource/test_azure.py (+15/-13)
tests/unittests/test_datasource/test_azure_helper.py (+0/-4)
tests/unittests/test_datasource/test_cloudsigma.py (+9/-4)
tests/unittests/test_datasource/test_cloudstack.py (+13/-10)
tests/unittests/test_datasource/test_configdrive.py (+2/-1)
tests/unittests/test_datasource/test_digitalocean.py (+7/-4)
tests/unittests/test_datasource/test_ec2.py (+2/-1)
tests/unittests/test_datasource/test_gce.py (+2/-1)
tests/unittests/test_datasource/test_nocloud.py (+6/-8)
tests/unittests/test_datasource/test_opennebula.py (+5/-7)
tests/unittests/test_datasource/test_openstack.py (+8/-4)
tests/unittests/test_datasource/test_scaleway.py (+9/-4)
tests/unittests/test_datasource/test_smartos.py (+2/-1)
tests/unittests/test_ds_identify.py (+2/-2)
tests/unittests/test_handler/test_handler_chef.py (+0/-4)
tests/unittests/test_handler/test_handler_lxd.py (+0/-5)
tests/unittests/test_runs/test_merge_run.py (+1/-0)
tests/unittests/test_runs/test_simple_run.py (+2/-1)
tests/unittests/test_vmware_config_file.py (+0/-6)
Reviewer Review Type Date Requested Status
Scott Moser Approve
Server Team CI bot continuous-integration Approve
Review via email: mp+330115@code.launchpad.net

Description of the change

Datasources: Formalize DataSource get_data and related properties.

Previously, each DataSource subclass defined its own get_data method. This
branch formalizes our DataSource class to require that subclasses define an
explicit dsname for sourcing cloud-config datasource configuration.
Subclasses must also override the _get_data method (which the base class
get_data now wraps) or a NotImplementedError is raised.
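
As a sketch of the new contract (modeled on the DataSourceTestSubclassNet
test double added under cloudinit/sources/tests/ in this branch; the
DataSourceExample class below is hypothetical and for illustration only):

from cloudinit import sources


class DataSourceExample(sources.DataSource):

    # dsname selects the cloud-config section: cfg['datasource']['Example']
    dsname = 'Example'

    def _get_data(self):
        # Subclasses set metadata, userdata_raw and vendordata_raw and
        # return True on success; the base get_data() wrapper then writes
        # /run/cloud-init/instance-data.json.
        self.metadata = {'instance-id': 'i-example',
                         'local-hostname': 'example-host'}
        self.userdata_raw = '#cloud-config\n{}'
        self.vendordata_raw = ''
        return True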

The branch also writes /run/cloud-init/instance-data.json. This file
contains all meta-data, user-data and vendor-data, plus a standardized set
of metadata keys, in a JSON blob which other utilities with root access
could make use of. Because some meta-data or user-data is potentially
sensitive, the file is only readable by root.

Generally, most metadata content types should be JSON-serializable. If
specific keys or values are not serializable, those specific values will
be base64-encoded and the key path will be listed under the top-level key
'base64-encoded-keys' in instance-data.json. If JSON writing fails due to
other TypeErrors or UnicodeDecodeErrors, a warning will be logged to
/var/log/cloud-init.log and no instance-data.json will be created.
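
For illustration, a minimal consumer sketch assuming only the file layout
described above (the read_instance_data helper is hypothetical and not part
of this branch); it loads the JSON as root and decodes any paths listed
under 'base64-encoded-keys':

import base64
import json

INSTANCE_JSON = '/run/cloud-init/instance-data.json'  # readable by root only


def read_instance_data(path=INSTANCE_JSON):
    with open(path) as stream:
        data = json.load(stream)
    # 'base64-encoded-keys' lists '/'-delimited key paths whose values were
    # base64-encoded because they were not JSON-serializable (e.g. bytes).
    for key_path in data.get('base64-encoded-keys', []):
        parts = key_path.split('/')
        parent = data
        for part in parts[:-1]:
            parent = parent[part]
        parent[parts[-1]] = base64.b64decode(parent[parts[-1]])
    return data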

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:e04585b6417a9a75a834c0b8ed22adf8ffe435a2
https://jenkins.ubuntu.com/server/job/cloud-init-ci/255/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/255/rebuild

review: Approve (continuous-integration)
Revision history for this message
Ryan Harper (raharper) wrote :

In commit message:

s/it's/its

I'd replace 'caches' metadata with 'dumps'; 'cache' implies (to me) reuse, but cloud-init doesn't use the json file at all (nor would it, since the data is already part of the cloud object).

Drop the 'Subsequent' sentence; while you may follow up with that, it's not needed as part of the commit message.

Finally, a question on aborting when we have unserializable data: is it possible to skip/replace the binary data so we don't have an all-or-nothing situation?

Revision history for this message
Ryan Harper (raharper) :
2da6f0d... by Chad Smith

move json_loads up out of helpers, sources.__init__ and analyze and into cloudinit.util

c60d25d... by Chad Smith

use util.load_json instead of json.loads in unit tests

Revision history for this message
Chad Smith (chad.smith) wrote :

Thanks for the review, Ryan; I pushed changes per your review comments. We should be able to handle TypeError issues during json serialization and attempt encoding or exclusion of those items instead of bailing on everything.

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

FAILED: Continuous integration, rev:3cf4ab99853c0575b1d6fd0951fd3783a579a615
https://jenkins.ubuntu.com/server/job/cloud-init-ci/273/
Executed test runs:
    SUCCESS: Checkout
    FAILED: Unit & Style Tests

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/273/rebuild

review: Needs Fixing (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :

FAILED: Continuous integration, rev:95c1151f0481695b7148609447a162132990bc20
https://jenkins.ubuntu.com/server/job/cloud-init-ci/277/
Executed test runs:
    SUCCESS: Checkout
    FAILED: Unit & Style Tests

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/277/rebuild

review: Needs Fixing (continuous-integration)
145541c... by Chad Smith

flakes and allow util.json_loads to redact unserializable content

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:c0009178b955e0ccf7cd62bfe595b4865339e978
https://jenkins.ubuntu.com/server/job/cloud-init-ci/281/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/281/rebuild

review: Approve (continuous-integration)
Revision history for this message
Scott Moser (smoser) wrote :

userdata_raw for an instance should *always* be bytes, which is *always* unserializable in json:

>>> json.dumps({'key': b'bytes value'})
TypeError: Object of type 'bytes' is not JSON serializable

here's a test that shows more actual content of user-data:
 http://paste.ubuntu.com/25594310/

So that means that unless something is wrong in the Datasource, the 'user-data' (and maybe vendor-data) blobs are going to be redacted with the Warning.

Is meta-data one of our "normalized" values?

I kind of expected that our exported data looks like:
 {'instance-id': <string>,
  'hostname': <hostname>,
  'user-data': <user-data>,
  'vendor-data': <vendor-data>,

  'base64-keys': ['user-data', 'vendor-data', 'ec2/blobs/blob1', 'ec2/more-blobs/0'],
  '_datasource': {
    'ec2-specific-key1': <string>,
    'blobs': {'blob1': <bytes>},
    'more-blobs': [b'blobby'],
  }}

So we'd have generally available keys at the top (many of them taken out of 'meta-data') with well-known names, and then '_datasource' or some other name holding the non-guaranteed stuff under that key.

Maybe I'm thinking too far ahead, but I'm afraid of putting something in /run/cloud-init/INSTANCE_JSON_FILE that someone then reads and we're stuck with the format of.

The 'base64-keys' thought above is that those would be paths in the data to keys whose value is binary data. We should also make a distinction between a value whose contents raise an error on .decode('utf-8') and a value that should generally be regarded as binary. I.e., we shouldn't just "try and see" if we can decode the values, because that leads to oddness for the user where sometimes a given key path might be in base64-keys and other times not.

I don't like that the value of a key can be a string 'Warning: ...'; that makes it
hard for someone to tell the difference between a possible but strange value
and "didn't work". I do not think we should ever come to this case,
and if we do, raising an exception is probably ok.

25481bf... by Chad Smith

attempt to base64encode json unserializable content

14b44a8... by Chad Smith

Datasource base class updates

 1. require dsname defined in subclasses to avoid magic prefix/suffix drops when loading datasource config.
 2. Add cached cloud_name property which can be overridden with meta-data['cloud-name']
 3. Add _get_standardized_metadata to emit standard/expected metadata keys for all datasources
 4. Move unstandardized meta-data, instance-data and vendor-data under _datasource key

3020dfa... by Chad Smith

add unit tests for instance-data.json

24ffc07... by Chad Smith

add explicit dsname definitions for all datasource subclasses. Allow Ec2 to identify its cloud_name via _get_cloud_name(), which calls DataSourceEc2.cloud_platform

eb66299... by Chad Smith

flakes

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

FAILED: Continuous integration, rev:5d4105b8bf166b0111dfed82b9e123ee9f2b2ff7
https://jenkins.ubuntu.com/server/job/cloud-init-ci/378/
Executed test runs:
    SUCCESS: Checkout
    FAILED: Unit & Style Tests

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/378/rebuild

review: Needs Fixing (continuous-integration)
71fe9b0... by Chad Smith

use an uncommon ci-b64: prefix to mark cloud-init's base64 encoding when json-serializing unserializable content. Add a unit test for py2 which serializes bytes values
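
For context, a condensed, standalone sketch of the fallback this revision
describes (the real implementation is util.json_serialize_default plus the
base64-key post-processing in cloudinit/sources/__init__.py in the diff
below; this version is illustrative only):

import base64
import json


def _serialize_default(obj):
    # Fallback for values json.dumps cannot handle natively.
    try:
        return 'ci-b64:' + base64.b64encode(obj).decode('ascii')
    except (TypeError, AttributeError):
        return 'Warning: redacted unserializable type {0}'.format(type(obj))


# bytes values get the uncommon 'ci-b64:' prefix; the prefix is later
# stripped and the key path recorded under 'base64-encoded-keys'.
print(json.dumps({'user-data': b'\x01\x02'}, default=_serialize_default))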

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

FAILED: Continuous integration, rev:f29e17db6a387a60ef1928081a74b0fa4911b2f8
https://jenkins.ubuntu.com/server/job/cloud-init-ci/379/
Executed test runs:
    SUCCESS: Checkout
    FAILED: Unit & Style Tests

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/379/rebuild

review: Needs Fixing (continuous-integration)
477ddd6... by Chad Smith

handle non-utf8 in python2

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:8415d1035bbcd21595d4c631b1dc860840347e07
https://jenkins.ubuntu.com/server/job/cloud-init-ci/380/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/380/rebuild

review: Approve (continuous-integration)
6274b97... by Chad Smith

keep all standardized fields in instance-data.json under standard-v1 key

Revision history for this message
Chad Smith (chad.smith) wrote :

@smoser: addressed discussions we had at the Ubuntu Rally about standardizing metadata keys in instance-data.json and versioning it. The branch is getting big (and therefore risky), so I didn't address some cloud-name renames or surfacing 'platform' in this branch, as it would involve a bit of renaming in DataSourceEc2.Platforms too, and I want to tackle that in a separate branch to get fresh eyes on it.

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

FAILED: Continuous integration, rev:39d79188da50f2f2ab840a3a8bbc2bd1c1b099bd
https://jenkins.ubuntu.com/server/job/cloud-init-ci/381/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    FAILED: MAAS Compatability Testing

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/381/rebuild

review: Needs Fixing (continuous-integration)
Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:39d79188da50f2f2ab840a3a8bbc2bd1c1b099bd
https://jenkins.ubuntu.com/server/job/cloud-init-ci/382/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/382/rebuild

review: Approve (continuous-integration)
d4ae4e6... by Chad Smith

shuffle instance-data.json field names to make it easier to use in jinja templates

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:874f2a2bd457ebeb557bd60c0f705ea46d79d43e
https://jenkins.ubuntu.com/server/job/cloud-init-ci/519/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/519/rebuild

review: Approve (continuous-integration)
Revision history for this message
Chad Smith (chad.smith) wrote :

Latest example of instance-data.json on EC2 with the following user-data:

$ echo 'I2Nsb3VkLWNvbmZpZwp0aW1lem9uZTogUGFjaWZpYy9Ib25vbHVsdQo=' | base64 -d
#cloud-config
timezone: Pacific/Honolulu

http://paste.ubuntu.com/26007492/

23b31d5... by Chad Smith

datasource meta-data will all contain hyphens anyway. Let's just leave the keys all hyphenated; when handling jinja we'll convert any instance-data.json hyphens to underscores for templating

Revision history for this message
Chad Smith (chad.smith) wrote :

Let's just use hyphens throughout: http://paste.ubuntu.com/26007592/

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:87def7b76e67b2905ddab88fe936050686509462
https://jenkins.ubuntu.com/server/job/cloud-init-ci/520/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/520/rebuild

review: Approve (continuous-integration)
Revision history for this message
Scott Moser (smoser) wrote :

You have some merge conflicts in the diff.
I'll review more later.
This does look good, though.

Revision history for this message
Scott Moser (smoser) :
Revision history for this message
Scott Moser (smoser) :
7bbc057... by Chad Smith

drop indented scope and exit early

Revision history for this message
Chad Smith (chad.smith) :
Revision history for this message
Server Team CI bot (server-team-bot) wrote :

FAILED: Continuous integration, rev:eb1ad7e97119bc2b05da035c2119638146e835d2
https://jenkins.ubuntu.com/server/job/cloud-init-ci/555/
Executed test runs:
    SUCCESS: Checkout
    FAILED: Unit & Style Tests

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/555/rebuild

review: Needs Fixing (continuous-integration)
05c7419... by Chad Smith

use atomic_helper write_json in DataSource. Fix unit tests

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:05c74190b01e839360dd4f306cfe0397962ffb98
https://jenkins.ubuntu.com/server/job/cloud-init-ci/561/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/561/rebuild

review: Approve (continuous-integration)
Revision history for this message
Chad Smith (chad.smith) wrote :

One more pass of instance-data from lxc and ec2 validated there are no tracebacks as well.

http://pastebin.ubuntu.com/26075320/

Revision history for this message
Chad Smith (chad.smith) wrote :

Since we are playing around, here's Azure instance-data. Not much there: http://pastebin.ubuntu.com/26075842/

d936c64... by Chad Smith

drop unpopulated ipv4 and ipv6 addresses from standardized metadata

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

FAILED: Continuous integration, rev:d936c64b352b52a0a9639aa6e24a18d6b876a69e
https://jenkins.ubuntu.com/server/job/cloud-init-ci/566/
Executed test runs:
    SUCCESS: Checkout
    FAILED: Unit & Style Tests

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/566/rebuild

review: Needs Fixing (continuous-integration)
Revision history for this message
Chad Smith (chad.smith) wrote :

And OpenStack instance data for the win:
http://pastebin.ubuntu.com/26084548/

Revision history for this message
Scott Moser (smoser) wrote :

You do still have merge conflicts.

88aefe1... by Chad Smith

flakes

Revision history for this message
Server Team CI bot (server-team-bot) wrote :

PASSED: Continuous integration, rev:88aefe1fc382327420b91b71246226e7fc378006
https://jenkins.ubuntu.com/server/job/cloud-init-ci/580/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Unit & Style Tests
    SUCCESS: Ubuntu LTS: Build
    SUCCESS: Ubuntu LTS: Integration
    SUCCESS: MAAS Compatability Testing
    IN_PROGRESS: Declarative: Post Actions

Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/cloud-init-ci/580/rebuild

review: Approve (continuous-integration)
Revision history for this message
Scott Moser (smoser) wrote :

Two things:
a.) I'd like some docs; I think this can be followed up in another branch.
b.) Your commit message has a long line. I read it and quickly fixed a small thing
or two.

review: Approve
Revision history for this message
Chad Smith (chad.smith) wrote :

GCE instance-data for reference
http://paste.ubuntu.com/26165085/

Preview Diff

1diff --git a/cloudinit/analyze/__main__.py b/cloudinit/analyze/__main__.py
2index 69b9e43..3ba5903 100644
3--- a/cloudinit/analyze/__main__.py
4+++ b/cloudinit/analyze/__main__.py
5@@ -6,6 +6,8 @@ import argparse
6 import re
7 import sys
8
9+from cloudinit.util import json_dumps
10+
11 from . import dump
12 from . import show
13
14@@ -112,7 +114,7 @@ def analyze_show(name, args):
15 def analyze_dump(name, args):
16 """Dump cloud-init events in json format"""
17 (infh, outfh) = configure_io(args)
18- outfh.write(dump.json_dumps(_get_events(infh)) + '\n')
19+ outfh.write(json_dumps(_get_events(infh)) + '\n')
20
21
22 def _get_events(infile):
23diff --git a/cloudinit/analyze/dump.py b/cloudinit/analyze/dump.py
24index ca4da49..b071aa1 100644
25--- a/cloudinit/analyze/dump.py
26+++ b/cloudinit/analyze/dump.py
27@@ -2,7 +2,6 @@
28
29 import calendar
30 from datetime import datetime
31-import json
32 import sys
33
34 from cloudinit import util
35@@ -132,11 +131,6 @@ def parse_ci_logline(line):
36 return event
37
38
39-def json_dumps(data):
40- return json.dumps(data, indent=1, sort_keys=True,
41- separators=(',', ': '))
42-
43-
44 def dump_events(cisource=None, rawdata=None):
45 events = []
46 event = None
47@@ -169,7 +163,7 @@ def main():
48 else:
49 cisource = sys.stdin
50
51- return json_dumps(dump_events(cisource))
52+ return util.json_dumps(dump_events(cisource))
53
54
55 if __name__ == "__main__":
56diff --git a/cloudinit/net/tests/test_dhcp.py b/cloudinit/net/tests/test_dhcp.py
57index fad4e49..db25b6f 100644
58--- a/cloudinit/net/tests/test_dhcp.py
59+++ b/cloudinit/net/tests/test_dhcp.py
60@@ -8,12 +8,8 @@ from cloudinit.net.dhcp import (
61 InvalidDHCPLeaseFileError, maybe_perform_dhcp_discovery,
62 parse_dhcp_lease_file, dhcp_discovery, networkd_load_leases)
63 from cloudinit.util import ensure_file, write_file
64-<<<<<<< cloudinit/net/tests/test_dhcp.py
65 from cloudinit.tests.helpers import (
66 CiTestCase, mock, populate_dir, wrap_and_call)
67-=======
68-from cloudinit.tests.helpers import CiTestCase
69->>>>>>> cloudinit/net/tests/test_dhcp.py
70
71
72 class TestParseDHCPLeasesFile(CiTestCase):
73diff --git a/cloudinit/sources/DataSourceAliYun.py b/cloudinit/sources/DataSourceAliYun.py
74index 43a7e42..7ac8288 100644
75--- a/cloudinit/sources/DataSourceAliYun.py
76+++ b/cloudinit/sources/DataSourceAliYun.py
77@@ -11,6 +11,7 @@ ALIYUN_PRODUCT = "Alibaba Cloud ECS"
78
79 class DataSourceAliYun(EC2.DataSourceEc2):
80
81+ dsname = 'AliYun'
82 metadata_urls = ['http://100.100.100.200']
83
84 # The minimum supported metadata_version from the ec2 metadata apis
85diff --git a/cloudinit/sources/DataSourceAltCloud.py b/cloudinit/sources/DataSourceAltCloud.py
86index c78ad9e..be2d6cf 100644
87--- a/cloudinit/sources/DataSourceAltCloud.py
88+++ b/cloudinit/sources/DataSourceAltCloud.py
89@@ -74,6 +74,9 @@ def read_user_data_callback(mount_dir):
90
91
92 class DataSourceAltCloud(sources.DataSource):
93+
94+ dsname = 'AltCloud'
95+
96 def __init__(self, sys_cfg, distro, paths):
97 sources.DataSource.__init__(self, sys_cfg, distro, paths)
98 self.seed = None
99@@ -112,7 +115,7 @@ class DataSourceAltCloud(sources.DataSource):
100
101 return 'UNKNOWN'
102
103- def get_data(self):
104+ def _get_data(self):
105 '''
106 Description:
107 User Data is passed to the launching instance which
108diff --git a/cloudinit/sources/DataSourceAzure.py b/cloudinit/sources/DataSourceAzure.py
109index 14367e9..6978d4e 100644
110--- a/cloudinit/sources/DataSourceAzure.py
111+++ b/cloudinit/sources/DataSourceAzure.py
112@@ -246,6 +246,8 @@ def temporary_hostname(temp_hostname, cfg, hostname_command='hostname'):
113
114
115 class DataSourceAzure(sources.DataSource):
116+
117+ dsname = 'Azure'
118 _negotiated = False
119
120 def __init__(self, sys_cfg, distro, paths):
121@@ -330,7 +332,7 @@ class DataSourceAzure(sources.DataSource):
122 metadata['public-keys'] = key_value or pubkeys_from_crt_files(fp_files)
123 return metadata
124
125- def get_data(self):
126+ def _get_data(self):
127 # azure removes/ejects the cdrom containing the ovf-env.xml
128 # file on reboot. So, in order to successfully reboot we
129 # need to look in the datadir and consider that valid
130diff --git a/cloudinit/sources/DataSourceBigstep.py b/cloudinit/sources/DataSourceBigstep.py
131index d7fcd45..699a85b 100644
132--- a/cloudinit/sources/DataSourceBigstep.py
133+++ b/cloudinit/sources/DataSourceBigstep.py
134@@ -16,13 +16,16 @@ LOG = logging.getLogger(__name__)
135
136
137 class DataSourceBigstep(sources.DataSource):
138+
139+ dsname = 'Bigstep'
140+
141 def __init__(self, sys_cfg, distro, paths):
142 sources.DataSource.__init__(self, sys_cfg, distro, paths)
143 self.metadata = {}
144 self.vendordata_raw = ""
145 self.userdata_raw = ""
146
147- def get_data(self, apply_filter=False):
148+ def _get_data(self, apply_filter=False):
149 url = get_url_from_file()
150 if url is None:
151 return False
152diff --git a/cloudinit/sources/DataSourceCloudSigma.py b/cloudinit/sources/DataSourceCloudSigma.py
153index 19df16b..4eaad47 100644
154--- a/cloudinit/sources/DataSourceCloudSigma.py
155+++ b/cloudinit/sources/DataSourceCloudSigma.py
156@@ -23,6 +23,9 @@ class DataSourceCloudSigma(sources.DataSource):
157 For more information about CloudSigma's Server Context:
158 http://cloudsigma-docs.readthedocs.org/en/latest/server_context.html
159 """
160+
161+ dsname = 'CloudSigma'
162+
163 def __init__(self, sys_cfg, distro, paths):
164 self.cepko = Cepko()
165 self.ssh_public_key = ''
166@@ -46,7 +49,7 @@ class DataSourceCloudSigma(sources.DataSource):
167 LOG.warning("failed to query dmi data for system product name")
168 return False
169
170- def get_data(self):
171+ def _get_data(self):
172 """
173 Metadata is the whole server context and /meta/cloud-config is used
174 as userdata.
175diff --git a/cloudinit/sources/DataSourceCloudStack.py b/cloudinit/sources/DataSourceCloudStack.py
176index 9dc473f..0df545f 100644
177--- a/cloudinit/sources/DataSourceCloudStack.py
178+++ b/cloudinit/sources/DataSourceCloudStack.py
179@@ -65,6 +65,9 @@ class CloudStackPasswordServerClient(object):
180
181
182 class DataSourceCloudStack(sources.DataSource):
183+
184+ dsname = 'CloudStack'
185+
186 def __init__(self, sys_cfg, distro, paths):
187 sources.DataSource.__init__(self, sys_cfg, distro, paths)
188 self.seed_dir = os.path.join(paths.seed_dir, 'cs')
189@@ -117,7 +120,7 @@ class DataSourceCloudStack(sources.DataSource):
190 def get_config_obj(self):
191 return self.cfg
192
193- def get_data(self):
194+ def _get_data(self):
195 seed_ret = {}
196 if util.read_optional_seed(seed_ret, base=(self.seed_dir + "/")):
197 self.userdata_raw = seed_ret['user-data']
198diff --git a/cloudinit/sources/DataSourceConfigDrive.py b/cloudinit/sources/DataSourceConfigDrive.py
199index ef374f3..870b368 100644
200--- a/cloudinit/sources/DataSourceConfigDrive.py
201+++ b/cloudinit/sources/DataSourceConfigDrive.py
202@@ -32,6 +32,9 @@ OPTICAL_DEVICES = tuple(('/dev/%s%s' % (z, i) for z in POSSIBLE_MOUNTS
203
204
205 class DataSourceConfigDrive(openstack.SourceMixin, sources.DataSource):
206+
207+ dsname = 'ConfigDrive'
208+
209 def __init__(self, sys_cfg, distro, paths):
210 super(DataSourceConfigDrive, self).__init__(sys_cfg, distro, paths)
211 self.source = None
212@@ -50,7 +53,7 @@ class DataSourceConfigDrive(openstack.SourceMixin, sources.DataSource):
213 mstr += "[source=%s]" % (self.source)
214 return mstr
215
216- def get_data(self):
217+ def _get_data(self):
218 found = None
219 md = {}
220 results = {}
221diff --git a/cloudinit/sources/DataSourceDigitalOcean.py b/cloudinit/sources/DataSourceDigitalOcean.py
222index 5e7e66b..e0ef665 100644
223--- a/cloudinit/sources/DataSourceDigitalOcean.py
224+++ b/cloudinit/sources/DataSourceDigitalOcean.py
225@@ -27,6 +27,9 @@ MD_USE_IPV4LL = True
226
227
228 class DataSourceDigitalOcean(sources.DataSource):
229+
230+ dsname = 'DigitalOcean'
231+
232 def __init__(self, sys_cfg, distro, paths):
233 sources.DataSource.__init__(self, sys_cfg, distro, paths)
234 self.distro = distro
235@@ -44,7 +47,7 @@ class DataSourceDigitalOcean(sources.DataSource):
236 def _get_sysinfo(self):
237 return do_helper.read_sysinfo()
238
239- def get_data(self):
240+ def _get_data(self):
241 (is_do, droplet_id) = self._get_sysinfo()
242
243 # only proceed if we know we are on DigitalOcean
244diff --git a/cloudinit/sources/DataSourceEc2.py b/cloudinit/sources/DataSourceEc2.py
245index 7bbbfb6..e5c8833 100644
246--- a/cloudinit/sources/DataSourceEc2.py
247+++ b/cloudinit/sources/DataSourceEc2.py
248@@ -31,6 +31,7 @@ _unset = "_unset"
249
250
251 class Platforms(object):
252+ # TODO Rename and move to cloudinit.cloud.CloudNames
253 ALIYUN = "AliYun"
254 AWS = "AWS"
255 BRIGHTBOX = "Brightbox"
256@@ -45,6 +46,7 @@ class Platforms(object):
257
258 class DataSourceEc2(sources.DataSource):
259
260+ dsname = 'Ec2'
261 # Default metadata urls that will be used if none are provided
262 # They will be checked for 'resolveability' and some of the
263 # following may be discarded if they do not resolve
264@@ -68,11 +70,15 @@ class DataSourceEc2(sources.DataSource):
265 _fallback_interface = None
266
267 def __init__(self, sys_cfg, distro, paths):
268- sources.DataSource.__init__(self, sys_cfg, distro, paths)
269+ super(DataSourceEc2, self).__init__(sys_cfg, distro, paths)
270 self.metadata_address = None
271 self.seed_dir = os.path.join(paths.seed_dir, "ec2")
272
273- def get_data(self):
274+ def _get_cloud_name(self):
275+ """Return the cloud name as identified during _get_data."""
276+ return self.cloud_platform
277+
278+ def _get_data(self):
279 seed_ret = {}
280 if util.read_optional_seed(seed_ret, base=(self.seed_dir + "/")):
281 self.userdata_raw = seed_ret['user-data']
282@@ -274,7 +280,7 @@ class DataSourceEc2(sources.DataSource):
283 return None
284
285 @property
286- def cloud_platform(self):
287+ def cloud_platform(self): # TODO rename cloud_name
288 if self._cloud_platform is None:
289 self._cloud_platform = identify_platform()
290 return self._cloud_platform
291diff --git a/cloudinit/sources/DataSourceGCE.py b/cloudinit/sources/DataSourceGCE.py
292index ccae420..ad6dae3 100644
293--- a/cloudinit/sources/DataSourceGCE.py
294+++ b/cloudinit/sources/DataSourceGCE.py
295@@ -42,6 +42,9 @@ class GoogleMetadataFetcher(object):
296
297
298 class DataSourceGCE(sources.DataSource):
299+
300+ dsname = 'GCE'
301+
302 def __init__(self, sys_cfg, distro, paths):
303 sources.DataSource.__init__(self, sys_cfg, distro, paths)
304 self.metadata = dict()
305@@ -50,7 +53,7 @@ class DataSourceGCE(sources.DataSource):
306 BUILTIN_DS_CONFIG])
307 self.metadata_address = self.ds_cfg['metadata_url']
308
309- def get_data(self):
310+ def _get_data(self):
311 ret = util.log_time(
312 LOG.debug, 'Crawl of GCE metadata service',
313 read_md, kwargs={'address': self.metadata_address})
314diff --git a/cloudinit/sources/DataSourceMAAS.py b/cloudinit/sources/DataSourceMAAS.py
315index 77df5a5..496bd06 100644
316--- a/cloudinit/sources/DataSourceMAAS.py
317+++ b/cloudinit/sources/DataSourceMAAS.py
318@@ -39,6 +39,9 @@ class DataSourceMAAS(sources.DataSource):
319 hostname
320 vendor-data
321 """
322+
323+ dsname = "MAAS"
324+
325 def __init__(self, sys_cfg, distro, paths):
326 sources.DataSource.__init__(self, sys_cfg, distro, paths)
327 self.base_url = None
328@@ -62,7 +65,7 @@ class DataSourceMAAS(sources.DataSource):
329 root = sources.DataSource.__str__(self)
330 return "%s [%s]" % (root, self.base_url)
331
332- def get_data(self):
333+ def _get_data(self):
334 mcfg = self.ds_cfg
335
336 try:
337diff --git a/cloudinit/sources/DataSourceNoCloud.py b/cloudinit/sources/DataSourceNoCloud.py
338index e641244..5d3a8dd 100644
339--- a/cloudinit/sources/DataSourceNoCloud.py
340+++ b/cloudinit/sources/DataSourceNoCloud.py
341@@ -20,6 +20,9 @@ LOG = logging.getLogger(__name__)
342
343
344 class DataSourceNoCloud(sources.DataSource):
345+
346+ dsname = "NoCloud"
347+
348 def __init__(self, sys_cfg, distro, paths):
349 sources.DataSource.__init__(self, sys_cfg, distro, paths)
350 self.seed = None
351@@ -32,7 +35,7 @@ class DataSourceNoCloud(sources.DataSource):
352 root = sources.DataSource.__str__(self)
353 return "%s [seed=%s][dsmode=%s]" % (root, self.seed, self.dsmode)
354
355- def get_data(self):
356+ def _get_data(self):
357 defaults = {
358 "instance-id": "nocloud",
359 "dsmode": self.dsmode,
360diff --git a/cloudinit/sources/DataSourceNone.py b/cloudinit/sources/DataSourceNone.py
361index 906bb27..e63a7e3 100644
362--- a/cloudinit/sources/DataSourceNone.py
363+++ b/cloudinit/sources/DataSourceNone.py
364@@ -11,12 +11,15 @@ LOG = logging.getLogger(__name__)
365
366
367 class DataSourceNone(sources.DataSource):
368+
369+ dsname = "None"
370+
371 def __init__(self, sys_cfg, distro, paths, ud_proc=None):
372 sources.DataSource.__init__(self, sys_cfg, distro, paths, ud_proc)
373 self.metadata = {}
374 self.userdata_raw = ''
375
376- def get_data(self):
377+ def _get_data(self):
378 # If the datasource config has any provided 'fallback'
379 # userdata or metadata, use it...
380 if 'userdata_raw' in self.ds_cfg:
381diff --git a/cloudinit/sources/DataSourceOVF.py b/cloudinit/sources/DataSourceOVF.py
382index ccebf11..6ac621f 100644
383--- a/cloudinit/sources/DataSourceOVF.py
384+++ b/cloudinit/sources/DataSourceOVF.py
385@@ -43,6 +43,9 @@ LOG = logging.getLogger(__name__)
386
387
388 class DataSourceOVF(sources.DataSource):
389+
390+ dsname = "OVF"
391+
392 def __init__(self, sys_cfg, distro, paths):
393 sources.DataSource.__init__(self, sys_cfg, distro, paths)
394 self.seed = None
395@@ -60,7 +63,7 @@ class DataSourceOVF(sources.DataSource):
396 root = sources.DataSource.__str__(self)
397 return "%s [seed=%s]" % (root, self.seed)
398
399- def get_data(self):
400+ def _get_data(self):
401 found = []
402 md = {}
403 ud = ""
404diff --git a/cloudinit/sources/DataSourceOpenNebula.py b/cloudinit/sources/DataSourceOpenNebula.py
405index 5fdac19..5da1184 100644
406--- a/cloudinit/sources/DataSourceOpenNebula.py
407+++ b/cloudinit/sources/DataSourceOpenNebula.py
408@@ -31,6 +31,9 @@ CONTEXT_DISK_FILES = ["context.sh"]
409
410
411 class DataSourceOpenNebula(sources.DataSource):
412+
413+ dsname = "OpenNebula"
414+
415 def __init__(self, sys_cfg, distro, paths):
416 sources.DataSource.__init__(self, sys_cfg, distro, paths)
417 self.seed = None
418@@ -40,7 +43,7 @@ class DataSourceOpenNebula(sources.DataSource):
419 root = sources.DataSource.__str__(self)
420 return "%s [seed=%s][dsmode=%s]" % (root, self.seed, self.dsmode)
421
422- def get_data(self):
423+ def _get_data(self):
424 defaults = {"instance-id": DEFAULT_IID}
425 results = None
426 seed = None
427diff --git a/cloudinit/sources/DataSourceOpenStack.py b/cloudinit/sources/DataSourceOpenStack.py
428index b64a7f2..e55a763 100644
429--- a/cloudinit/sources/DataSourceOpenStack.py
430+++ b/cloudinit/sources/DataSourceOpenStack.py
431@@ -24,6 +24,9 @@ DEFAULT_METADATA = {
432
433
434 class DataSourceOpenStack(openstack.SourceMixin, sources.DataSource):
435+
436+ dsname = "OpenStack"
437+
438 def __init__(self, sys_cfg, distro, paths):
439 super(DataSourceOpenStack, self).__init__(sys_cfg, distro, paths)
440 self.metadata_address = None
441@@ -96,7 +99,7 @@ class DataSourceOpenStack(openstack.SourceMixin, sources.DataSource):
442 self.metadata_address = url2base.get(avail_url)
443 return bool(avail_url)
444
445- def get_data(self):
446+ def _get_data(self):
447 try:
448 if not self.wait_for_metadata_service():
449 return False
450diff --git a/cloudinit/sources/DataSourceScaleway.py b/cloudinit/sources/DataSourceScaleway.py
451index 3a8a8e8..b0b19c9 100644
452--- a/cloudinit/sources/DataSourceScaleway.py
453+++ b/cloudinit/sources/DataSourceScaleway.py
454@@ -169,6 +169,8 @@ def query_data_api(api_type, api_address, retries, timeout):
455
456 class DataSourceScaleway(sources.DataSource):
457
458+ dsname = "Scaleway"
459+
460 def __init__(self, sys_cfg, distro, paths):
461 super(DataSourceScaleway, self).__init__(sys_cfg, distro, paths)
462
463@@ -184,7 +186,7 @@ class DataSourceScaleway(sources.DataSource):
464 self.retries = int(self.ds_cfg.get('retries', DEF_MD_RETRIES))
465 self.timeout = int(self.ds_cfg.get('timeout', DEF_MD_TIMEOUT))
466
467- def get_data(self):
468+ def _get_data(self):
469 if not on_scaleway():
470 return False
471
472diff --git a/cloudinit/sources/DataSourceSmartOS.py b/cloudinit/sources/DataSourceSmartOS.py
473index 6c6902f..86bfa5d 100644
474--- a/cloudinit/sources/DataSourceSmartOS.py
475+++ b/cloudinit/sources/DataSourceSmartOS.py
476@@ -159,6 +159,9 @@ LEGACY_USER_D = "/var/db"
477
478
479 class DataSourceSmartOS(sources.DataSource):
480+
481+ dsname = "Joyent"
482+
483 _unset = "_unset"
484 smartos_type = _unset
485 md_client = _unset
486@@ -211,7 +214,7 @@ class DataSourceSmartOS(sources.DataSource):
487 os.rename('/'.join([svc_path, 'provisioning']),
488 '/'.join([svc_path, 'provision_success']))
489
490- def get_data(self):
491+ def _get_data(self):
492 self._init()
493
494 md = {}
495diff --git a/cloudinit/sources/__init__.py b/cloudinit/sources/__init__.py
496index 9a43fbe..4b819ce 100644
497--- a/cloudinit/sources/__init__.py
498+++ b/cloudinit/sources/__init__.py
499@@ -10,9 +10,11 @@
500
501 import abc
502 import copy
503+import json
504 import os
505 import six
506
507+from cloudinit.atomic_helper import write_json
508 from cloudinit import importer
509 from cloudinit import log as logging
510 from cloudinit import type_utils
511@@ -33,6 +35,12 @@ DEP_FILESYSTEM = "FILESYSTEM"
512 DEP_NETWORK = "NETWORK"
513 DS_PREFIX = 'DataSource'
514
515+# File in which instance meta-data, user-data and vendor-data is written
516+INSTANCE_JSON_FILE = 'instance-data.json'
517+
518+# Key which can be provide a cloud's official product name to cloud-init
519+METADATA_CLOUD_NAME_KEY = 'cloud-name'
520+
521 LOG = logging.getLogger(__name__)
522
523
524@@ -40,12 +48,39 @@ class DataSourceNotFoundException(Exception):
525 pass
526
527
528+def process_base64_metadata(metadata, key_path=''):
529+ """Strip ci-b64 prefix and return metadata with base64-encoded-keys set."""
530+ md_copy = copy.deepcopy(metadata)
531+ md_copy['base64-encoded-keys'] = []
532+ for key, val in metadata.items():
533+ if key_path:
534+ sub_key_path = key_path + '/' + key
535+ else:
536+ sub_key_path = key
537+ if isinstance(val, str) and val.startswith('ci-b64:'):
538+ md_copy['base64-encoded-keys'].append(sub_key_path)
539+ md_copy[key] = val.replace('ci-b64:', '')
540+ if isinstance(val, dict):
541+ return_val = process_base64_metadata(val, sub_key_path)
542+ md_copy['base64-encoded-keys'].extend(
543+ return_val.pop('base64-encoded-keys'))
544+ md_copy[key] = return_val
545+ return md_copy
546+
547+
548 @six.add_metaclass(abc.ABCMeta)
549 class DataSource(object):
550
551 dsmode = DSMODE_NETWORK
552 default_locale = 'en_US.UTF-8'
553
554+ # Datasource name needs to be set by subclasses to determine which
555+ # cloud-config datasource key is loaded
556+ dsname = '_undef'
557+
558+ # Cached cloud_name as determined by _get_cloud_name
559+ _cloud_name = None
560+
561 def __init__(self, sys_cfg, distro, paths, ud_proc=None):
562 self.sys_cfg = sys_cfg
563 self.distro = distro
564@@ -56,17 +91,8 @@ class DataSource(object):
565 self.vendordata = None
566 self.vendordata_raw = None
567
568- # find the datasource config name.
569- # remove 'DataSource' from classname on front, and remove 'Net' on end.
570- # Both Foo and FooNet sources expect config in cfg['sources']['Foo']
571- name = type_utils.obj_name(self)
572- if name.startswith(DS_PREFIX):
573- name = name[len(DS_PREFIX):]
574- if name.endswith('Net'):
575- name = name[0:-3]
576-
577- self.ds_cfg = util.get_cfg_by_path(self.sys_cfg,
578- ("datasource", name), {})
579+ self.ds_cfg = util.get_cfg_by_path(
580+ self.sys_cfg, ("datasource", self.dsname), {})
581 if not self.ds_cfg:
582 self.ds_cfg = {}
583
584@@ -78,6 +104,51 @@ class DataSource(object):
585 def __str__(self):
586 return type_utils.obj_name(self)
587
588+ def _get_standardized_metadata(self):
589+ """Return a dictionary of standardized metadata keys."""
590+ return {'v1': {
591+ 'local-hostname': self.get_hostname(),
592+ 'instance-id': self.get_instance_id(),
593+ 'cloud-name': self.cloud_name,
594+ 'region': self.region,
595+ 'availability-zone': self.availability_zone}}
596+
597+ def get_data(self):
598+ """Datasources implement _get_data to setup metadata and userdata_raw.
599+
600+ Minimally, the datasource should return a boolean True on success.
601+ """
602+ return_value = self._get_data()
603+ json_file = os.path.join(self.paths.run_dir, INSTANCE_JSON_FILE)
604+ if not return_value:
605+ return return_value
606+
607+ instance_data = {
608+ 'ds': {
609+ 'meta-data': self.metadata,
610+ 'user-data': self.get_userdata_raw(),
611+ 'vendor-data': self.get_vendordata_raw()}}
612+ instance_data.update(
613+ self._get_standardized_metadata())
614+ try:
615+ # Process content base64encoding unserializable values
616+ content = util.json_dumps(instance_data)
617+ # Strip base64: prefix and return base64-encoded-keys
618+ processed_data = process_base64_metadata(json.loads(content))
619+ except TypeError as e:
620+ LOG.warning('Error persisting instance-data.json: %s', str(e))
621+ return return_value
622+ except UnicodeDecodeError as e:
623+ LOG.warning('Error persisting instance-data.json: %s', str(e))
624+ return return_value
625+ write_json(json_file, processed_data, mode=0o600)
626+ return return_value
627+
628+ def _get_data(self):
629+ raise NotImplementedError(
630+ 'Subclasses of DataSource must implement _get_data which'
631+ ' sets self.metadata, vendordata_raw and userdata_raw.')
632+
633 def get_userdata(self, apply_filter=False):
634 if self.userdata is None:
635 self.userdata = self.ud_proc.process(self.get_userdata_raw())
636@@ -91,6 +162,34 @@ class DataSource(object):
637 return self.vendordata
638
639 @property
640+ def cloud_name(self):
641+ """Return lowercase cloud name as determined by the datasource.
642+
643+ Datasource can determine or define its own cloud product name in
644+ metadata.
645+ """
646+ if self._cloud_name:
647+ return self._cloud_name
648+ if self.metadata and self.metadata.get(METADATA_CLOUD_NAME_KEY):
649+ cloud_name = self.metadata.get(METADATA_CLOUD_NAME_KEY)
650+ if isinstance(cloud_name, six.string_types):
651+ self._cloud_name = cloud_name.lower()
652+ LOG.debug(
653+ 'Ignoring metadata provided key %s: non-string type %s',
654+ METADATA_CLOUD_NAME_KEY, type(cloud_name))
655+ else:
656+ self._cloud_name = self._get_cloud_name().lower()
657+ return self._cloud_name
658+
659+ def _get_cloud_name(self):
660+ """Return the datasource name as it frequently matches cloud name.
661+
662+ Should be overridden in subclasses which can run on multiple
663+ cloud names, such as DatasourceEc2.
664+ """
665+ return self.dsname
666+
667+ @property
668 def launch_index(self):
669 if not self.metadata:
670 return None
671@@ -161,8 +260,11 @@ class DataSource(object):
672
673 @property
674 def availability_zone(self):
675- return self.metadata.get('availability-zone',
676- self.metadata.get('availability_zone'))
677+ top_level_az = self.metadata.get(
678+ 'availability-zone', self.metadata.get('availability_zone'))
679+ if top_level_az:
680+ return top_level_az
681+ return self.metadata.get('placement', {}).get('availability-zone')
682
683 @property
684 def region(self):
685@@ -417,4 +519,5 @@ def list_from_depends(depends, ds_list):
686 ret_list.append(cls)
687 return ret_list
688
689+
690 # vi: ts=4 expandtab
691diff --git a/cloudinit/sources/tests/__init__.py b/cloudinit/sources/tests/__init__.py
692new file mode 100644
693index 0000000..e69de29
694--- /dev/null
695+++ b/cloudinit/sources/tests/__init__.py
696diff --git a/cloudinit/sources/tests/test_init.py b/cloudinit/sources/tests/test_init.py
697new file mode 100644
698index 0000000..af15115
699--- /dev/null
700+++ b/cloudinit/sources/tests/test_init.py
701@@ -0,0 +1,202 @@
702+# This file is part of cloud-init. See LICENSE file for license information.
703+
704+import os
705+import six
706+import stat
707+
708+from cloudinit.helpers import Paths
709+from cloudinit.sources import (
710+ INSTANCE_JSON_FILE, DataSource)
711+from cloudinit.tests.helpers import CiTestCase, skipIf
712+from cloudinit.user_data import UserDataProcessor
713+from cloudinit import util
714+
715+
716+class DataSourceTestSubclassNet(DataSource):
717+
718+ dsname = 'MyTestSubclass'
719+
720+ def __init__(self, sys_cfg, distro, paths, custom_userdata=None):
721+ super(DataSourceTestSubclassNet, self).__init__(
722+ sys_cfg, distro, paths)
723+ self._custom_userdata = custom_userdata
724+
725+ def _get_cloud_name(self):
726+ return 'SubclassCloudName'
727+
728+ def _get_data(self):
729+ self.metadata = {'availability_zone': 'myaz',
730+ 'local-hostname': 'test-subclass-hostname',
731+ 'region': 'myregion'}
732+ if self._custom_userdata:
733+ self.userdata_raw = self._custom_userdata
734+ else:
735+ self.userdata_raw = 'userdata_raw'
736+ self.vendordata_raw = 'vendordata_raw'
737+ return True
738+
739+
740+class InvalidDataSourceTestSubclassNet(DataSource):
741+ pass
742+
743+
744+class TestDataSource(CiTestCase):
745+
746+ with_logs = True
747+
748+ def setUp(self):
749+ super(TestDataSource, self).setUp()
750+ self.sys_cfg = {'datasource': {'_undef': {'key1': False}}}
751+ self.distro = 'distrotest' # generally should be a Distro object
752+ self.paths = Paths({})
753+ self.datasource = DataSource(self.sys_cfg, self.distro, self.paths)
754+
755+ def test_datasource_init(self):
756+ """DataSource initializes metadata attributes, ds_cfg and ud_proc."""
757+ self.assertEqual(self.paths, self.datasource.paths)
758+ self.assertEqual(self.sys_cfg, self.datasource.sys_cfg)
759+ self.assertEqual(self.distro, self.datasource.distro)
760+ self.assertIsNone(self.datasource.userdata)
761+ self.assertEqual({}, self.datasource.metadata)
762+ self.assertIsNone(self.datasource.userdata_raw)
763+ self.assertIsNone(self.datasource.vendordata)
764+ self.assertIsNone(self.datasource.vendordata_raw)
765+ self.assertEqual({'key1': False}, self.datasource.ds_cfg)
766+ self.assertIsInstance(self.datasource.ud_proc, UserDataProcessor)
767+
768+ def test_datasource_init_gets_ds_cfg_using_dsname(self):
769+ """Init uses DataSource.dsname for sourcing ds_cfg."""
770+ sys_cfg = {'datasource': {'MyTestSubclass': {'key2': False}}}
771+ distro = 'distrotest' # generally should be a Distro object
772+ paths = Paths({})
773+ datasource = DataSourceTestSubclassNet(sys_cfg, distro, paths)
774+ self.assertEqual({'key2': False}, datasource.ds_cfg)
775+
776+ def test_str_is_classname(self):
777+ """The string representation of the datasource is the classname."""
778+ self.assertEqual('DataSource', str(self.datasource))
779+ self.assertEqual(
780+ 'DataSourceTestSubclassNet',
781+ str(DataSourceTestSubclassNet('', '', self.paths)))
782+
783+ def test__get_data_unimplemented(self):
784+ """Raise an error when _get_data is not implemented."""
785+ with self.assertRaises(NotImplementedError) as context_manager:
786+ self.datasource.get_data()
787+ self.assertIn(
788+ 'Subclasses of DataSource must implement _get_data',
789+ str(context_manager.exception))
790+ datasource2 = InvalidDataSourceTestSubclassNet(
791+ self.sys_cfg, self.distro, self.paths)
792+ with self.assertRaises(NotImplementedError) as context_manager:
793+ datasource2.get_data()
794+ self.assertIn(
795+ 'Subclasses of DataSource must implement _get_data',
796+ str(context_manager.exception))
797+
798+ def test_get_data_calls_subclass__get_data(self):
799+ """Datasource.get_data uses the subclass' version of _get_data."""
800+ tmp = self.tmp_dir()
801+ datasource = DataSourceTestSubclassNet(
802+ self.sys_cfg, self.distro, Paths({'run_dir': tmp}))
803+ self.assertTrue(datasource.get_data())
804+ self.assertEqual(
805+ {'availability_zone': 'myaz',
806+ 'local-hostname': 'test-subclass-hostname',
807+ 'region': 'myregion'},
808+ datasource.metadata)
809+ self.assertEqual('userdata_raw', datasource.userdata_raw)
810+ self.assertEqual('vendordata_raw', datasource.vendordata_raw)
811+
812+ def test_get_data_write_json_instance_data(self):
813+ """get_data writes INSTANCE_JSON_FILE to run_dir as readonly root."""
814+ tmp = self.tmp_dir()
815+ datasource = DataSourceTestSubclassNet(
816+ self.sys_cfg, self.distro, Paths({'run_dir': tmp}))
817+ datasource.get_data()
818+ json_file = self.tmp_path(INSTANCE_JSON_FILE, tmp)
819+ content = util.load_file(json_file)
820+ expected = {
821+ 'base64-encoded-keys': [],
822+ 'v1': {
823+ 'availability-zone': 'myaz',
824+ 'cloud-name': 'subclasscloudname',
825+ 'instance-id': 'iid-datasource',
826+ 'local-hostname': 'test-subclass-hostname',
827+ 'region': 'myregion'},
828+ 'ds': {
829+ 'meta-data': {'availability_zone': 'myaz',
830+ 'local-hostname': 'test-subclass-hostname',
831+ 'region': 'myregion'},
832+ 'user-data': 'userdata_raw',
833+ 'vendor-data': 'vendordata_raw'}}
834+ self.assertEqual(expected, util.load_json(content))
835+ file_stat = os.stat(json_file)
836+ self.assertEqual(0o600, stat.S_IMODE(file_stat.st_mode))
837+
838+ def test_get_data_handles_redacted_unserializable_content(self):
839+ """get_data warns unserializable content in INSTANCE_JSON_FILE."""
840+ tmp = self.tmp_dir()
841+ datasource = DataSourceTestSubclassNet(
842+ self.sys_cfg, self.distro, Paths({'run_dir': tmp}),
843+ custom_userdata={'key1': 'val1', 'key2': {'key2.1': self.paths}})
844+ self.assertTrue(datasource.get_data())
845+ json_file = self.tmp_path(INSTANCE_JSON_FILE, tmp)
846+ content = util.load_file(json_file)
847+ expected_userdata = {
848+ 'key1': 'val1',
849+ 'key2': {
850+ 'key2.1': "Warning: redacted unserializable type <class"
851+ " 'cloudinit.helpers.Paths'>"}}
852+ instance_json = util.load_json(content)
853+ self.assertEqual(
854+ expected_userdata, instance_json['ds']['user-data'])
855+
856+ @skipIf(not six.PY3, "json serialization on <= py2.7 handles bytes")
857+ def test_get_data_base64encodes_unserializable_bytes(self):
858+ """On py3, get_data base64encodes any unserializable content."""
859+ tmp = self.tmp_dir()
860+ datasource = DataSourceTestSubclassNet(
861+ self.sys_cfg, self.distro, Paths({'run_dir': tmp}),
862+ custom_userdata={'key1': 'val1', 'key2': {'key2.1': b'\x123'}})
863+ self.assertTrue(datasource.get_data())
864+ json_file = self.tmp_path(INSTANCE_JSON_FILE, tmp)
865+ content = util.load_file(json_file)
866+ instance_json = util.load_json(content)
867+ self.assertEqual(
868+ ['ds/user-data/key2/key2.1'],
869+ instance_json['base64-encoded-keys'])
870+ self.assertEqual(
871+ {'key1': 'val1', 'key2': {'key2.1': 'EjM='}},
872+ instance_json['ds']['user-data'])
873+
874+ @skipIf(not six.PY2, "json serialization on <= py2.7 handles bytes")
875+ def test_get_data_handles_bytes_values(self):
876+ """On py2 get_data handles bytes values without having to b64encode."""
877+ tmp = self.tmp_dir()
878+ datasource = DataSourceTestSubclassNet(
879+ self.sys_cfg, self.distro, Paths({'run_dir': tmp}),
880+ custom_userdata={'key1': 'val1', 'key2': {'key2.1': b'\x123'}})
881+ self.assertTrue(datasource.get_data())
882+ json_file = self.tmp_path(INSTANCE_JSON_FILE, tmp)
883+ content = util.load_file(json_file)
884+ instance_json = util.load_json(content)
885+ self.assertEqual([], instance_json['base64-encoded-keys'])
886+ self.assertEqual(
887+ {'key1': 'val1', 'key2': {'key2.1': '\x123'}},
888+ instance_json['ds']['user-data'])
889+
890+ @skipIf(not six.PY2, "Only python2 hits UnicodeDecodeErrors on non-utf8")
891+ def test_non_utf8_encoding_logs_warning(self):
892+ """When non-utf-8 values exist in py2 instance-data is not written."""
893+ tmp = self.tmp_dir()
894+ datasource = DataSourceTestSubclassNet(
895+ self.sys_cfg, self.distro, Paths({'run_dir': tmp}),
896+ custom_userdata={'key1': 'val1', 'key2': {'key2.1': b'ab\xaadef'}})
897+ self.assertTrue(datasource.get_data())
898+ json_file = self.tmp_path(INSTANCE_JSON_FILE, tmp)
899+ self.assertFalse(os.path.exists(json_file))
900+ self.assertIn(
901+ "WARNING: Error persisting instance-data.json: 'utf8' codec can't"
902+ " decode byte 0xaa in position 2: invalid start byte",
903+ self.logs.getvalue())
904diff --git a/cloudinit/tests/helpers.py b/cloudinit/tests/helpers.py
905index e0adbda..feb884a 100644
906--- a/cloudinit/tests/helpers.py
907+++ b/cloudinit/tests/helpers.py
908@@ -3,7 +3,6 @@
909 from __future__ import print_function
910
911 import functools
912-import json
913 import logging
914 import os
915 import shutil
916@@ -104,7 +103,6 @@ class TestCase(unittest2.TestCase):
917 super(TestCase, self).setUp()
918 self.reset_global_state()
919
920-<<<<<<< cloudinit/tests/helpers.py
921 def add_patch(self, target, attr, **kwargs):
922 """Patches specified target object and sets it as attr on test
923 instance also schedules cleanup"""
924@@ -115,8 +113,6 @@ class TestCase(unittest2.TestCase):
925 self.addCleanup(m.stop)
926 setattr(self, attr, p)
927
928-=======
929->>>>>>> cloudinit/tests/helpers.py
930
931 class CiTestCase(TestCase):
932 """This is the preferred test case base class unless user
933@@ -340,12 +336,6 @@ def dir2dict(startdir, prefix=None):
934 return flist
935
936
937-def json_dumps(data):
938- # print data in nicely formatted json.
939- return json.dumps(data, indent=1, sort_keys=True,
940- separators=(',', ': '))
941-
942-
943 def wrap_and_call(prefix, mocks, func, *args, **kwargs):
944 """
945 call func(args, **kwargs) with mocks applied, then unapplies mocks
946diff --git a/cloudinit/util.py b/cloudinit/util.py
947index 6c014ba..30c5995 100644
948--- a/cloudinit/util.py
949+++ b/cloudinit/util.py
950@@ -533,15 +533,6 @@ def multi_log(text, console=True, stderr=True,
951 log.log(log_level, text)
952
953
954-def load_json(text, root_types=(dict,)):
955- decoded = json.loads(decode_binary(text))
956- if not isinstance(decoded, tuple(root_types)):
957- expected_types = ", ".join([str(t) for t in root_types])
958- raise TypeError("(%s) root types expected, got %s instead"
959- % (expected_types, type(decoded)))
960- return decoded
961-
962-
963 def is_ipv4(instr):
964 """determine if input string is a ipv4 address. return boolean."""
965 toks = instr.split('.')
966@@ -1454,7 +1445,31 @@ def ensure_dirs(dirlist, mode=0o755):
967 ensure_dir(d, mode)
968
969
970+def load_json(text, root_types=(dict,)):
971+ decoded = json.loads(decode_binary(text))
972+ if not isinstance(decoded, tuple(root_types)):
973+ expected_types = ", ".join([str(t) for t in root_types])
974+ raise TypeError("(%s) root types expected, got %s instead"
975+ % (expected_types, type(decoded)))
976+ return decoded
977+
978+
979+def json_serialize_default(_obj):
980+ """Handler for types which aren't json serializable."""
981+ try:
982+ return 'ci-b64:{0}'.format(b64e(_obj))
983+ except AttributeError:
984+ return 'Warning: redacted unserializable type {0}'.format(type(_obj))
985+
986+
987+def json_dumps(data):
988+ """Return data in nicely formatted json."""
989+ return json.dumps(data, indent=1, sort_keys=True,
990+ separators=(',', ': '), default=json_serialize_default)
991+
992+
993 def yaml_dumps(obj, explicit_start=True, explicit_end=True):
994+ """Return data in nicely formatted yaml."""
995 return yaml.safe_dump(obj,
996 line_break="\n",
997 indent=4,
998diff --git a/tests/unittests/test_datasource/test_aliyun.py b/tests/unittests/test_datasource/test_aliyun.py
999index 82ee971..714f5da 100644
1000--- a/tests/unittests/test_datasource/test_aliyun.py
1001+++ b/tests/unittests/test_datasource/test_aliyun.py
1002@@ -67,7 +67,7 @@ class TestAliYunDatasource(test_helpers.HttprettyTestCase):
1003 super(TestAliYunDatasource, self).setUp()
1004 cfg = {'datasource': {'AliYun': {'timeout': '1', 'max_wait': '1'}}}
1005 distro = {}
1006- paths = helpers.Paths({})
1007+ paths = helpers.Paths({'run_dir': self.tmp_dir()})
1008 self.ds = ay.DataSourceAliYun(cfg, distro, paths)
1009 self.metadata_address = self.ds.metadata_urls[0]
1010
1011diff --git a/tests/unittests/test_datasource/test_altcloud.py b/tests/unittests/test_datasource/test_altcloud.py
1012index a4dfb54..3253f3a 100644
1013--- a/tests/unittests/test_datasource/test_altcloud.py
1014+++ b/tests/unittests/test_datasource/test_altcloud.py
1015@@ -18,7 +18,7 @@ import tempfile
1016 from cloudinit import helpers
1017 from cloudinit import util
1018
1019-from cloudinit.tests.helpers import TestCase
1020+from cloudinit.tests.helpers import CiTestCase
1021
1022 import cloudinit.sources.DataSourceAltCloud as dsac
1023
1024@@ -97,7 +97,7 @@ def _dmi_data(expected):
1025 return _data
1026
1027
1028-class TestGetCloudType(TestCase):
1029+class TestGetCloudType(CiTestCase):
1030 '''
1031 Test to exercise method: DataSourceAltCloud.get_cloud_type()
1032 '''
1033@@ -143,14 +143,16 @@ class TestGetCloudType(TestCase):
1034 self.assertEqual('UNKNOWN', dsrc.get_cloud_type())
1035
1036
1037-class TestGetDataCloudInfoFile(TestCase):
1038+class TestGetDataCloudInfoFile(CiTestCase):
1039 '''
1040 Test to exercise method: DataSourceAltCloud.get_data()
1041 With a contrived CLOUD_INFO_FILE
1042 '''
1043 def setUp(self):
1044 '''Set up.'''
1045- self.paths = helpers.Paths({'cloud_dir': '/tmp'})
1046+ self.tmp = self.tmp_dir()
1047+ self.paths = helpers.Paths(
1048+ {'cloud_dir': self.tmp, 'run_dir': self.tmp})
1049 self.cloud_info_file = tempfile.mkstemp()[1]
1050 self.dmi_data = util.read_dmi_data
1051 dsac.CLOUD_INFO_FILE = self.cloud_info_file
1052@@ -207,14 +209,16 @@ class TestGetDataCloudInfoFile(TestCase):
1053 self.assertEqual(False, dsrc.get_data())
1054
1055
1056-class TestGetDataNoCloudInfoFile(TestCase):
1057+class TestGetDataNoCloudInfoFile(CiTestCase):
1058 '''
1059 Test to exercise method: DataSourceAltCloud.get_data()
1060 Without a CLOUD_INFO_FILE
1061 '''
1062 def setUp(self):
1063 '''Set up.'''
1064- self.paths = helpers.Paths({'cloud_dir': '/tmp'})
1065+ self.tmp = self.tmp_dir()
1066+ self.paths = helpers.Paths(
1067+ {'cloud_dir': self.tmp, 'run_dir': self.tmp})
1068 self.dmi_data = util.read_dmi_data
1069 dsac.CLOUD_INFO_FILE = \
1070 'no such file'
1071@@ -254,7 +258,7 @@ class TestGetDataNoCloudInfoFile(TestCase):
1072 self.assertEqual(False, dsrc.get_data())
1073
1074
1075-class TestUserDataRhevm(TestCase):
1076+class TestUserDataRhevm(CiTestCase):
1077 '''
1078 Test to exercise method: DataSourceAltCloud.user_data_rhevm()
1079 '''
1080@@ -320,7 +324,7 @@ class TestUserDataRhevm(TestCase):
1081 self.assertEqual(False, dsrc.user_data_rhevm())
1082
1083
1084-class TestUserDataVsphere(TestCase):
1085+class TestUserDataVsphere(CiTestCase):
1086 '''
1087 Test to exercise method: DataSourceAltCloud.user_data_vsphere()
1088 '''
1089@@ -368,7 +372,7 @@ class TestUserDataVsphere(TestCase):
1090 self.assertEqual(1, m_mount_cb.call_count)
1091
1092
1093-class TestReadUserDataCallback(TestCase):
1094+class TestReadUserDataCallback(CiTestCase):
1095 '''
1096 Test to exercise method: DataSourceAltCloud.read_user_data_callback()
1097 '''
1098diff --git a/tests/unittests/test_datasource/test_azure.py b/tests/unittests/test_datasource/test_azure.py
1099index 7cb1812..226c214 100644
1100--- a/tests/unittests/test_datasource/test_azure.py
1101+++ b/tests/unittests/test_datasource/test_azure.py
1102@@ -11,9 +11,7 @@ from cloudinit.tests.helpers import (CiTestCase, TestCase, populate_dir, mock,
1103
1104 import crypt
1105 import os
1106-import shutil
1107 import stat
1108-import tempfile
1109 import xml.etree.ElementTree as ET
1110 import yaml
1111
1112@@ -84,11 +82,11 @@ class TestAzureDataSource(CiTestCase):
1113 super(TestAzureDataSource, self).setUp()
1114 if PY26:
1115 raise SkipTest("Does not work on python 2.6")
1116- self.tmp = tempfile.mkdtemp()
1117- self.addCleanup(shutil.rmtree, self.tmp)
1118+ self.tmp = self.tmp_dir()
1119
1120 # patch cloud_dir, so our 'seed_dir' is guaranteed empty
1121- self.paths = helpers.Paths({'cloud_dir': self.tmp})
1122+ self.paths = helpers.Paths(
1123+ {'cloud_dir': self.tmp, 'run_dir': self.tmp})
1124 self.waagent_d = os.path.join(self.tmp, 'var', 'lib', 'waagent')
1125
1126 self.patches = ExitStack()
1127@@ -642,7 +640,7 @@ fdescfs /dev/fd fdescfs rw 0 0
1128 self.assertEqual(netconfig, expected_config)
1129
1130
1131-class TestAzureBounce(TestCase):
1132+class TestAzureBounce(CiTestCase):
1133
1134 def mock_out_azure_moving_parts(self):
1135 self.patches.enter_context(
1136@@ -669,10 +667,10 @@ class TestAzureBounce(TestCase):
1137
1138 def setUp(self):
1139 super(TestAzureBounce, self).setUp()
1140- self.tmp = tempfile.mkdtemp()
1141+ self.tmp = self.tmp_dir()
1142 self.waagent_d = os.path.join(self.tmp, 'var', 'lib', 'waagent')
1143- self.paths = helpers.Paths({'cloud_dir': self.tmp})
1144- self.addCleanup(shutil.rmtree, self.tmp)
1145+ self.paths = helpers.Paths(
1146+ {'cloud_dir': self.tmp, 'run_dir': self.tmp})
1147 dsaz.BUILTIN_DS_CONFIG['data_dir'] = self.waagent_d
1148 self.patches = ExitStack()
1149 self.mock_out_azure_moving_parts()
1150@@ -714,21 +712,24 @@ class TestAzureBounce(TestCase):
1151
1152 def test_disabled_bounce_does_not_change_hostname(self):
1153 cfg = {'hostname_bounce': {'policy': 'off'}}
1154- self._get_ds(self.get_ovf_env_with_dscfg('test-host', cfg)).get_data()
1155+ ds = self._get_ds(self.get_ovf_env_with_dscfg('test-host', cfg))
1156+ ds.get_data()
1157 self.assertEqual(0, self.set_hostname.call_count)
1158
1159 @mock.patch('cloudinit.sources.DataSourceAzure.perform_hostname_bounce')
1160 def test_disabled_bounce_does_not_perform_bounce(
1161 self, perform_hostname_bounce):
1162 cfg = {'hostname_bounce': {'policy': 'off'}}
1163- self._get_ds(self.get_ovf_env_with_dscfg('test-host', cfg)).get_data()
1164+ ds = self._get_ds(self.get_ovf_env_with_dscfg('test-host', cfg))
1165+ ds.get_data()
1166 self.assertEqual(0, perform_hostname_bounce.call_count)
1167
1168 def test_same_hostname_does_not_change_hostname(self):
1169 host_name = 'unchanged-host-name'
1170 self.get_hostname.return_value = host_name
1171 cfg = {'hostname_bounce': {'policy': 'yes'}}
1172- self._get_ds(self.get_ovf_env_with_dscfg(host_name, cfg)).get_data()
1173+ ds = self._get_ds(self.get_ovf_env_with_dscfg(host_name, cfg))
1174+ ds.get_data()
1175 self.assertEqual(0, self.set_hostname.call_count)
1176
1177 @mock.patch('cloudinit.sources.DataSourceAzure.perform_hostname_bounce')
1178@@ -737,7 +738,8 @@ class TestAzureBounce(TestCase):
1179 host_name = 'unchanged-host-name'
1180 self.get_hostname.return_value = host_name
1181 cfg = {'hostname_bounce': {'policy': 'yes'}}
1182- self._get_ds(self.get_ovf_env_with_dscfg(host_name, cfg)).get_data()
1183+ ds = self._get_ds(self.get_ovf_env_with_dscfg(host_name, cfg))
1184+ ds.get_data()
1185 self.assertEqual(0, perform_hostname_bounce.call_count)
1186
1187 @mock.patch('cloudinit.sources.DataSourceAzure.perform_hostname_bounce')
1188diff --git a/tests/unittests/test_datasource/test_azure_helper.py b/tests/unittests/test_datasource/test_azure_helper.py
1189index 0c468dc..b42b073 100644
1190--- a/tests/unittests/test_datasource/test_azure_helper.py
1191+++ b/tests/unittests/test_datasource/test_azure_helper.py
1192@@ -4,11 +4,7 @@ import os
1193 from textwrap import dedent
1194
1195 from cloudinit.sources.helpers import azure as azure_helper
1196-<<<<<<< tests/unittests/test_datasource/test_azure_helper.py
1197 from cloudinit.tests.helpers import CiTestCase, ExitStack, mock, populate_dir
1198-=======
1199-from cloudinit.tests.helpers import ExitStack, mock, TestCase
1200->>>>>>> tests/unittests/test_datasource/test_azure_helper.py
1201
1202 from cloudinit.sources.helpers.azure import WALinuxAgentShim as wa_shim
1203
1204diff --git a/tests/unittests/test_datasource/test_cloudsigma.py b/tests/unittests/test_datasource/test_cloudsigma.py
1205index e4c5990..f6a59b6 100644
1206--- a/tests/unittests/test_datasource/test_cloudsigma.py
1207+++ b/tests/unittests/test_datasource/test_cloudsigma.py
1208@@ -3,6 +3,7 @@
1209 import copy
1210
1211 from cloudinit.cs_utils import Cepko
1212+from cloudinit import helpers
1213 from cloudinit import sources
1214 from cloudinit.sources import DataSourceCloudSigma
1215
1216@@ -38,10 +39,12 @@ class CepkoMock(Cepko):
1217 return self
1218
1219
1220-class DataSourceCloudSigmaTest(test_helpers.TestCase):
1221+class DataSourceCloudSigmaTest(test_helpers.CiTestCase):
1222 def setUp(self):
1223 super(DataSourceCloudSigmaTest, self).setUp()
1224- self.datasource = DataSourceCloudSigma.DataSourceCloudSigma("", "", "")
1225+ self.paths = helpers.Paths({'run_dir': self.tmp_dir()})
1226+ self.datasource = DataSourceCloudSigma.DataSourceCloudSigma(
1227+ "", "", paths=self.paths)
1228 self.datasource.is_running_in_cloudsigma = lambda: True
1229 self.datasource.cepko = CepkoMock(SERVER_CONTEXT)
1230 self.datasource.get_data()
1231@@ -85,7 +88,8 @@ class DataSourceCloudSigmaTest(test_helpers.TestCase):
1232 def test_lack_of_vendor_data(self):
1233 stripped_context = copy.deepcopy(SERVER_CONTEXT)
1234 del stripped_context["vendor_data"]
1235- self.datasource = DataSourceCloudSigma.DataSourceCloudSigma("", "", "")
1236+ self.datasource = DataSourceCloudSigma.DataSourceCloudSigma(
1237+ "", "", paths=self.paths)
1238 self.datasource.cepko = CepkoMock(stripped_context)
1239 self.datasource.get_data()
1240
1241@@ -94,7 +98,8 @@ class DataSourceCloudSigmaTest(test_helpers.TestCase):
1242 def test_lack_of_cloudinit_key_in_vendor_data(self):
1243 stripped_context = copy.deepcopy(SERVER_CONTEXT)
1244 del stripped_context["vendor_data"]["cloudinit"]
1245- self.datasource = DataSourceCloudSigma.DataSourceCloudSigma("", "", "")
1246+ self.datasource = DataSourceCloudSigma.DataSourceCloudSigma(
1247+ "", "", paths=self.paths)
1248 self.datasource.cepko = CepkoMock(stripped_context)
1249 self.datasource.get_data()
1250
1251diff --git a/tests/unittests/test_datasource/test_cloudstack.py b/tests/unittests/test_datasource/test_cloudstack.py
1252index 180ebee..d6d2d6b 100644
1253--- a/tests/unittests/test_datasource/test_cloudstack.py
1254+++ b/tests/unittests/test_datasource/test_cloudstack.py
1255@@ -5,11 +5,7 @@ from cloudinit import util
1256 from cloudinit.sources.DataSourceCloudStack import (
1257 DataSourceCloudStack, get_latest_lease)
1258
1259-<<<<<<< tests/unittests/test_datasource/test_cloudstack.py
1260 from cloudinit.tests.helpers import CiTestCase, ExitStack, mock
1261-=======
1262-from cloudinit.tests.helpers import TestCase, mock, ExitStack
1263->>>>>>> tests/unittests/test_datasource/test_cloudstack.py
1264
1265 import os
1266 import time
1267@@ -37,6 +33,7 @@ class TestCloudStackPasswordFetching(CiTestCase):
1268 self.patches.enter_context(mock.patch(
1269 mod_name + '.dhcp.networkd_get_option_from_leases',
1270 get_networkd_server_address))
1271+ self.tmp = self.tmp_dir()
1272
1273 def _set_password_server_response(self, response_string):
1274 subp = mock.MagicMock(return_value=(response_string, ''))
1275@@ -47,26 +44,30 @@ class TestCloudStackPasswordFetching(CiTestCase):
1276
1277 def test_empty_password_doesnt_create_config(self):
1278 self._set_password_server_response('')
1279- ds = DataSourceCloudStack({}, None, helpers.Paths({}))
1280+ ds = DataSourceCloudStack(
1281+ {}, None, helpers.Paths({'run_dir': self.tmp}))
1282 ds.get_data()
1283 self.assertEqual({}, ds.get_config_obj())
1284
1285 def test_saved_password_doesnt_create_config(self):
1286 self._set_password_server_response('saved_password')
1287- ds = DataSourceCloudStack({}, None, helpers.Paths({}))
1288+ ds = DataSourceCloudStack(
1289+ {}, None, helpers.Paths({'run_dir': self.tmp}))
1290 ds.get_data()
1291 self.assertEqual({}, ds.get_config_obj())
1292
1293 def test_password_sets_password(self):
1294 password = 'SekritSquirrel'
1295 self._set_password_server_response(password)
1296- ds = DataSourceCloudStack({}, None, helpers.Paths({}))
1297+ ds = DataSourceCloudStack(
1298+ {}, None, helpers.Paths({'run_dir': self.tmp}))
1299 ds.get_data()
1300 self.assertEqual(password, ds.get_config_obj()['password'])
1301
1302 def test_bad_request_doesnt_stop_ds_from_working(self):
1303 self._set_password_server_response('bad_request')
1304- ds = DataSourceCloudStack({}, None, helpers.Paths({}))
1305+ ds = DataSourceCloudStack(
1306+ {}, None, helpers.Paths({'run_dir': self.tmp}))
1307 self.assertTrue(ds.get_data())
1308
1309 def assertRequestTypesSent(self, subp, expected_request_types):
1310@@ -81,14 +82,16 @@ class TestCloudStackPasswordFetching(CiTestCase):
1311 def test_valid_response_means_password_marked_as_saved(self):
1312 password = 'SekritSquirrel'
1313 subp = self._set_password_server_response(password)
1314- ds = DataSourceCloudStack({}, None, helpers.Paths({}))
1315+ ds = DataSourceCloudStack(
1316+ {}, None, helpers.Paths({'run_dir': self.tmp}))
1317 ds.get_data()
1318 self.assertRequestTypesSent(subp,
1319 ['send_my_password', 'saved_password'])
1320
1321 def _check_password_not_saved_for(self, response_string):
1322 subp = self._set_password_server_response(response_string)
1323- ds = DataSourceCloudStack({}, None, helpers.Paths({}))
1324+ ds = DataSourceCloudStack(
1325+ {}, None, helpers.Paths({'run_dir': self.tmp}))
1326 ds.get_data()
1327 self.assertRequestTypesSent(subp, ['send_my_password'])
1328
1329diff --git a/tests/unittests/test_datasource/test_configdrive.py b/tests/unittests/test_datasource/test_configdrive.py
1330index 237c189..9849788 100644
1331--- a/tests/unittests/test_datasource/test_configdrive.py
1332+++ b/tests/unittests/test_datasource/test_configdrive.py
1333@@ -725,8 +725,9 @@ class TestConvertNetworkData(TestCase):
1334
1335
1336 def cfg_ds_from_dir(seed_d):
1337+ tmp = tempfile.mkdtemp()
1338 cfg_ds = ds.DataSourceConfigDrive(settings.CFG_BUILTIN, None,
1339- helpers.Paths({}))
1340+ helpers.Paths({'run_dir': tmp}))
1341 cfg_ds.seed_dir = seed_d
1342 cfg_ds.known_macs = KNOWN_MACS.copy()
1343 if not cfg_ds.get_data():
1344diff --git a/tests/unittests/test_datasource/test_digitalocean.py b/tests/unittests/test_datasource/test_digitalocean.py
1345index f264f36..ec32173 100644
1346--- a/tests/unittests/test_datasource/test_digitalocean.py
1347+++ b/tests/unittests/test_datasource/test_digitalocean.py
1348@@ -13,7 +13,7 @@ from cloudinit import settings
1349 from cloudinit.sources import DataSourceDigitalOcean
1350 from cloudinit.sources.helpers import digitalocean
1351
1352-from cloudinit.tests.helpers import mock, TestCase
1353+from cloudinit.tests.helpers import mock, CiTestCase
1354
1355 DO_MULTIPLE_KEYS = ["ssh-rsa AAAAB3NzaC1yc2EAAAA... test1@do.co",
1356 "ssh-rsa AAAAB3NzaC1yc2EAAAA... test2@do.co"]
1357@@ -135,14 +135,17 @@ def _mock_dmi():
1358 return (True, DO_META.get('id'))
1359
1360
1361-class TestDataSourceDigitalOcean(TestCase):
1362+class TestDataSourceDigitalOcean(CiTestCase):
1363 """
1364 Test reading the meta-data
1365 """
1366+ def setUp(self):
1367+ super(TestDataSourceDigitalOcean, self).setUp()
1368+ self.tmp = self.tmp_dir()
1369
1370 def get_ds(self, get_sysinfo=_mock_dmi):
1371 ds = DataSourceDigitalOcean.DataSourceDigitalOcean(
1372- settings.CFG_BUILTIN, None, helpers.Paths({}))
1373+ settings.CFG_BUILTIN, None, helpers.Paths({'run_dir': self.tmp}))
1374 ds.use_ip4LL = False
1375 if get_sysinfo is not None:
1376 ds._get_sysinfo = get_sysinfo
1377@@ -194,7 +197,7 @@ class TestDataSourceDigitalOcean(TestCase):
1378 self.assertIsInstance(ds.get_public_ssh_keys(), list)
1379
1380
1381-class TestNetworkConvert(TestCase):
1382+class TestNetworkConvert(CiTestCase):
1383
1384 @mock.patch('cloudinit.net.get_interfaces_by_mac')
1385 def _get_networking(self, m_get_by_mac):
1386diff --git a/tests/unittests/test_datasource/test_ec2.py b/tests/unittests/test_datasource/test_ec2.py
1387index ba328ee..ba042ea 100644
1388--- a/tests/unittests/test_datasource/test_ec2.py
1389+++ b/tests/unittests/test_datasource/test_ec2.py
1390@@ -186,6 +186,7 @@ class TestEc2(test_helpers.HttprettyTestCase):
1391 super(TestEc2, self).setUp()
1392 self.datasource = ec2.DataSourceEc2
1393 self.metadata_addr = self.datasource.metadata_urls[0]
1394+ self.tmp = self.tmp_dir()
1395
1396 def data_url(self, version):
1397 """Return a metadata url based on the version provided."""
1398@@ -199,7 +200,7 @@ class TestEc2(test_helpers.HttprettyTestCase):
1399 def _setup_ds(self, sys_cfg, platform_data, md, md_version=None):
1400 self.uris = []
1401 distro = {}
1402- paths = helpers.Paths({})
1403+ paths = helpers.Paths({'run_dir': self.tmp})
1404 if sys_cfg is None:
1405 sys_cfg = {}
1406 ds = self.datasource(sys_cfg=sys_cfg, distro=distro, paths=paths)
1407diff --git a/tests/unittests/test_datasource/test_gce.py b/tests/unittests/test_datasource/test_gce.py
1408index d399ae7..82c788d 100644
1409--- a/tests/unittests/test_datasource/test_gce.py
1410+++ b/tests/unittests/test_datasource/test_gce.py
1411@@ -70,9 +70,10 @@ def _set_mock_metadata(gce_meta=None):
1412 class TestDataSourceGCE(test_helpers.HttprettyTestCase):
1413
1414 def setUp(self):
1415+ tmp = self.tmp_dir()
1416 self.ds = DataSourceGCE.DataSourceGCE(
1417 settings.CFG_BUILTIN, None,
1418- helpers.Paths({}))
1419+ helpers.Paths({'run_dir': tmp}))
1420 ppatch = self.m_platform_reports_gce = mock.patch(
1421 'cloudinit.sources.DataSourceGCE.platform_reports_gce')
1422 self.m_platform_reports_gce = ppatch.start()
1423diff --git a/tests/unittests/test_datasource/test_nocloud.py b/tests/unittests/test_datasource/test_nocloud.py
1424index fea9156..70d50de 100644
1425--- a/tests/unittests/test_datasource/test_nocloud.py
1426+++ b/tests/unittests/test_datasource/test_nocloud.py
1427@@ -3,22 +3,20 @@
1428 from cloudinit import helpers
1429 from cloudinit.sources import DataSourceNoCloud
1430 from cloudinit import util
1431-from cloudinit.tests.helpers import TestCase, populate_dir, mock, ExitStack
1432+from cloudinit.tests.helpers import CiTestCase, populate_dir, mock, ExitStack
1433
1434 import os
1435-import shutil
1436-import tempfile
1437 import textwrap
1438 import yaml
1439
1440
1441-class TestNoCloudDataSource(TestCase):
1442+class TestNoCloudDataSource(CiTestCase):
1443
1444 def setUp(self):
1445 super(TestNoCloudDataSource, self).setUp()
1446- self.tmp = tempfile.mkdtemp()
1447- self.addCleanup(shutil.rmtree, self.tmp)
1448- self.paths = helpers.Paths({'cloud_dir': self.tmp})
1449+ self.tmp = self.tmp_dir()
1450+ self.paths = helpers.Paths(
1451+ {'cloud_dir': self.tmp, 'run_dir': self.tmp})
1452
1453 self.cmdline = "root=TESTCMDLINE"
1454
1455@@ -215,7 +213,7 @@ class TestNoCloudDataSource(TestCase):
1456 self.assertNotIn(gateway, str(dsrc.network_config))
1457
1458
1459-class TestParseCommandLineData(TestCase):
1460+class TestParseCommandLineData(CiTestCase):
1461
1462 def test_parse_cmdline_data_valid(self):
1463 ds_id = "ds=nocloud"
1464diff --git a/tests/unittests/test_datasource/test_opennebula.py b/tests/unittests/test_datasource/test_opennebula.py
1465index e7d5569..2326dd5 100644
1466--- a/tests/unittests/test_datasource/test_opennebula.py
1467+++ b/tests/unittests/test_datasource/test_opennebula.py
1468@@ -3,12 +3,10 @@
1469 from cloudinit import helpers
1470 from cloudinit.sources import DataSourceOpenNebula as ds
1471 from cloudinit import util
1472-from cloudinit.tests.helpers import mock, populate_dir, TestCase
1473+from cloudinit.tests.helpers import mock, populate_dir, CiTestCase
1474
1475 import os
1476 import pwd
1477-import shutil
1478-import tempfile
1479 import unittest
1480
1481
1482@@ -36,14 +34,14 @@ PUBLIC_IP = '10.0.0.3'
1483 DS_PATH = "cloudinit.sources.DataSourceOpenNebula"
1484
1485
1486-class TestOpenNebulaDataSource(TestCase):
1487+class TestOpenNebulaDataSource(CiTestCase):
1488 parsed_user = None
1489
1490 def setUp(self):
1491 super(TestOpenNebulaDataSource, self).setUp()
1492- self.tmp = tempfile.mkdtemp()
1493- self.addCleanup(shutil.rmtree, self.tmp)
1494- self.paths = helpers.Paths({'cloud_dir': self.tmp})
1495+ self.tmp = self.tmp_dir()
1496+ self.paths = helpers.Paths(
1497+ {'cloud_dir': self.tmp, 'run_dir': self.tmp})
1498
1499 # defaults for few tests
1500 self.ds = ds.DataSourceOpenNebula
1501diff --git a/tests/unittests/test_datasource/test_openstack.py b/tests/unittests/test_datasource/test_openstack.py
1502index ed367e0..42c3155 100644
1503--- a/tests/unittests/test_datasource/test_openstack.py
1504+++ b/tests/unittests/test_datasource/test_openstack.py
1505@@ -131,6 +131,10 @@ def _read_metadata_service():
1506 class TestOpenStackDataSource(test_helpers.HttprettyTestCase):
1507 VERSION = 'latest'
1508
1509+ def setUp(self):
1510+ super(TestOpenStackDataSource, self).setUp()
1511+ self.tmp = self.tmp_dir()
1512+
1513 @hp.activate
1514 def test_successful(self):
1515 _register_uris(self.VERSION, EC2_FILES, EC2_META, OS_FILES)
1516@@ -232,7 +236,7 @@ class TestOpenStackDataSource(test_helpers.HttprettyTestCase):
1517 _register_uris(self.VERSION, EC2_FILES, EC2_META, OS_FILES)
1518 ds_os = ds.DataSourceOpenStack(settings.CFG_BUILTIN,
1519 None,
1520- helpers.Paths({}))
1521+ helpers.Paths({'run_dir': self.tmp}))
1522 self.assertIsNone(ds_os.version)
1523 found = ds_os.get_data()
1524 self.assertTrue(found)
1525@@ -256,7 +260,7 @@ class TestOpenStackDataSource(test_helpers.HttprettyTestCase):
1526 _register_uris(self.VERSION, {}, {}, os_files)
1527 ds_os = ds.DataSourceOpenStack(settings.CFG_BUILTIN,
1528 None,
1529- helpers.Paths({}))
1530+ helpers.Paths({'run_dir': self.tmp}))
1531 self.assertIsNone(ds_os.version)
1532 found = ds_os.get_data()
1533 self.assertFalse(found)
1534@@ -271,7 +275,7 @@ class TestOpenStackDataSource(test_helpers.HttprettyTestCase):
1535 _register_uris(self.VERSION, {}, {}, os_files)
1536 ds_os = ds.DataSourceOpenStack(settings.CFG_BUILTIN,
1537 None,
1538- helpers.Paths({}))
1539+ helpers.Paths({'run_dir': self.tmp}))
1540 ds_os.ds_cfg = {
1541 'max_wait': 0,
1542 'timeout': 0,
1543@@ -294,7 +298,7 @@ class TestOpenStackDataSource(test_helpers.HttprettyTestCase):
1544 _register_uris(self.VERSION, {}, {}, os_files)
1545 ds_os = ds.DataSourceOpenStack(settings.CFG_BUILTIN,
1546 None,
1547- helpers.Paths({}))
1548+ helpers.Paths({'run_dir': self.tmp}))
1549 ds_os.ds_cfg = {
1550 'max_wait': 0,
1551 'timeout': 0,
1552diff --git a/tests/unittests/test_datasource/test_scaleway.py b/tests/unittests/test_datasource/test_scaleway.py
1553index 436df9e..8dec06b 100644
1554--- a/tests/unittests/test_datasource/test_scaleway.py
1555+++ b/tests/unittests/test_datasource/test_scaleway.py
1556@@ -9,7 +9,7 @@ from cloudinit import helpers
1557 from cloudinit import settings
1558 from cloudinit.sources import DataSourceScaleway
1559
1560-from cloudinit.tests.helpers import mock, HttprettyTestCase, TestCase
1561+from cloudinit.tests.helpers import mock, HttprettyTestCase, CiTestCase
1562
1563
1564 class DataResponses(object):
1565@@ -63,7 +63,11 @@ class MetadataResponses(object):
1566 return 200, headers, json.dumps(cls.FAKE_METADATA)
1567
1568
1569-class TestOnScaleway(TestCase):
1570+class TestOnScaleway(CiTestCase):
1571+
1572+ def setUp(self):
1573+ super(TestOnScaleway, self).setUp()
1574+ self.tmp = self.tmp_dir()
1575
1576 def install_mocks(self, fake_dmi, fake_file_exists, fake_cmdline):
1577 mock, faked = fake_dmi
1578@@ -91,7 +95,7 @@ class TestOnScaleway(TestCase):
1579
1580 # When not on Scaleway, get_data() returns False.
1581 datasource = DataSourceScaleway.DataSourceScaleway(
1582- settings.CFG_BUILTIN, None, helpers.Paths({})
1583+ settings.CFG_BUILTIN, None, helpers.Paths({'run_dir': self.tmp})
1584 )
1585 self.assertFalse(datasource.get_data())
1586
1587@@ -159,8 +163,9 @@ def get_source_address_adapter(*args, **kwargs):
1588 class TestDataSourceScaleway(HttprettyTestCase):
1589
1590 def setUp(self):
1591+ tmp = self.tmp_dir()
1592 self.datasource = DataSourceScaleway.DataSourceScaleway(
1593- settings.CFG_BUILTIN, None, helpers.Paths({})
1594+ settings.CFG_BUILTIN, None, helpers.Paths({'run_dir': tmp})
1595 )
1596 super(TestDataSourceScaleway, self).setUp()
1597
1598diff --git a/tests/unittests/test_datasource/test_smartos.py b/tests/unittests/test_datasource/test_smartos.py
1599index 933d5b6..88bae5f 100644
1600--- a/tests/unittests/test_datasource/test_smartos.py
1601+++ b/tests/unittests/test_datasource/test_smartos.py
1602@@ -359,7 +359,8 @@ class TestSmartOSDataSource(FilesystemMockingTestCase):
1603
1604 self.tmp = tempfile.mkdtemp()
1605 self.addCleanup(shutil.rmtree, self.tmp)
1606- self.paths = c_helpers.Paths({'cloud_dir': self.tmp})
1607+ self.paths = c_helpers.Paths(
1608+ {'cloud_dir': self.tmp, 'run_dir': self.tmp})
1609
1610 self.legacy_user_d = os.path.join(self.tmp, 'legacy_user_tmp')
1611 os.mkdir(self.legacy_user_d)
1612diff --git a/tests/unittests/test_ds_identify.py b/tests/unittests/test_ds_identify.py
1613index 1284e75..7a920d4 100644
1614--- a/tests/unittests/test_ds_identify.py
1615+++ b/tests/unittests/test_ds_identify.py
1616@@ -7,7 +7,7 @@ from uuid import uuid4
1617 from cloudinit import safeyaml
1618 from cloudinit import util
1619 from cloudinit.tests.helpers import (
1620- CiTestCase, dir2dict, json_dumps, populate_dir)
1621+ CiTestCase, dir2dict, populate_dir)
1622
1623 UNAME_MYSYS = ("Linux bart 4.4.0-62-generic #83-Ubuntu "
1624 "SMP Wed Jan 18 14:10:15 UTC 2017 x86_64 GNU/Linux")
1625@@ -319,7 +319,7 @@ def _print_run_output(rc, out, err, cfg, files):
1626 '-- rc = %s --' % rc,
1627 '-- out --', str(out),
1628 '-- err --', str(err),
1629- '-- cfg --', json_dumps(cfg)]))
1630+ '-- cfg --', util.json_dumps(cfg)]))
1631 print('-- files --')
1632 for k, v in files.items():
1633 if "/_shwrap" in k:
1634diff --git a/tests/unittests/test_handler/test_handler_chef.py b/tests/unittests/test_handler/test_handler_chef.py
1635index 002fa19..0136a93 100644
1636--- a/tests/unittests/test_handler/test_handler_chef.py
1637+++ b/tests/unittests/test_handler/test_handler_chef.py
1638@@ -13,12 +13,8 @@ from cloudinit import helpers
1639 from cloudinit.sources import DataSourceNone
1640 from cloudinit import util
1641
1642-<<<<<<< tests/unittests/test_handler/test_handler_chef.py
1643 from cloudinit.tests.helpers import (
1644 CiTestCase, FilesystemMockingTestCase, mock, skipIf)
1645-=======
1646-from cloudinit.tests import helpers as t_help
1647->>>>>>> tests/unittests/test_handler/test_handler_chef.py
1648
1649 LOG = logging.getLogger(__name__)
1650
1651diff --git a/tests/unittests/test_handler/test_handler_lxd.py b/tests/unittests/test_handler/test_handler_lxd.py
1652index a8ce315..e0d9ab6 100644
1653--- a/tests/unittests/test_handler/test_handler_lxd.py
1654+++ b/tests/unittests/test_handler/test_handler_lxd.py
1655@@ -4,11 +4,6 @@ from cloudinit.config import cc_lxd
1656 from cloudinit.sources import DataSourceNoCloud
1657 from cloudinit import (distros, helpers, cloud)
1658 from cloudinit.tests import helpers as t_help
1659-<<<<<<< tests/unittests/test_handler/test_handler_lxd.py
1660-=======
1661-
1662-import logging
1663->>>>>>> tests/unittests/test_handler/test_handler_lxd.py
1664
1665 try:
1666 from unittest import mock
1667diff --git a/tests/unittests/test_runs/test_merge_run.py b/tests/unittests/test_runs/test_merge_run.py
1668index add9365..5d3f1ca 100644
1669--- a/tests/unittests/test_runs/test_merge_run.py
1670+++ b/tests/unittests/test_runs/test_merge_run.py
1671@@ -23,6 +23,7 @@ class TestMergeRun(helpers.FilesystemMockingTestCase):
1672 cfg = {
1673 'datasource_list': ['None'],
1674 'cloud_init_modules': ['write-files'],
1675+ 'system_info': {'paths': {'run_dir': new_root}}
1676 }
1677 ud = self.readResource('user_data.1.txt')
1678 cloud_cfg = util.yaml_dumps(cfg)
1679diff --git a/tests/unittests/test_runs/test_simple_run.py b/tests/unittests/test_runs/test_simple_run.py
1680index b8fb479..762974e 100644
1681--- a/tests/unittests/test_runs/test_simple_run.py
1682+++ b/tests/unittests/test_runs/test_simple_run.py
1683@@ -2,10 +2,10 @@
1684
1685 import os
1686
1687-from cloudinit.tests import helpers
1688
1689 from cloudinit.settings import PER_INSTANCE
1690 from cloudinit import stages
1691+from cloudinit.tests import helpers
1692 from cloudinit import util
1693
1694
1695@@ -23,6 +23,7 @@ class TestSimpleRun(helpers.FilesystemMockingTestCase):
1696 'datasource_list': ['None'],
1697 'runcmd': ['ls /etc'], # test ALL_DISTROS
1698 'spacewalk': {}, # test non-ubuntu distros module definition
1699+ 'system_info': {'paths': {'run_dir': self.new_root}},
1700 'write_files': [
1701 {
1702 'path': '/etc/blah.ini',
1703diff --git a/tests/unittests/test_vmware_config_file.py b/tests/unittests/test_vmware_config_file.py
1704index 306ee15..808d303 100644
1705--- a/tests/unittests/test_vmware_config_file.py
1706+++ b/tests/unittests/test_vmware_config_file.py
1707@@ -8,7 +8,6 @@
1708 import logging
1709 import sys
1710
1711-<<<<<<< tests/unittests/test_vmware_config_file.py
1712 from cloudinit.sources.DataSourceOVF import get_network_config_from_conf
1713 from cloudinit.sources.DataSourceOVF import read_vmware_imc
1714 from cloudinit.sources.helpers.vmware.imc.boot_proto import BootProtoEnum
1715@@ -16,11 +15,6 @@ from cloudinit.sources.helpers.vmware.imc.config import Config
1716 from cloudinit.sources.helpers.vmware.imc.config_file import ConfigFile
1717 from cloudinit.sources.helpers.vmware.imc.config_nic import gen_subnet
1718 from cloudinit.sources.helpers.vmware.imc.config_nic import NicConfigurator
1719-=======
1720-from cloudinit.sources.helpers.vmware.imc.boot_proto import BootProtoEnum
1721-from cloudinit.sources.helpers.vmware.imc.config import Config
1722-from cloudinit.sources.helpers.vmware.imc.config_file import ConfigFile
1723->>>>>>> tests/unittests/test_vmware_config_file.py
1724 from cloudinit.tests.helpers import CiTestCase
1725
1726 logging.basicConfig(level=logging.DEBUG, stream=sys.stdout)
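
The recurring change across the test files above follows a single pattern: every datasource test now builds helpers.Paths with a writable run_dir, because the unified get_data() persists instance-data.json under that directory. A minimal sketch of that setUp pattern, using DataSourceNone purely as a stand-in (the class and test names here are illustrative, not part of the branch):

    import os

    from cloudinit import helpers
    from cloudinit.sources import DataSourceNone
    from cloudinit.tests.helpers import CiTestCase

    class TestSomeDataSource(CiTestCase):
        """Illustrative only; mirrors the setUp pattern used in the diff."""

        def setUp(self):
            super(TestSomeDataSource, self).setUp()
            self.tmp = self.tmp_dir()
            # run_dir must point somewhere writable so get_data() can drop
            # instance-data.json there instead of /run/cloud-init.
            self.paths = helpers.Paths(
                {'cloud_dir': self.tmp, 'run_dir': self.tmp})

        def test_get_data_writes_instance_data(self):
            ds = DataSourceNone.DataSourceNone({}, None, self.paths)
            self.assertTrue(ds.get_data())
            # Per the branch description, get_data also writes the json blob.
            self.assertTrue(
                os.path.exists(os.path.join(self.tmp, 'instance-data.json')))
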
