Merge lp:~gocept/landscape-client/py3-package-store-reporter into lp:~landscape/landscape-client/trunk
Status: Merged
Approved by: Данило Шеган
Approved revision: 987
Merged at revision: 965
Proposed branch: lp:~gocept/landscape-client/py3-package-store-reporter
Merge into: lp:~landscape/landscape-client/trunk
Diff against target: 664 lines (+122/-118), 11 files modified:
- landscape/broker/store.py (+3/-10)
- landscape/compat.py (+0/-13)
- landscape/package/facade.py (+2/-1)
- landscape/package/reporter.py (+4/-2)
- landscape/package/skeleton.py (+6/-2)
- landscape/package/store.py (+10/-9)
- landscape/package/tests/helpers.py (+32/-17)
- landscape/package/tests/test_reporter.py (+25/-27)
- landscape/package/tests/test_store.py (+35/-35)
- landscape/schema.py (+2/-2)
- py3_ready_tests (+3/-0)
To merge this branch: bzr merge lp:~gocept/landscape-client/py3-package-store-reporter
Related bugs: (none)
| Reviewer | Review Type | Date Requested | Status |
|---|---|---|---|
| Данило Шеган (community) | | | Approve |
| 🤖 Landscape Builder | test results | | Approve |
| Daniel Havlik (community) | | | Approve |

Review via email: mp+320148@code.launchpad.net
Commit message
Support py2/3 in landscape.
Description of the change
We are on the journey to get the Bytes / Unicode story solved for Python 2/3 compatibility.
This MP considers code in landscape.
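The core discipline applied throughout the diff below is: encode text explicitly before hashing or handing it to a subprocess, and normalize sqlite BLOB rows to bytes instead of str. A minimal sketch of that pattern (the helper names here are illustrative, not the actual landscape-client API):

```python
from hashlib import sha1

def package_hash(pkg_type, name, version):
    # On Python 3, sha1() requires bytes; encode explicitly with ascii,
    # matching the implicit default conversion Python 2.7 used to apply.
    info = b"[%d %s %s]" % (
        pkg_type, name.encode("ascii"), version.encode("ascii"))
    return sha1(info).digest()

def rows_to_hash_ids(rows):
    # sqlite3 BLOB columns come back as buffer (Python 2) or memoryview
    # (Python 3); bytes() normalizes both, which is what lets the branch
    # drop the old convert_buffer_to_string() helper.
    return {bytes(row[0]): row[1] for row in rows}
```

The same idea drives the facade change (encode before `process.communicate()`) and the reporter change (decode apt output once at the boundary, keep str inside).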
- 982. By Steffen Allner: Comply with PEP-8 and coding style.
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 982
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Данило Шеган (danilo) wrote : | # |
FWIW, bpickle is now back to landscape.
- 983. By Steffen Allner: Backmerge from trunk with py2/3 bpickle.
Steffen Allner (sallner) wrote : | # |
> FWIW, bpickle is now back to landscape.
> There are conflicts in the imports now (Launchpad does not regenerate the diff when the
> target changes).
I have merged the trunk back into this branch. Should be updated now.
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 983
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Данило Шеган (danilo) wrote : | # |
Looks good: a few minor nits inline. I'll just do a test run with this before I mark it approved.
Данило Шеган (danilo) : | # |
Steffen Allner (sallner) wrote : | # |
See inline comments. I would be especially interested in a direction for the StrictVersion case.
- 984. By Steffen Allner: Remove no longer needed helper method.
- 985. By Steffen Allner: Remove superfluous assignment.
- 986. By Steffen Allner: Use undocumented parameter to ensure bytes as return value.
Steffen Allner (sallner) wrote : | # |
I cleaned up and documented the mentioned parts.
Additionally, I opened up a new MP which uses only bytes for software comparison[0].
[0] https:/
Данило Шеган (danilo) wrote : | # |
Replied inline. If you want, we can deal with the is_version_higher() fix in a follow-up branch and get this one landed, but please get a Gocept review as well.
Данило Шеган (danilo) wrote : | # |
Cool, I hadn't refreshed the inline comments for a while: there's a conflict in this branch now. Please fix the conflict and get a ~gocept review, and I'll get this branch landed.
Данило Шеган (danilo) wrote : | # |
After looking at the other branch, I guess you might want to land that one first. Then we can merge trunk here, resolve the conflict, and land this one too.
Daniel Havlik (nilo) wrote : | # |
If the conflict is fixed: +1
Steffen Allner (sallner) wrote : | # |
Then let's wait for the other branch to arrive and I can fix both afterwards.
Данило Шеган (danilo) wrote : | # |
Landed the other branch, please re-merge trunk into this one and resolve the conflict.
- 987. By Steffen Allner: Backmerge from trunk.
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Fail
Revno: 987
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Steffen Allner (sallner) wrote : | # |
It seems the same flaky test we have seen before has failed again. Danilo, do you have an idea how we could make it more robust? I can never reproduce that failure with Python 2. Maybe you want to restart it.
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 987
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Preview Diff
1 | === modified file 'landscape/broker/store.py' | |||
2 | --- landscape/broker/store.py 2017-03-17 09:26:45 +0000 | |||
3 | +++ landscape/broker/store.py 2017-03-21 17:01:15 +0000 | |||
4 | @@ -101,7 +101,7 @@ | |||
5 | 101 | 101 | ||
6 | 102 | from landscape import DEFAULT_SERVER_API | 102 | from landscape import DEFAULT_SERVER_API |
7 | 103 | from landscape.lib import bpickle | 103 | from landscape.lib import bpickle |
9 | 104 | from landscape.lib.fs import create_binary_file | 104 | from landscape.lib.fs import create_binary_file, read_binary_file |
10 | 105 | from landscape.lib.versioning import sort_versions, is_version_higher | 105 | from landscape.lib.versioning import sort_versions, is_version_higher |
11 | 106 | 106 | ||
12 | 107 | 107 | ||
13 | @@ -260,7 +260,7 @@ | |||
14 | 260 | for filename in self._walk_pending_messages(): | 260 | for filename in self._walk_pending_messages(): |
15 | 261 | if max is not None and len(messages) >= max: | 261 | if max is not None and len(messages) >= max: |
16 | 262 | break | 262 | break |
18 | 263 | data = self._get_content(self._message_dir(filename)) | 263 | data = read_binary_file(self._message_dir(filename)) |
19 | 264 | try: | 264 | try: |
20 | 265 | message = bpickle.loads(data) | 265 | message = bpickle.loads(data) |
21 | 266 | except ValueError as e: | 266 | except ValueError as e: |
22 | @@ -436,13 +436,6 @@ | |||
23 | 436 | def _message_dir(self, *args): | 436 | def _message_dir(self, *args): |
24 | 437 | return os.path.join(self._directory, *args) | 437 | return os.path.join(self._directory, *args) |
25 | 438 | 438 | ||
26 | 439 | def _get_content(self, filename): | ||
27 | 440 | file = open(filename, 'rb') | ||
28 | 441 | try: | ||
29 | 442 | return file.read() | ||
30 | 443 | finally: | ||
31 | 444 | file.close() | ||
32 | 445 | |||
33 | 446 | def _reprocess_holding(self): | 439 | def _reprocess_holding(self): |
34 | 447 | """ | 440 | """ |
35 | 448 | Unhold accepted messages left behind, and hold unaccepted | 441 | Unhold accepted messages left behind, and hold unaccepted |
36 | @@ -454,7 +447,7 @@ | |||
37 | 454 | for old_filename in self._walk_messages(): | 447 | for old_filename in self._walk_messages(): |
38 | 455 | flags = self._get_flags(old_filename) | 448 | flags = self._get_flags(old_filename) |
39 | 456 | try: | 449 | try: |
41 | 457 | message = bpickle.loads(self._get_content(old_filename)) | 450 | message = bpickle.loads(read_binary_file(old_filename)) |
42 | 458 | except ValueError as e: | 451 | except ValueError as e: |
43 | 459 | logging.exception(e) | 452 | logging.exception(e) |
44 | 460 | if HELD not in flags: | 453 | if HELD not in flags: |
45 | 461 | 454 | ||
46 | === modified file 'landscape/compat.py' | |||
47 | --- landscape/compat.py 2017-03-17 09:26:45 +0000 | |||
48 | +++ landscape/compat.py 2017-03-21 17:01:15 +0000 | |||
49 | @@ -46,16 +46,3 @@ | |||
50 | 46 | return s.decode(encoding, errors) | 46 | return s.decode(encoding, errors) |
51 | 47 | else: | 47 | else: |
52 | 48 | return s | 48 | return s |
53 | 49 | |||
54 | 50 | |||
55 | 51 | def convert_buffer_to_string(mem_view): | ||
56 | 52 | """ | ||
57 | 53 | Converts a buffer in Python 2 or a memoryview in Python 3 to str. | ||
58 | 54 | |||
59 | 55 | @param mem_view: The view to convert. | ||
60 | 56 | """ | ||
61 | 57 | if _PY3: | ||
62 | 58 | result = mem_view.decode('ascii') | ||
63 | 59 | else: | ||
64 | 60 | result = str(mem_view) | ||
65 | 61 | return result | ||
66 | 62 | 49 | ||
67 | === modified file 'landscape/package/facade.py' | |||
68 | --- landscape/package/facade.py 2017-03-13 15:38:09 +0000 | |||
69 | +++ landscape/package/facade.py 2017-03-21 17:01:15 +0000 | |||
70 | @@ -196,7 +196,8 @@ | |||
71 | 196 | process = subprocess.Popen( | 196 | process = subprocess.Popen( |
72 | 197 | ["dpkg", "--set-selections"] + self._dpkg_args, | 197 | ["dpkg", "--set-selections"] + self._dpkg_args, |
73 | 198 | stdin=subprocess.PIPE) | 198 | stdin=subprocess.PIPE) |
75 | 199 | process.communicate(selection) | 199 | # We need bytes here to communicate with the process. |
76 | 200 | process.communicate(selection.encode("utf-8")) | ||
77 | 200 | 201 | ||
78 | 201 | def set_package_hold(self, version): | 202 | def set_package_hold(self, version): |
79 | 202 | """Add a dpkg hold for a package. | 203 | """Add a dpkg hold for a package. |
80 | 203 | 204 | ||
81 | === modified file 'landscape/package/reporter.py' | |||
82 | --- landscape/package/reporter.py 2017-03-20 09:43:08 +0000 | |||
83 | +++ landscape/package/reporter.py 2017-03-21 17:01:15 +0000 | |||
84 | @@ -20,7 +20,6 @@ | |||
85 | 20 | from landscape.lib.fs import touch_file | 20 | from landscape.lib.fs import touch_file |
86 | 21 | from landscape.lib.lsb_release import parse_lsb_release, LSB_RELEASE_FILENAME | 21 | from landscape.lib.lsb_release import parse_lsb_release, LSB_RELEASE_FILENAME |
87 | 22 | 22 | ||
88 | 23 | from landscape.compat import convert_buffer_to_string | ||
89 | 24 | from landscape.package.taskhandler import ( | 23 | from landscape.package.taskhandler import ( |
90 | 25 | PackageTaskHandlerConfiguration, PackageTaskHandler, run_task_handler) | 24 | PackageTaskHandlerConfiguration, PackageTaskHandler, run_task_handler) |
91 | 26 | from landscape.package.store import UnknownHashIDRequest, FakePackageStore | 25 | from landscape.package.store import UnknownHashIDRequest, FakePackageStore |
92 | @@ -231,6 +230,9 @@ | |||
93 | 231 | self._reactor.call_later( | 230 | self._reactor.call_later( |
94 | 232 | LOCK_RETRY_DELAYS[retry], self._apt_update, deferred) | 231 | LOCK_RETRY_DELAYS[retry], self._apt_update, deferred) |
95 | 233 | out, err, code = yield deferred | 232 | out, err, code = yield deferred |
96 | 233 | out = out.decode("utf-8") | ||
97 | 234 | err = err.decode("utf-8") | ||
98 | 235 | |||
99 | 234 | timestamp = self._reactor.time() | 236 | timestamp = self._reactor.time() |
100 | 235 | 237 | ||
101 | 236 | touch_file(self._config.update_stamp_filename) | 238 | touch_file(self._config.update_stamp_filename) |
102 | @@ -730,7 +732,7 @@ | |||
103 | 730 | messages = global_store.get_messages_by_ids(not_sent) | 732 | messages = global_store.get_messages_by_ids(not_sent) |
104 | 731 | sent = [] | 733 | sent = [] |
105 | 732 | for message_id, message in messages: | 734 | for message_id, message in messages: |
107 | 733 | message = bpickle.loads(convert_buffer_to_string(message)) | 735 | message = bpickle.loads(message) |
108 | 734 | if message["type"] not in got_type: | 736 | if message["type"] not in got_type: |
109 | 735 | got_type.add(message["type"]) | 737 | got_type.add(message["type"]) |
110 | 736 | sent.append(message_id) | 738 | sent.append(message_id) |
111 | 737 | 739 | ||
112 | === modified file 'landscape/package/skeleton.py' | |||
113 | --- landscape/package/skeleton.py 2017-03-10 12:19:57 +0000 | |||
114 | +++ landscape/package/skeleton.py 2017-03-21 17:01:15 +0000 | |||
115 | @@ -50,10 +50,14 @@ | |||
116 | 50 | """ | 50 | """ |
117 | 51 | if self._hash is not None: | 51 | if self._hash is not None: |
118 | 52 | return self._hash | 52 | return self._hash |
120 | 53 | digest = sha1("[%d %s %s]" % (self.type, self.name, self.version)) | 53 | # We use ascii here as encoding for backwards compatibility as it was |
121 | 54 | # default encoding for conversion from unicode to bytes in Python 2.7. | ||
122 | 55 | package_info = b"[%d %s %s]" % ( | ||
123 | 56 | self.type, self.name.encode("ascii"), self.version.encode("ascii")) | ||
124 | 57 | digest = sha1(package_info) | ||
125 | 54 | self.relations.sort() | 58 | self.relations.sort() |
126 | 55 | for pair in self.relations: | 59 | for pair in self.relations: |
128 | 56 | digest.update("[%d %s]" % pair) | 60 | digest.update(b"[%d %s]" % (pair[0], pair[1].encode("ascii"))) |
129 | 57 | return digest.digest() | 61 | return digest.digest() |
130 | 58 | 62 | ||
131 | 59 | def set_hash(self, package_hash): | 63 | def set_hash(self, package_hash): |
132 | 60 | 64 | ||
133 | === modified file 'landscape/package/store.py' | |||
134 | --- landscape/package/store.py 2017-03-17 09:26:45 +0000 | |||
135 | +++ landscape/package/store.py 2017-03-21 17:01:15 +0000 | |||
136 | @@ -7,9 +7,7 @@ | |||
137 | 7 | from pysqlite2 import dbapi2 as sqlite3 | 7 | from pysqlite2 import dbapi2 as sqlite3 |
138 | 8 | 8 | ||
139 | 9 | from twisted.python.compat import iteritems, long | 9 | from twisted.python.compat import iteritems, long |
140 | 10 | from twisted.python.compat import StringType as basestring | ||
141 | 11 | 10 | ||
142 | 12 | from landscape.compat import convert_buffer_to_string | ||
143 | 13 | from landscape.lib import bpickle | 11 | from landscape.lib import bpickle |
144 | 14 | from landscape.lib.store import with_cursor | 12 | from landscape.lib.store import with_cursor |
145 | 15 | 13 | ||
146 | @@ -50,7 +48,10 @@ | |||
147 | 50 | 48 | ||
148 | 51 | @with_cursor | 49 | @with_cursor |
149 | 52 | def get_hash_id(self, cursor, hash): | 50 | def get_hash_id(self, cursor, hash): |
151 | 53 | """Return the id associated to C{hash}, or C{None} if not available.""" | 51 | """Return the id associated to C{hash}, or C{None} if not available. |
152 | 52 | |||
153 | 53 | @param hash: a C{bytes} representing a hash. | ||
154 | 54 | """ | ||
155 | 54 | cursor.execute("SELECT id FROM hash WHERE hash=?", | 55 | cursor.execute("SELECT id FROM hash WHERE hash=?", |
156 | 55 | (sqlite3.Binary(hash),)) | 56 | (sqlite3.Binary(hash),)) |
157 | 56 | value = cursor.fetchone() | 57 | value = cursor.fetchone() |
158 | @@ -62,7 +63,7 @@ | |||
159 | 62 | def get_hash_ids(self, cursor): | 63 | def get_hash_ids(self, cursor): |
160 | 63 | """Return a C{dict} holding all the available hash=>id mappings.""" | 64 | """Return a C{dict} holding all the available hash=>id mappings.""" |
161 | 64 | cursor.execute("SELECT hash, id FROM hash") | 65 | cursor.execute("SELECT hash, id FROM hash") |
163 | 65 | return dict([(str(row[0]), row[1]) for row in cursor.fetchall()]) | 66 | return {bytes(row[0]): row[1] for row in cursor.fetchall()} |
164 | 66 | 67 | ||
165 | 67 | @with_cursor | 68 | @with_cursor |
166 | 68 | def get_id_hash(self, cursor, id): | 69 | def get_id_hash(self, cursor, id): |
167 | @@ -71,7 +72,7 @@ | |||
168 | 71 | cursor.execute("SELECT hash FROM hash WHERE id=?", (id,)) | 72 | cursor.execute("SELECT hash FROM hash WHERE id=?", (id,)) |
169 | 72 | value = cursor.fetchone() | 73 | value = cursor.fetchone() |
170 | 73 | if value: | 74 | if value: |
172 | 74 | return str(value[0]) | 75 | return bytes(value[0]) |
173 | 75 | return None | 76 | return None |
174 | 76 | 77 | ||
175 | 77 | @with_cursor | 78 | @with_cursor |
176 | @@ -149,7 +150,7 @@ | |||
177 | 149 | the attached lookaside databases, falling back to the main one, as | 150 | the attached lookaside databases, falling back to the main one, as |
178 | 150 | described in L{add_hash_id_db}. | 151 | described in L{add_hash_id_db}. |
179 | 151 | """ | 152 | """ |
181 | 152 | assert isinstance(hash, basestring) | 153 | assert isinstance(hash, bytes) |
182 | 153 | 154 | ||
183 | 154 | # Check if we can find the hash=>id mapping in the lookaside stores | 155 | # Check if we can find the hash=>id mapping in the lookaside stores |
184 | 155 | for store in self._hash_id_stores: | 156 | for store in self._hash_id_stores: |
185 | @@ -332,7 +333,7 @@ | |||
186 | 332 | result = cursor.execute( | 333 | result = cursor.execute( |
187 | 333 | "SELECT id, data FROM message WHERE id IN (%s) " | 334 | "SELECT id, data FROM message WHERE id IN (%s) " |
188 | 334 | "ORDER BY id" % params, tuple(message_ids)).fetchall() | 335 | "ORDER BY id" % params, tuple(message_ids)).fetchall() |
190 | 335 | return [(row[0], row[1]) for row in result] | 336 | return [(row[0], bytes(row[1])) for row in result] |
191 | 336 | 337 | ||
192 | 337 | 338 | ||
193 | 338 | class HashIDRequest(object): | 339 | class HashIDRequest(object): |
194 | @@ -346,7 +347,7 @@ | |||
195 | 346 | def hashes(self, cursor): | 347 | def hashes(self, cursor): |
196 | 347 | cursor.execute("SELECT hashes FROM hash_id_request WHERE id=?", | 348 | cursor.execute("SELECT hashes FROM hash_id_request WHERE id=?", |
197 | 348 | (self.id,)) | 349 | (self.id,)) |
199 | 349 | return bpickle.loads(convert_buffer_to_string(cursor.fetchone()[0])) | 350 | return bpickle.loads(bytes(cursor.fetchone()[0])) |
200 | 350 | 351 | ||
201 | 351 | @with_cursor | 352 | @with_cursor |
202 | 352 | def _get_timestamp(self, cursor): | 353 | def _get_timestamp(self, cursor): |
203 | @@ -395,7 +396,7 @@ | |||
204 | 395 | 396 | ||
205 | 396 | self.queue = row[0] | 397 | self.queue = row[0] |
206 | 397 | self.timestamp = row[1] | 398 | self.timestamp = row[1] |
208 | 398 | self.data = bpickle.loads(convert_buffer_to_string(row[2])) | 399 | self.data = bpickle.loads(bytes(row[2])) |
209 | 399 | 400 | ||
210 | 400 | @with_cursor | 401 | @with_cursor |
211 | 401 | def remove(self, cursor): | 402 | def remove(self, cursor): |
212 | 402 | 403 | ||
213 | === modified file 'landscape/package/tests/helpers.py' | |||
214 | --- landscape/package/tests/helpers.py 2017-03-13 15:15:46 +0000 | |||
215 | +++ landscape/package/tests/helpers.py 2017-03-21 17:01:15 +0000 | |||
216 | @@ -6,7 +6,7 @@ | |||
217 | 6 | import apt_inst | 6 | import apt_inst |
218 | 7 | import apt_pkg | 7 | import apt_pkg |
219 | 8 | 8 | ||
221 | 9 | from landscape.lib.fs import append_binary_file, append_text_file | 9 | from landscape.lib.fs import append_binary_file |
222 | 10 | from landscape.lib.fs import create_binary_file | 10 | from landscape.lib.fs import create_binary_file |
223 | 11 | from landscape.package.facade import AptFacade | 11 | from landscape.package.facade import AptFacade |
224 | 12 | 12 | ||
225 | @@ -49,11 +49,17 @@ | |||
226 | 49 | """ % { | 49 | """ % { |
227 | 50 | "name": name, "version": version, | 50 | "name": name, "version": version, |
228 | 51 | "architecture": architecture, | 51 | "architecture": architecture, |
230 | 52 | "description": description.encode("utf-8")}) | 52 | "description": description}).encode("utf-8") |
231 | 53 | # We want to re-order the TagSection, but it requires bytes as input. | ||
232 | 54 | # As we also want to write a binary file, we have to explicitly pass | ||
233 | 55 | # the hardly documented `bytes=True` to TagSection as it would be | ||
234 | 56 | # returned as unicode in Python 3 otherwise. In future versions of | ||
235 | 57 | # apt_pkg there should be a TagSection.write() which is recommended. | ||
236 | 53 | package_stanza = apt_pkg.rewrite_section( | 58 | package_stanza = apt_pkg.rewrite_section( |
240 | 54 | apt_pkg.TagSection(package_stanza), apt_pkg.REWRITE_PACKAGE_ORDER, | 59 | apt_pkg.TagSection(package_stanza, bytes=True), |
241 | 55 | control_fields.items()) | 60 | apt_pkg.REWRITE_PACKAGE_ORDER, |
242 | 56 | append_binary_file(packages_file, "\n" + package_stanza + "\n") | 61 | list(control_fields.items())) |
243 | 62 | append_binary_file(packages_file, b"\n" + package_stanza + b"\n") | ||
244 | 57 | 63 | ||
245 | 58 | def _add_system_package(self, name, architecture="all", version="1.0", | 64 | def _add_system_package(self, name, architecture="all", version="1.0", |
246 | 59 | control_fields=None): | 65 | control_fields=None): |
247 | @@ -72,9 +78,9 @@ | |||
248 | 72 | control = deb.control.extractdata("control") | 78 | control = deb.control.extractdata("control") |
249 | 73 | deb_file.close() | 79 | deb_file.close() |
250 | 74 | lines = control.splitlines() | 80 | lines = control.splitlines() |
254 | 75 | lines.insert(1, "Status: install ok installed") | 81 | lines.insert(1, b"Status: install ok installed") |
255 | 76 | status = "\n".join(lines) | 82 | status = b"\n".join(lines) |
256 | 77 | append_text_file(self.dpkg_status, status + "\n\n") | 83 | append_binary_file(self.dpkg_status, status + b"\n\n") |
257 | 78 | 84 | ||
258 | 79 | def _add_package_to_deb_dir(self, path, name, architecture="all", | 85 | def _add_package_to_deb_dir(self, path, name, architecture="all", |
259 | 80 | version="1.0", description="description", | 86 | version="1.0", description="description", |
260 | @@ -161,7 +167,8 @@ | |||
261 | 161 | "Mzepwz6J7y5jpkIOH6sDKssF1rmUqYzBX2piZj9zyFad5RHv8dLoXsqua2spF3v+PQ" | 167 | "Mzepwz6J7y5jpkIOH6sDKssF1rmUqYzBX2piZj9zyFad5RHv8dLoXsqua2spF3v+PQ" |
262 | 162 | "ffXIlN8aYepsu3x2u0202VX+QFC10st6vvMfDdacgtdzKtpe5G5tuFYx5elcpXm27O" | 168 | "ffXIlN8aYepsu3x2u0202VX+QFC10st6vvMfDdacgtdzKtpe5G5tuFYx5elcpXm27O" |
263 | 163 | "d8LH7Oj3mqP7VgD8P6dTmJ33dsPnpuBnPO3SvLDNlu6ay9It6yZon0BIZRMApGwSgY" | 169 | "d8LH7Oj3mqP7VgD8P6dTmJ33dsPnpuBnPO3SvLDNlu6ay9It6yZon0BIZRMApGwSgY" |
265 | 164 | "BaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFo2AUjIJRMApGAUkAADhX8vgAKAAA ") | 170 | "BaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFo2AUjIJRMApGAUkAADhX8vgAKAAA " |
266 | 171 | ).encode("ascii") | ||
267 | 165 | 172 | ||
268 | 166 | PKGDEB2 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTE2NjExNDUyMiAgMCAgICAgMCAgICAgMT" | 173 | PKGDEB2 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTE2NjExNDUyMiAgMCAgICAgMCAgICAgMT" |
269 | 167 | "AwNjQ0ICA0ICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDExNjYxMTQ1MjIg" | 174 | "AwNjQ0ICA0ICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDExNjYxMTQ1MjIg" |
270 | @@ -183,7 +190,8 @@ | |||
271 | 183 | "jR45xB99RGrkMGEq4Pbf0L3UWDL4XIRIk6Hjx7Urzj6SSxS/YTzKbu28sqe/64oPmF" | 190 | "jR45xB99RGrkMGEq4Pbf0L3UWDL4XIRIk6Hjx7Urzj6SSxS/YTzKbu28sqe/64oPmF" |
272 | 184 | "JGPj3lqR1cLMdz12u04rLHp/gM2y0mv3HOc/GqxvCl7PqWh7kbux6VrFk69zlefZsu" | 191 | "JGPj3lqR1cLMdz12u04rLHp/gM2y0mv3HOc/GqxvCl7PqWh7kbux6VrFk69zlefZsu" |
273 | 185 | "v5WPycH/NUv7VgF8N6vfeBcgXp3NlnBFNDw5eZsd1as/aK+JzyvZ0TGEbBKBgFo2AU" | 192 | "v5WPycH/NUv7VgF8N6vfeBcgXp3NlnBFNDw5eZsd1as/aK+JzyvZ0TGEbBKBgFo2AU" |
275 | 186 | "jIJRMApGwSgYBaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFJAEAu4OlKQAoAAAK") | 193 | "jIJRMApGwSgYBaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFJAEAu4OlKQAoAAAK" |
276 | 194 | ).encode("ascii") | ||
277 | 187 | 195 | ||
278 | 188 | PKGDEB3 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTE2OTE0ODIwMyAgMCAgICAgMCAgICAgMT" | 196 | PKGDEB3 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTE2OTE0ODIwMyAgMCAgICAgMCAgICAgMT" |
279 | 189 | "AwNjQ0ICA0ICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDExNjkxNDgyMDMg" | 197 | "AwNjQ0ICA0ICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDExNjkxNDgyMDMg" |
280 | @@ -206,7 +214,8 @@ | |||
281 | 206 | "bOTd7zh0Xz0y5bdGmDrbLp/dbhNtdpU/EFSt9LKe7/xHgzWn4PWcirYXuVsbrlVMeT" | 214 | "bOTd7zh0Xz0y5bdGmDrbLp/dbhNtdpU/EFSt9LKe7/xHgzWn4PWcirYXuVsbrlVMeT" |
282 | 207 | "pXaZ4t+zkfi5/zY57qTy3Yw7B+XU7g+8L07rmG7Fe2bVxmyHZLZ+0V8Sl2Xj8mMIyC" | 215 | "pXaZ4t+zkfi5/zY57qTy3Yw7B+XU7g+8L07rmG7Fe2bVxmyHZLZ+0V8Sl2Xj8mMIyC" |
283 | 208 | "UTAKRsEoGAWjYBSMglEwCkbBKBgFo2AUjIJRMApGwSgYBaNgFIyCUTAKSAIAY/FOKA" | 216 | "UTAKRsEoGAWjYBSMglEwCkbBKBgFo2AUjIJRMApGwSgYBaNgFIyCUTAKSAIAY/FOKA" |
285 | 209 | "AoAAAK") | 217 | "AoAAAK" |
286 | 218 | ).encode("ascii") | ||
287 | 210 | 219 | ||
288 | 211 | PKGDEB4 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTI3NjUxMTU3OC41MCAgICAgMCAgICAgNj" | 220 | PKGDEB4 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTI3NjUxMTU3OC41MCAgICAgMCAgICAgNj" |
289 | 212 | "Q0ICAgICA0\nICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDEyNzY1MTE1Nz" | 221 | "Q0ICAgICA0\nICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDEyNzY1MTE1Nz" |
290 | @@ -221,7 +230,8 @@ | |||
291 | 221 | "ICAgYAofiwgAWgUWTAL/7dFBCsMgEEDRWfcUniCZ\nsU57kJ5ASJdFSOz9K9kULLQr" | 230 | "ICAgYAofiwgAWgUWTAL/7dFBCsMgEEDRWfcUniCZ\nsU57kJ5ASJdFSOz9K9kULLQr" |
292 | 222 | "C4H/NiPqQvnTLMNpc3XfZ9PPfW2W1JOae9s3i5okuPzBc6t5bU9Z\nS6nf7v067z93" | 231 | "C4H/NiPqQvnTLMNpc3XfZ9PPfW2W1JOae9s3i5okuPzBc6t5bU9Z\nS6nf7v067z93" |
293 | 223 | "ENO8lcd9fP/LZ/d3f4td/6h+lqD0H+7W6ocl13wSAAAAAAAAAAAAAAAAAAfzAqr5\n" | 232 | "ENO8lcd9fP/LZ/d3f4td/6h+lqD0H+7W6ocl13wSAAAAAAAAAAAAAAAAAAfzAqr5\n" |
295 | 224 | "GFYAKAAACg==\n") | 233 | "GFYAKAAACg==\n" |
296 | 234 | ).encode("ascii") | ||
297 | 225 | 235 | ||
298 | 226 | PKGDEB_MINIMAL = ( | 236 | PKGDEB_MINIMAL = ( |
299 | 227 | "ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTMxNzg5MDQ3OSAgMCAgICAgMCAgICAgMTAwNj" | 237 | "ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTMxNzg5MDQ3OSAgMCAgICAgMCAgICAgMTAwNj" |
300 | @@ -234,7 +244,8 @@ | |||
301 | 234 | "AAAAAAAAAAAAAAAAAAAAAMBF70s1/foAKAAAZGF0YS50YXIu Z3ogICAgIDEzMTc4OTA0N" | 244 | "AAAAAAAAAAAAAAAAAAAAAMBF70s1/foAKAAAZGF0YS50YXIu Z3ogICAgIDEzMTc4OTA0N" |
302 | 235 | "zkgIDAgICAgIDAgICAgIDEwMDY0NCAgMTA3ICAgICAgIGAKH4sIAAAA AAACA+3KsQ3CQB" | 245 | "zkgIDAgICAgIDAgICAgIDEwMDY0NCAgMTA3ICAgICAgIGAKH4sIAAAA AAACA+3KsQ3CQB" |
303 | 236 | "AEwCvlK4D/N4frMSGBkQz0jwmQiHCEo5lkpd09HOPv6mrMfGcbs37nR7R2Pg01" | 246 | "AEwCvlK4D/N4frMSGBkQz0jwmQiHCEo5lkpd09HOPv6mrMfGcbs37nR7R2Pg01" |
305 | 237 | "ew5r32rvNUrGDp73x7SUEpfrbZl//LZ2AAAAAAAAAAAA2NELx33R7wAoAAAK") | 247 | "ew5r32rvNUrGDp73x7SUEpfrbZl//LZ2AAAAAAAAAAAA2NELx33R7wAoAAAK" |
306 | 248 | ).encode("ascii") | ||
307 | 238 | 249 | ||
308 | 239 | PKGDEB_SIMPLE_RELATIONS = ( | 250 | PKGDEB_SIMPLE_RELATIONS = ( |
309 | 240 | "ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTMxODUxNjMyMiAgMCAgICAgMCAgICAgMTAwNj" | 251 | "ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTMxODUxNjMyMiAgMCAgICAgMCAgICAgMTAwNj" |
310 | @@ -249,7 +260,8 @@ | |||
311 | 249 | "EKgcHt1gAoAABkYXRhLnRhci5neiAgICAgMTMxODUxNjMyMiAgMCAgICAgMCAg ICAgMTA" | 260 | "EKgcHt1gAoAABkYXRhLnRhci5neiAgICAgMTMxODUxNjMyMiAgMCAgICAgMCAg ICAgMTA" |
312 | 250 | "wNjQ0ICAxMDcgICAgICAgYAofiwgAAAAAAAID7cqxDcJQEETBK8UVwH2b+64HQgIjGegf " | 261 | "wNjQ0ICAxMDcgICAgICAgYAofiwgAAAAAAAID7cqxDcJQEETBK8UVwH2b+64HQgIjGegf " |
313 | 251 | "CJCIIMLRTPKC3d0+/i6f5qpX21z52bdorR+m7Fl9imw5jhVDxQbu19txHYY4nS/r8uX3aw" | 262 | "CJCIIMLRTPKC3d0+/i6f5qpX21z52bdorR+m7Fl9imw5jhVDxQbu19txHYY4nS/r8uX3aw" |
315 | 252 | "cAAAAA AAAAAIANPQALnD6FACgAAAo=") | 263 | "cAAAAA AAAAAIANPQALnD6FACgAAAo=" |
316 | 264 | ).encode("ascii") | ||
317 | 253 | 265 | ||
318 | 254 | 266 | ||
319 | 255 | PKGDEB_VERSION_RELATIONS = ( | 267 | PKGDEB_VERSION_RELATIONS = ( |
320 | @@ -265,7 +277,8 @@ | |||
321 | 265 | "AAAACAy/sAwTtOtwAoAABkYXRhLnRhci5neiAgICAgMTMxODUxNjQ5OCAgMCAg ICAgMCA" | 277 | "AAAACAy/sAwTtOtwAoAABkYXRhLnRhci5neiAgICAgMTMxODUxNjQ5OCAgMCAg ICAgMCA" |
322 | 266 | "gICAgMTAwNjQ0ICAxMDcgICAgICAgYAofiwgAAAAAAAID7cqxEcIwEETRK0UVgCT7UD0Q " | 278 | "gICAgMTAwNjQ0ICAxMDcgICAgICAgYAofiwgAAAAAAAID7cqxEcIwEETRK0UVgCT7UD0Q " |
323 | 267 | "EpgZA/0DATNEEOHoveQHu7t9/F19GpmvtpH1s2/R2mGeemYfc9RW+9SjZGzgfr0d11LidL" | 279 | "EpgZA/0DATNEEOHoveQHu7t9/F19GpmvtpH1s2/R2mGeemYfc9RW+9SjZGzgfr0d11LidL" |
325 | 268 | "6sy5ff rx0AAAAAAAAAAAA29AD/ixlwACgAAAo=") | 280 | "6sy5ff rx0AAAAAAAAAAAA29AD/ixlwACgAAAo=" |
326 | 281 | ).encode("ascii") | ||
327 | 269 | 282 | ||
328 | 270 | 283 | ||
329 | 271 | PKGDEB_MULTIPLE_RELATIONS = ( | 284 | PKGDEB_MULTIPLE_RELATIONS = ( |
330 | @@ -282,7 +295,8 @@ | |||
331 | 282 | "0YS50YXIuZ3ogICAgIDEzMTg1ODAwNzkgIDAgICAgIDAgICAgIDEwMDY0NCAgMTA3ICAg " | 295 | "0YS50YXIuZ3ogICAgIDEzMTg1ODAwNzkgIDAgICAgIDAgICAgIDEwMDY0NCAgMTA3ICAg " |
332 | 283 | "ICAgIGAKH4sIAAAAAAACA+3KsRHCMBBE0StFFYBkfFY9EBKYGWP3DwTMEEGEo/eSH+wejv" | 296 | "ICAgIGAKH4sIAAAAAAACA+3KsRHCMBBE0StFFYBkfFY9EBKYGWP3DwTMEEGEo/eSH+wejv" |
333 | 284 | "F39aln vtp61s++RWvTeBpy6tmjtjqMLUrGDrb7el5Kicv1tsxffr92AAAAAAAAAAAA2NE" | 297 | "F39aln vtp61s++RWvTeBpy6tmjtjqMLUrGDrb7el5Kicv1tsxffr92AAAAAAAAAAAA2NE" |
335 | 285 | "Db6L1AQAoAAAK") | 298 | "Db6L1AQAoAAAK" |
336 | 299 | ).encode("ascii") | ||
337 | 286 | 300 | ||
338 | 287 | 301 | ||
339 | 288 | PKGDEB_OR_RELATIONS = ( | 302 | PKGDEB_OR_RELATIONS = ( |
340 | @@ -299,7 +313,8 @@ | |||
341 | 299 | "6ICAgICAxMzE3ODg4ODY5ICAwICAgICAwICAgICAxMDA2NDQgIDEwNyAgICAgICBgCh+L " | 313 | "6ICAgICAxMzE3ODg4ODY5ICAwICAgICAwICAgICAxMDA2NDQgIDEwNyAgICAgICBgCh+L " |
342 | 300 | "CAAAAAAAAgPtyrsRwjAURNFXiioAfZBcjwkJzIyB/oGAGSIc4eic5Aa7h2P8XX6Zen+3TD" | 314 | "CAAAAAAAAgPtyrsRwjAURNFXiioAfZBcjwkJzIyB/oGAGSIc4eic5Aa7h2P8XX6Zen+3TD" |
343 | 301 | "1/9yNK" | 315 | "1/9yNK" |
345 | 302 | "GadWR2ltRC651hGpxw4et/u8phTny3Vdfvy2dgAAAAAAAAAAANjRE6Lr2rEAKAAACg==") | 316 | "GadWR2ltRC651hGpxw4et/u8phTny3Vdfvy2dgAAAAAAAAAAANjRE6Lr2rEAKAAACg==" |
346 | 317 | ).encode("ascii") | ||
347 | 303 | 318 | ||
348 | 304 | 319 | ||
349 | 305 | HASH1 = base64.decodestring(b"/ezv4AefpJJ8DuYFSq4RiEHJYP4=") | 320 | HASH1 = base64.decodestring(b"/ezv4AefpJJ8DuYFSq4RiEHJYP4=") |
350 | 306 | 321 | ||
351 | === modified file 'landscape/package/tests/test_reporter.py' | |||
352 | --- landscape/package/tests/test_reporter.py 2017-03-20 09:43:08 +0000 | |||
353 | +++ landscape/package/tests/test_reporter.py 2017-03-21 17:01:15 +0000 | |||
354 | @@ -27,7 +27,6 @@ | |||
355 | 27 | LandscapeTest, BrokerServiceHelper, EnvironSaverHelper) | 27 | LandscapeTest, BrokerServiceHelper, EnvironSaverHelper) |
356 | 28 | from landscape.reactor import FakeReactor | 28 | from landscape.reactor import FakeReactor |
357 | 29 | 29 | ||
358 | 30 | from landscape.compat import convert_buffer_to_string | ||
359 | 31 | 30 | ||
360 | 32 | SAMPLE_LSB_RELEASE = "DISTRIB_CODENAME=codename\n" | 31 | SAMPLE_LSB_RELEASE = "DISTRIB_CODENAME=codename\n" |
361 | 33 | 32 | ||
362 | @@ -96,21 +95,21 @@ | |||
363 | 96 | os.chmod(self.reporter.apt_update_filename, 0o755) | 95 | os.chmod(self.reporter.apt_update_filename, 0o755) |
364 | 97 | 96 | ||
365 | 98 | def test_set_package_ids_with_all_known(self): | 97 | def test_set_package_ids_with_all_known(self): |
369 | 99 | self.store.add_hash_id_request(["hash1", "hash2"]) | 98 | self.store.add_hash_id_request([b"hash1", b"hash2"]) |
370 | 100 | request2 = self.store.add_hash_id_request(["hash3", "hash4"]) | 99 | request2 = self.store.add_hash_id_request([b"hash3", b"hash4"]) |
371 | 101 | self.store.add_hash_id_request(["hash5", "hash6"]) | 100 | self.store.add_hash_id_request([b"hash5", b"hash6"]) |
372 | 102 | 101 | ||
373 | 103 | self.store.add_task("reporter", | 102 | self.store.add_task("reporter", |
374 | 104 | {"type": "package-ids", "ids": [123, 456], | 103 | {"type": "package-ids", "ids": [123, 456], |
375 | 105 | "request-id": request2.id}) | 104 | "request-id": request2.id}) |
376 | 106 | 105 | ||
377 | 107 | def got_result(result): | 106 | def got_result(result): |
384 | 108 | self.assertEqual(self.store.get_hash_id("hash1"), None) | 107 | self.assertEqual(self.store.get_hash_id(b"hash1"), None) |
385 | 109 | self.assertEqual(self.store.get_hash_id("hash2"), None) | 108 | self.assertEqual(self.store.get_hash_id(b"hash2"), None) |
386 | 110 | self.assertEqual(self.store.get_hash_id("hash3"), 123) | 109 | self.assertEqual(self.store.get_hash_id(b"hash3"), 123) |
387 | 111 | self.assertEqual(self.store.get_hash_id("hash4"), 456) | 110 | self.assertEqual(self.store.get_hash_id(b"hash4"), 456) |
388 | 112 | self.assertEqual(self.store.get_hash_id("hash5"), None) | 111 | self.assertEqual(self.store.get_hash_id(b"hash5"), None) |
389 | 113 | self.assertEqual(self.store.get_hash_id("hash6"), None) | 112 | self.assertEqual(self.store.get_hash_id(b"hash6"), None) |
390 | 114 | 113 | ||
391 | 115 | deferred = self.reporter.handle_tasks() | 114 | deferred = self.reporter.handle_tasks() |
392 | 116 | return deferred.addCallback(got_result) | 115 | return deferred.addCallback(got_result) |
393 | @@ -129,7 +128,7 @@ | |||
394 | 129 | 128 | ||
395 | 130 | message_store.set_accepted_types(["add-packages"]) | 129 | message_store.set_accepted_types(["add-packages"]) |
396 | 131 | 130 | ||
398 | 132 | request1 = self.store.add_hash_id_request(["foo", HASH1, "bar"]) | 131 | request1 = self.store.add_hash_id_request([b"foo", HASH1, b"bar"]) |
399 | 133 | 132 | ||
400 | 134 | self.store.add_task("reporter", | 133 | self.store.add_task("reporter", |
401 | 135 | {"type": "package-ids", | 134 | {"type": "package-ids", |
402 | @@ -184,7 +183,7 @@ | |||
403 | 184 | 183 | ||
404 | 185 | message_store.set_accepted_types(["add-packages"]) | 184 | message_store.set_accepted_types(["add-packages"]) |
405 | 186 | 185 | ||
407 | 187 | request1 = self.store.add_hash_id_request(["foo", HASH1, "bar"]) | 186 | request1 = self.store.add_hash_id_request([b"foo", HASH1, b"bar"]) |
408 | 188 | 187 | ||
409 | 189 | self.store.add_task("reporter", | 188 | self.store.add_task("reporter", |
410 | 190 | {"type": "package-ids", | 189 | {"type": "package-ids", |
411 | @@ -238,7 +237,7 @@ | |||
412 | 238 | deferred = Deferred() | 237 | deferred = Deferred() |
413 | 239 | deferred.errback(Boom()) | 238 | deferred.errback(Boom()) |
414 | 240 | 239 | ||
416 | 241 | request_id = self.store.add_hash_id_request(["foo", HASH1, "bar"]).id | 240 | request_id = self.store.add_hash_id_request([b"foo", HASH1, b"bar"]).id |
417 | 242 | 241 | ||
418 | 243 | self.store.add_task("reporter", {"type": "package-ids", | 242 | self.store.add_task("reporter", {"type": "package-ids", |
419 | 244 | "ids": [123, None, 456], | 243 | "ids": [123, None, 456], |
420 | @@ -259,7 +258,7 @@ | |||
421 | 259 | return result.addCallback(got_result, send_mock) | 258 | return result.addCallback(got_result, send_mock) |
422 | 260 | 259 | ||
423 | 261 | def test_set_package_ids_removes_request_id_when_done(self): | 260 | def test_set_package_ids_removes_request_id_when_done(self): |
425 | 262 | request = self.store.add_hash_id_request(["hash1"]) | 261 | request = self.store.add_hash_id_request([b"hash1"]) |
426 | 263 | self.store.add_task("reporter", {"type": "package-ids", "ids": [123], | 262 | self.store.add_task("reporter", {"type": "package-ids", "ids": [123], |
427 | 264 | "request-id": request.id}) | 263 | "request-id": request.id}) |
428 | 265 | 264 | ||
429 | @@ -562,7 +561,7 @@ | |||
430 | 562 | self.assertTrue(self.reporter._apt_sources_have_changed()) | 561 | self.assertTrue(self.reporter._apt_sources_have_changed()) |
431 | 563 | 562 | ||
432 | 564 | def test_remove_expired_hash_id_request(self): | 563 | def test_remove_expired_hash_id_request(self): |
434 | 565 | request = self.store.add_hash_id_request(["hash1"]) | 564 | request = self.store.add_hash_id_request([b"hash1"]) |
435 | 566 | request.message_id = 9999 | 565 | request.message_id = 9999 |
436 | 567 | 566 | ||
437 | 568 | request.timestamp -= HASH_ID_REQUEST_TIMEOUT | 567 | request.timestamp -= HASH_ID_REQUEST_TIMEOUT |
438 | @@ -575,7 +574,7 @@ | |||
439 | 575 | return result.addCallback(got_result) | 574 | return result.addCallback(got_result) |
440 | 576 | 575 | ||
441 | 577 | def test_remove_expired_hash_id_request_wont_remove_before_timeout(self): | 576 | def test_remove_expired_hash_id_request_wont_remove_before_timeout(self): |
443 | 578 | request1 = self.store.add_hash_id_request(["hash1"]) | 577 | request1 = self.store.add_hash_id_request([b"hash1"]) |
444 | 579 | request1.message_id = 9999 | 578 | request1.message_id = 9999 |
445 | 580 | request1.timestamp -= HASH_ID_REQUEST_TIMEOUT / 2 | 579 | request1.timestamp -= HASH_ID_REQUEST_TIMEOUT / 2 |
446 | 581 | 580 | ||
447 | @@ -592,7 +591,7 @@ | |||
448 | 592 | return result.addCallback(got_result) | 591 | return result.addCallback(got_result) |
449 | 593 | 592 | ||
450 | 594 | def test_remove_expired_hash_id_request_updates_timestamps(self): | 593 | def test_remove_expired_hash_id_request_updates_timestamps(self): |
452 | 595 | request = self.store.add_hash_id_request(["hash1"]) | 594 | request = self.store.add_hash_id_request([b"hash1"]) |
453 | 596 | message_store = self.broker_service.message_store | 595 | message_store = self.broker_service.message_store |
454 | 597 | message_id = message_store.add({"type": "add-packages", | 596 | message_id = message_store.add({"type": "add-packages", |
455 | 598 | "packages": [], | 597 | "packages": [], |
456 | @@ -607,7 +606,7 @@ | |||
457 | 607 | return result.addCallback(got_result) | 606 | return result.addCallback(got_result) |
458 | 608 | 607 | ||
459 | 609 | def test_remove_expired_hash_id_request_removes_when_no_message_id(self): | 608 | def test_remove_expired_hash_id_request_removes_when_no_message_id(self): |
461 | 610 | request = self.store.add_hash_id_request(["hash1"]) | 609 | request = self.store.add_hash_id_request([b"hash1"]) |
462 | 611 | 610 | ||
463 | 612 | def got_result(result): | 611 | def got_result(result): |
464 | 613 | self.assertRaises(UnknownHashIDRequest, | 612 | self.assertRaises(UnknownHashIDRequest, |
465 | @@ -1305,9 +1304,9 @@ | |||
466 | 1305 | spawn_patcher = mock.patch.object(reporter, "spawn_process", | 1304 | spawn_patcher = mock.patch.object(reporter, "spawn_process", |
467 | 1306 | side_effect=[ | 1305 | side_effect=[ |
468 | 1307 | # Simulate series of failures to acquire the apt lock. | 1306 | # Simulate series of failures to acquire the apt lock. |
472 | 1308 | succeed(('', '', 100)), | 1307 | succeed((b'', b'', 100)), |
473 | 1309 | succeed(('', '', 100)), | 1308 | succeed((b'', b'', 100)), |
474 | 1310 | succeed(('', '', 100))]) | 1309 | succeed((b'', b'', 100))]) |
475 | 1311 | spawn_patcher.start() | 1310 | spawn_patcher.start() |
476 | 1312 | self.addCleanup(spawn_patcher.stop) | 1311 | self.addCleanup(spawn_patcher.stop) |
477 | 1313 | 1312 | ||
478 | @@ -1343,8 +1342,8 @@ | |||
479 | 1343 | spawn_patcher = mock.patch.object(reporter, "spawn_process", | 1342 | spawn_patcher = mock.patch.object(reporter, "spawn_process", |
480 | 1344 | side_effect=[ | 1343 | side_effect=[ |
481 | 1345 | # Simulate a failed apt lock grab then a successful one. | 1344 | # Simulate a failed apt lock grab then a successful one. |
484 | 1346 | succeed(('', '', 100)), | 1345 | succeed((b'', b'', 100)), |
485 | 1347 | succeed(('output', 'error', 0))]) | 1346 | succeed((b'output', b'error', 0))]) |
486 | 1348 | spawn_patcher.start() | 1347 | spawn_patcher.start() |
487 | 1349 | self.addCleanup(spawn_patcher.stop) | 1348 | self.addCleanup(spawn_patcher.stop) |
488 | 1350 | 1349 | ||
489 | @@ -1628,7 +1627,7 @@ | |||
490 | 1628 | return deferred | 1627 | return deferred |
491 | 1629 | 1628 | ||
492 | 1630 | @mock.patch("landscape.package.reporter.spawn_process", | 1629 | @mock.patch("landscape.package.reporter.spawn_process", |
494 | 1631 | return_value=succeed(("", "", 0))) | 1630 | return_value=succeed((b"", b"", 0))) |
495 | 1632 | def test_run_apt_update_honors_http_proxy(self, mock_spawn_process): | 1631 | def test_run_apt_update_honors_http_proxy(self, mock_spawn_process): |
496 | 1633 | """ | 1632 | """ |
497 | 1634 | The PackageReporter.run_apt_update method honors the http_proxy | 1633 | The PackageReporter.run_apt_update method honors the http_proxy |
498 | @@ -1647,7 +1646,7 @@ | |||
499 | 1647 | env={"http_proxy": "http://proxy_server:8080"}) | 1646 | env={"http_proxy": "http://proxy_server:8080"}) |
500 | 1648 | 1647 | ||
501 | 1649 | @mock.patch("landscape.package.reporter.spawn_process", | 1648 | @mock.patch("landscape.package.reporter.spawn_process", |
503 | 1650 | return_value=succeed(("", "", 0))) | 1649 | return_value=succeed((b"", b"", 0))) |
504 | 1651 | def test_run_apt_update_honors_https_proxy(self, mock_spawn_process): | 1650 | def test_run_apt_update_honors_https_proxy(self, mock_spawn_process): |
505 | 1652 | """ | 1651 | """ |
506 | 1653 | The PackageReporter.run_apt_update method honors the https_proxy | 1652 | The PackageReporter.run_apt_update method honors the https_proxy |
507 | @@ -1869,8 +1868,7 @@ | |||
508 | 1869 | "SELECT id, data FROM message").fetchall()) | 1868 | "SELECT id, data FROM message").fetchall()) |
509 | 1870 | self.assertEqual(1, len(stored)) | 1869 | self.assertEqual(1, len(stored)) |
510 | 1871 | self.assertEqual(1, stored[0][0]) | 1870 | self.assertEqual(1, stored[0][0]) |
513 | 1872 | self.assertEqual(message, | 1871 | self.assertEqual(message, bpickle.loads(bytes(stored[0][1]))) |
512 | 1873 | bpickle.loads(convert_buffer_to_string(stored[0][1]))) | ||
514 | 1874 | result.addCallback(callback) | 1872 | result.addCallback(callback) |
515 | 1875 | result.chainDeferred(deferred) | 1873 | result.chainDeferred(deferred) |
516 | 1876 | 1874 | ||
517 | 1877 | 1875 | ||
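The change at diff line 1871 drops the `convert_buffer_to_string` helper in favour of a plain `bytes()` call. A minimal sketch of why that works (the table shape and payload here are illustrative, not the real message-store schema): sqlite3 hands BLOB columns back as `buffer` objects on Python 2 and as plain `bytes` on Python 3, and `bytes()` normalizes both before the value reaches `bpickle.loads`.

```python
import sqlite3

# Illustrative only: a table shaped like the one the test queries.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE message (id INTEGER, data BLOB)")
conn.execute("INSERT INTO message VALUES (1, ?)", (b"\x00binary\xffpayload",))

stored = conn.execute("SELECT id, data FROM message").fetchall()
# On Python 2 stored[0][1] is a buffer; on Python 3 it is already bytes.
# bytes() yields a plain bytestring in both cases.
data = bytes(stored[0][1])
print(data == b"\x00binary\xffpayload")  # True on both Pythons
```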
518 | === modified file 'landscape/package/tests/test_store.py' | |||
519 | --- landscape/package/tests/test_store.py 2017-01-09 14:29:54 +0000 | |||
520 | +++ landscape/package/tests/test_store.py 2017-03-21 17:01:15 +0000 | |||
521 | @@ -19,12 +19,12 @@ | |||
522 | 19 | self.store2 = HashIdStore(self.filename) | 19 | self.store2 = HashIdStore(self.filename) |
523 | 20 | 20 | ||
524 | 21 | def test_set_and_get_hash_id(self): | 21 | def test_set_and_get_hash_id(self): |
528 | 22 | self.store1.set_hash_ids({"ha\x00sh1": 123, "ha\x00sh2": 456}) | 22 | self.store1.set_hash_ids({b"ha\x00sh1": 123, b"ha\x00sh2": 456}) |
529 | 23 | self.assertEqual(self.store1.get_hash_id("ha\x00sh1"), 123) | 23 | self.assertEqual(self.store1.get_hash_id(b"ha\x00sh1"), 123) |
530 | 24 | self.assertEqual(self.store1.get_hash_id("ha\x00sh2"), 456) | 24 | self.assertEqual(self.store1.get_hash_id(b"ha\x00sh2"), 456) |
531 | 25 | 25 | ||
532 | 26 | def test_get_hash_ids(self): | 26 | def test_get_hash_ids(self): |
534 | 27 | hash_ids = {"hash1": 123, "hash2": 456} | 27 | hash_ids = {b"hash1": 123, b"hash2": 456} |
535 | 28 | self.store1.set_hash_ids(hash_ids) | 28 | self.store1.set_hash_ids(hash_ids) |
536 | 29 | self.assertEqual(self.store1.get_hash_ids(), hash_ids) | 29 | self.assertEqual(self.store1.get_hash_ids(), hash_ids) |
537 | 30 | 30 | ||
538 | @@ -80,33 +80,33 @@ | |||
539 | 80 | self.assertEqual([None], rollbacks) | 80 | self.assertEqual([None], rollbacks) |
540 | 81 | 81 | ||
541 | 82 | def test_get_id_hash(self): | 82 | def test_get_id_hash(self): |
545 | 83 | self.store1.set_hash_ids({"hash1": 123, "hash2": 456}) | 83 | self.store1.set_hash_ids({b"hash1": 123, b"hash2": 456}) |
546 | 84 | self.assertEqual(self.store2.get_id_hash(123), "hash1") | 84 | self.assertEqual(self.store2.get_id_hash(123), b"hash1") |
547 | 85 | self.assertEqual(self.store2.get_id_hash(456), "hash2") | 85 | self.assertEqual(self.store2.get_id_hash(456), b"hash2") |
548 | 86 | 86 | ||
549 | 87 | def test_clear_hash_ids(self): | 87 | def test_clear_hash_ids(self): |
551 | 88 | self.store1.set_hash_ids({"ha\x00sh1": 123, "ha\x00sh2": 456}) | 88 | self.store1.set_hash_ids({b"ha\x00sh1": 123, b"ha\x00sh2": 456}) |
552 | 89 | self.store1.clear_hash_ids() | 89 | self.store1.clear_hash_ids() |
555 | 90 | self.assertEqual(self.store2.get_hash_id("ha\x00sh1"), None) | 90 | self.assertEqual(self.store2.get_hash_id(b"ha\x00sh1"), None) |
556 | 91 | self.assertEqual(self.store2.get_hash_id("ha\x00sh2"), None) | 91 | self.assertEqual(self.store2.get_hash_id(b"ha\x00sh2"), None) |
557 | 92 | 92 | ||
558 | 93 | def test_get_unexistent_hash(self): | 93 | def test_get_unexistent_hash(self): |
560 | 94 | self.assertEqual(self.store1.get_hash_id("hash1"), None) | 94 | self.assertEqual(self.store1.get_hash_id(b"hash1"), None) |
561 | 95 | 95 | ||
562 | 96 | def test_get_unexistent_id(self): | 96 | def test_get_unexistent_id(self): |
563 | 97 | self.assertEqual(self.store1.get_id_hash(123), None) | 97 | self.assertEqual(self.store1.get_id_hash(123), None) |
564 | 98 | 98 | ||
565 | 99 | def test_overwrite_id_hash(self): | 99 | def test_overwrite_id_hash(self): |
570 | 100 | self.store1.set_hash_ids({"hash1": 123}) | 100 | self.store1.set_hash_ids({b"hash1": 123}) |
571 | 101 | self.store2.set_hash_ids({"hash2": 123}) | 101 | self.store2.set_hash_ids({b"hash2": 123}) |
572 | 102 | self.assertEqual(self.store1.get_hash_id("hash1"), None) | 102 | self.assertEqual(self.store1.get_hash_id(b"hash1"), None) |
573 | 103 | self.assertEqual(self.store1.get_hash_id("hash2"), 123) | 103 | self.assertEqual(self.store1.get_hash_id(b"hash2"), 123) |
574 | 104 | 104 | ||
575 | 105 | def test_overwrite_hash_id(self): | 105 | def test_overwrite_hash_id(self): |
578 | 106 | self.store1.set_hash_ids({"hash1": 123}) | 106 | self.store1.set_hash_ids({b"hash1": 123}) |
579 | 107 | self.store2.set_hash_ids({"hash1": 456}) | 107 | self.store2.set_hash_ids({b"hash1": 456}) |
580 | 108 | self.assertEqual(self.store1.get_id_hash(123), None) | 108 | self.assertEqual(self.store1.get_id_hash(123), None) |
582 | 109 | self.assertEqual(self.store1.get_id_hash(456), "hash1") | 109 | self.assertEqual(self.store1.get_id_hash(456), b"hash1") |
583 | 110 | 110 | ||
584 | 111 | def test_check_sanity(self): | 111 | def test_check_sanity(self): |
585 | 112 | 112 | ||
586 | @@ -184,22 +184,22 @@ | |||
587 | 184 | 184 | ||
588 | 185 | def test_get_hash_id_using_hash_id_dbs(self): | 185 | def test_get_hash_id_using_hash_id_dbs(self): |
589 | 186 | # Without hash=>id dbs | 186 | # Without hash=>id dbs |
592 | 187 | self.assertEqual(self.store1.get_hash_id("hash1"), None) | 187 | self.assertEqual(self.store1.get_hash_id(b"hash1"), None) |
593 | 188 | self.assertEqual(self.store1.get_hash_id("hash2"), None) | 188 | self.assertEqual(self.store1.get_hash_id(b"hash2"), None) |
594 | 189 | 189 | ||
595 | 190 | # This hash=>id will be overriden | 190 | # This hash=>id will be overriden |
597 | 191 | self.store1.set_hash_ids({"hash1": 1}) | 191 | self.store1.set_hash_ids({b"hash1": 1}) |
598 | 192 | 192 | ||
599 | 193 | # Add a couple of hash=>id dbs | 193 | # Add a couple of hash=>id dbs |
604 | 194 | self.store1.add_hash_id_db(self.hash_id_db_factory({"hash1": 2, | 194 | self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash1": 2, |
605 | 195 | "hash2": 3})) | 195 | b"hash2": 3})) |
606 | 196 | self.store1.add_hash_id_db(self.hash_id_db_factory({"hash2": 4, | 196 | self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash2": 4, |
607 | 197 | "ha\x00sh1": 5})) | 197 | b"ha\x00sh1": 5})) |
608 | 198 | 198 | ||
609 | 199 | # Check look-up priorities and binary hashes | 199 | # Check look-up priorities and binary hashes |
613 | 200 | self.assertEqual(self.store1.get_hash_id("hash1"), 2) | 200 | self.assertEqual(self.store1.get_hash_id(b"hash1"), 2) |
614 | 201 | self.assertEqual(self.store1.get_hash_id("hash2"), 3) | 201 | self.assertEqual(self.store1.get_hash_id(b"hash2"), 3) |
615 | 202 | self.assertEqual(self.store1.get_hash_id("ha\x00sh1"), 5) | 202 | self.assertEqual(self.store1.get_hash_id(b"ha\x00sh1"), 5) |
616 | 203 | 203 | ||
617 | 204 | def test_get_id_hash_using_hash_id_db(self): | 204 | def test_get_id_hash_using_hash_id_db(self): |
618 | 205 | """ | 205 | """ |
619 | @@ -207,13 +207,13 @@ | |||
620 | 207 | to query them first, falling back to the regular db in case | 207 | to query them first, falling back to the regular db in case |
621 | 208 | the desired mapping is not found. | 208 | the desired mapping is not found. |
622 | 209 | """ | 209 | """ |
630 | 210 | self.store1.add_hash_id_db(self.hash_id_db_factory({"hash1": 123})) | 210 | self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash1": 123})) |
631 | 211 | self.store1.add_hash_id_db(self.hash_id_db_factory({"hash1": 999, | 211 | self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash1": 999, |
632 | 212 | "hash2": 456})) | 212 | b"hash2": 456})) |
633 | 213 | self.store1.set_hash_ids({"hash3": 789}) | 213 | self.store1.set_hash_ids({b"hash3": 789}) |
634 | 214 | self.assertEqual(self.store1.get_id_hash(123), "hash1") | 214 | self.assertEqual(self.store1.get_id_hash(123), b"hash1") |
635 | 215 | self.assertEqual(self.store1.get_id_hash(456), "hash2") | 215 | self.assertEqual(self.store1.get_id_hash(456), b"hash2") |
636 | 216 | self.assertEqual(self.store1.get_id_hash(789), "hash3") | 216 | self.assertEqual(self.store1.get_id_hash(789), b"hash3") |
637 | 217 | 217 | ||
638 | 218 | def test_add_and_get_available_packages(self): | 218 | def test_add_and_get_available_packages(self): |
639 | 219 | self.store1.add_available([1, 2]) | 219 | self.store1.add_available([1, 2]) |
640 | 220 | 220 | ||
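The store tests above now feed raw bytestrings throughout, including hashes with embedded NUL bytes such as `b"ha\x00sh1"`. A rough sketch of the underlying idea, using a simplified stand-in schema rather than the real `HashIdStore` one: package hashes are opaque binary digests, so they are bound as BLOBs and round-trip unchanged on both Python 2 and 3.

```python
import sqlite3

# Simplified stand-in for a hash=>id table (not the real store schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hash (id INTEGER PRIMARY KEY, hash BLOB UNIQUE)")
for hash_, id_ in {b"ha\x00sh1": 123, b"ha\x00sh2": 456}.items():
    conn.execute("INSERT INTO hash VALUES (?, ?)", (id_, hash_))

# Embedded NUL bytes survive because the value is bound as a BLOB;
# a decoded text value would not be a safe carrier for them on Python 3.
row = conn.execute(
    "SELECT id FROM hash WHERE hash = ?", (b"ha\x00sh1",)).fetchone()
print(row[0])  # 123
```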
641 | === modified file 'landscape/schema.py' | |||
642 | --- landscape/schema.py 2017-03-10 12:40:17 +0000 | |||
643 | +++ landscape/schema.py 2017-03-21 17:01:15 +0000 | |||
644 | @@ -67,8 +67,8 @@ | |||
645 | 67 | class Bytes(object): | 67 | class Bytes(object): |
646 | 68 | """A binary string.""" | 68 | """A binary string.""" |
647 | 69 | def coerce(self, value): | 69 | def coerce(self, value): |
650 | 70 | if not isinstance(value, str): | 70 | if not isinstance(value, bytes): |
651 | 71 | raise InvalidError("%r isn't a str" % (value,)) | 71 | raise InvalidError("%r isn't a bytestring" % (value,)) |
652 | 72 | return value | 72 | return value |
653 | 73 | 73 | ||
654 | 74 | 74 | ||
655 | 75 | 75 | ||
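The two-line change to `Bytes.coerce` is the crux of the schema update: checking `isinstance(value, bytes)` instead of `str` accepts only real bytestrings under Python 3, while being a no-op change under Python 2, where `bytes` is an alias for `str`. Reproduced here with a stub `InvalidError` so the snippet is self-contained:

```python
class InvalidError(Exception):
    """Stub for landscape.schema.InvalidError."""


class Bytes(object):
    """A binary string."""
    def coerce(self, value):
        if not isinstance(value, bytes):
            raise InvalidError("%r isn't a bytestring" % (value,))
        return value


assert Bytes().coerce(b"ha\x00sh1") == b"ha\x00sh1"
try:
    Bytes().coerce(u"text")  # unicode text is rejected on both Pythons
except InvalidError as e:
    print(e)
```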
656 | === modified file 'py3_ready_tests' | |||
657 | --- py3_ready_tests 2017-03-15 08:40:11 +0000 | |||
658 | +++ py3_ready_tests 2017-03-21 17:01:15 +0000 | |||
659 | @@ -1,2 +1,5 @@ | |||
660 | 1 | landscape.lib.tests | 1 | landscape.lib.tests |
661 | 2 | landscape.sysinfo.tests | 2 | landscape.sysinfo.tests |
662 | 3 | landscape.package.tests.test_store | ||
663 | 4 | landscape.package.tests.test_reporter | ||
664 | 5 |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 981
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https://ci.lscape.net/job/latch-test-xenial/3632/