Merge lp:~gocept/landscape-client/py3-package-store-reporter into lp:~landscape/landscape-client/trunk
Status: | Merged |
---|---|
Approved by: | Данило Шеган |
Approved revision: | 987 |
Merged at revision: | 965 |
Proposed branch: | lp:~gocept/landscape-client/py3-package-store-reporter |
Merge into: | lp:~landscape/landscape-client/trunk |
Diff against target: |
664 lines (+122/-118) 11 files modified
landscape/broker/store.py (+3/-10)
landscape/compat.py (+0/-13)
landscape/package/facade.py (+2/-1)
landscape/package/reporter.py (+4/-2)
landscape/package/skeleton.py (+6/-2)
landscape/package/store.py (+10/-9)
landscape/package/tests/helpers.py (+32/-17)
landscape/package/tests/test_reporter.py (+25/-27)
landscape/package/tests/test_store.py (+35/-35)
landscape/schema.py (+2/-2)
py3_ready_tests (+3/-0) |
To merge this branch: | bzr merge lp:~gocept/landscape-client/py3-package-store-reporter |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Данило Шеган (community) | Approve | ||
🤖 Landscape Builder | test results | Approve | |
Daniel Havlik (community) | Approve | ||
Review via email: mp+320148@code.launchpad.net |
Commit message
Support py2/3 in landscape.
Description of the change
We are on the journey to get the Bytes / Unicode story solved for Python 2/3 compatibility.
This MP considers code in landscape.
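For context, the core incompatibility this branch deals with is that Python 3 separates text (str) from binary data (bytes), while Python 2 conflated them. A minimal illustrative sketch (the values here are hypothetical, not from the branch):

```python
import hashlib

# Python 2 let you hash a text string directly; Python 3 requires bytes,
# so text must be encoded explicitly before hashing.
text = "package-name 1.0"
digest = hashlib.sha1(text.encode("ascii")).hexdigest()

# On Python 3, comparing bytes to str quietly returns False instead of
# matching, which is why hash keys must be bytes consistently everywhere.
assert b"hash1" != u"hash1" or str is bytes
```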
🤖 Landscape Builder (landscape-builder) : | # |
- 982. By Steffen Allner: Comply with PEP-8 and coding style.
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 982
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Данило Шеган (danilo) wrote : | # |
FWIW, bpickle is now back to landscape.
- 983. By Steffen Allner: Backmerge from trunk with py2/3 bpickle.
Steffen Allner (sallner) wrote : | # |
> FWIW, bpickle is now back to landscape.
> conflicts in the imports now (Launchpad does not regenerate the diff when the
> target changes).
I have merged the trunk back into this branch. Should be updated now.
🤖 Landscape Builder (landscape-builder) : | # |
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 983
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Данило Шеган (danilo) wrote : | # |
Looks good: a few minor nits inline. I'll just do a test run with this before I mark it approved.
Данило Шеган (danilo) : | # |
Steffen Allner (sallner) wrote : | # |
See inline comments. I would be especially interested in a direction for the StrictVersion case.
- 984. By Steffen Allner: Remove no longer needed helper method.
- 985. By Steffen Allner: Remove superfluous assignment.
- 986. By Steffen Allner: Use undocumented parameter to ensure bytes as return value.
Steffen Allner (sallner) wrote : | # |
I cleaned up and documented the mentioned parts.
Additionally, I opened up a new MP which uses only bytes for software comparison[0].
[0] https:/
Данило Шеган (danilo) wrote : | # |
Replied inline. If you want, we can deal with the is_version_higher() fix in a follow-up branch and get this one landed, but please get a Gocept review as well.
Данило Шеган (danilo) wrote : | # |
Cool, I hadn't refreshed the inline comments for a while: there's a conflict in this branch now. Please fix the conflict and get a ~gocept review, and I'll get this branch landed.
Данило Шеган (danilo) wrote : | # |
After looking at the other branch, I guess you might want to land that one first. Then we can merge trunk here, resolve a conflict and land this one too.
Daniel Havlik (nilo) wrote : | # |
If the conflict is fixed: +1
Steffen Allner (sallner) wrote : | # |
Then let's wait for the other branch to arrive and I can fix both afterwards.
Данило Шеган (danilo) wrote : | # |
Landed the other branch, please re-merge trunk into this one and resolve the conflict.
- 987. By Steffen Allner: Backmerge from trunk.
🤖 Landscape Builder (landscape-builder) : | # |
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Fail
Revno: 987
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Steffen Allner (sallner) wrote : | # |
It seems it is the same flaky test we have seen before. Danilo, do you have an idea how we could make it more robust? I can never reproduce that failure with Python 2. Maybe you want to restart it.
🤖 Landscape Builder (landscape-builder) : | # |
🤖 Landscape Builder (landscape-builder) wrote : | # |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 987
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https:/
Preview Diff
1 | === modified file 'landscape/broker/store.py' |
2 | --- landscape/broker/store.py 2017-03-17 09:26:45 +0000 |
3 | +++ landscape/broker/store.py 2017-03-21 17:01:15 +0000 |
4 | @@ -101,7 +101,7 @@ |
5 | |
6 | from landscape import DEFAULT_SERVER_API |
7 | from landscape.lib import bpickle |
8 | -from landscape.lib.fs import create_binary_file |
9 | +from landscape.lib.fs import create_binary_file, read_binary_file |
10 | from landscape.lib.versioning import sort_versions, is_version_higher |
11 | |
12 | |
13 | @@ -260,7 +260,7 @@ |
14 | for filename in self._walk_pending_messages(): |
15 | if max is not None and len(messages) >= max: |
16 | break |
17 | - data = self._get_content(self._message_dir(filename)) |
18 | + data = read_binary_file(self._message_dir(filename)) |
19 | try: |
20 | message = bpickle.loads(data) |
21 | except ValueError as e: |
22 | @@ -436,13 +436,6 @@ |
23 | def _message_dir(self, *args): |
24 | return os.path.join(self._directory, *args) |
25 | |
26 | - def _get_content(self, filename): |
27 | - file = open(filename, 'rb') |
28 | - try: |
29 | - return file.read() |
30 | - finally: |
31 | - file.close() |
32 | - |
33 | def _reprocess_holding(self): |
34 | """ |
35 | Unhold accepted messages left behind, and hold unaccepted |
36 | @@ -454,7 +447,7 @@ |
37 | for old_filename in self._walk_messages(): |
38 | flags = self._get_flags(old_filename) |
39 | try: |
40 | - message = bpickle.loads(self._get_content(old_filename)) |
41 | + message = bpickle.loads(read_binary_file(old_filename)) |
42 | except ValueError as e: |
43 | logging.exception(e) |
44 | if HELD not in flags: |
45 | |
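The removed `_get_content` helper is replaced by `read_binary_file` from `landscape.lib.fs`; a minimal stand-in with the same contract, for illustration only:

```python
import os
import tempfile

# Minimal stand-in for landscape.lib.fs.read_binary_file (assumption:
# the real helper has this contract - bytes in a file, bytes out).
def read_binary_file(filename):
    # The context manager replaces the old try/finally close() dance.
    with open(filename, "rb") as fd:
        return fd.read()

# Round-trip a small binary payload to show the contract.
fd, path = tempfile.mkstemp()
os.write(fd, b"\x00binary\xffdata")
os.close(fd)
data = read_binary_file(path)
os.unlink(path)
assert data == b"\x00binary\xffdata"
```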
46 | === modified file 'landscape/compat.py' |
47 | --- landscape/compat.py 2017-03-17 09:26:45 +0000 |
48 | +++ landscape/compat.py 2017-03-21 17:01:15 +0000 |
49 | @@ -46,16 +46,3 @@ |
50 | return s.decode(encoding, errors) |
51 | else: |
52 | return s |
53 | - |
54 | - |
55 | -def convert_buffer_to_string(mem_view): |
56 | - """ |
57 | - Converts a buffer in Python 2 or a memoryview in Python 3 to str. |
58 | - |
59 | - @param mem_view: The view to convert. |
60 | - """ |
61 | - if _PY3: |
62 | - result = mem_view.decode('ascii') |
63 | - else: |
64 | - result = str(mem_view) |
65 | - return result |
66 | |
67 | === modified file 'landscape/package/facade.py' |
68 | --- landscape/package/facade.py 2017-03-13 15:38:09 +0000 |
69 | +++ landscape/package/facade.py 2017-03-21 17:01:15 +0000 |
70 | @@ -196,7 +196,8 @@ |
71 | process = subprocess.Popen( |
72 | ["dpkg", "--set-selections"] + self._dpkg_args, |
73 | stdin=subprocess.PIPE) |
74 | - process.communicate(selection) |
75 | + # We need bytes here to communicate with the process. |
76 | + process.communicate(selection.encode("utf-8")) |
77 | |
78 | def set_package_hold(self, version): |
79 | """Add a dpkg hold for a package. |
80 | |
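The `communicate()` fix above can be sketched in isolation: `Popen` pipes are opened in binary mode by default, so on Python 3 they accept only bytes. This sketch substitutes a small echo child for `dpkg --set-selections`, which is not needed to demonstrate the point:

```python
import subprocess
import sys

# Stand-in for the dpkg --set-selections call: a child process that
# echoes its stdin back (dpkg itself is not required for the sketch).
echo = [sys.executable, "-c",
        "import sys; sys.stdout.write(sys.stdin.read())"]
selection = u"python-example hold\n"  # hypothetical selection text
process = subprocess.Popen(echo, stdin=subprocess.PIPE,
                           stdout=subprocess.PIPE)
# Pipes default to binary mode, so Python 3 rejects str here; encoding
# explicitly keeps the call valid on both Python 2 and 3.
out, _ = process.communicate(selection.encode("utf-8"))
assert out == b"python-example hold\n"
```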
81 | === modified file 'landscape/package/reporter.py' |
82 | --- landscape/package/reporter.py 2017-03-20 09:43:08 +0000 |
83 | +++ landscape/package/reporter.py 2017-03-21 17:01:15 +0000 |
84 | @@ -20,7 +20,6 @@ |
85 | from landscape.lib.fs import touch_file |
86 | from landscape.lib.lsb_release import parse_lsb_release, LSB_RELEASE_FILENAME |
87 | |
88 | -from landscape.compat import convert_buffer_to_string |
89 | from landscape.package.taskhandler import ( |
90 | PackageTaskHandlerConfiguration, PackageTaskHandler, run_task_handler) |
91 | from landscape.package.store import UnknownHashIDRequest, FakePackageStore |
92 | @@ -231,6 +230,9 @@ |
93 | self._reactor.call_later( |
94 | LOCK_RETRY_DELAYS[retry], self._apt_update, deferred) |
95 | out, err, code = yield deferred |
96 | + out = out.decode("utf-8") |
97 | + err = err.decode("utf-8") |
98 | + |
99 | timestamp = self._reactor.time() |
100 | |
101 | touch_file(self._config.update_stamp_filename) |
102 | @@ -730,7 +732,7 @@ |
103 | messages = global_store.get_messages_by_ids(not_sent) |
104 | sent = [] |
105 | for message_id, message in messages: |
106 | - message = bpickle.loads(convert_buffer_to_string(message)) |
107 | + message = bpickle.loads(message) |
108 | if message["type"] not in got_type: |
109 | got_type.add(message["type"]) |
110 | sent.append(message_id) |
111 | |
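The two added `decode` lines reflect that the spawned process now delivers stdout/stderr as bytes under both Pythons, so the reporter decodes once at the boundary. A sketch with illustrative values:

```python
# Hypothetical raw output as the deferred would deliver it (bytes).
raw_out, raw_err = b"Reading package lists...\n", b""
out = raw_out.decode("utf-8")
err = raw_err.decode("utf-8")
# Downstream code (logging, error reporting) can now treat the output
# as text without caring which Python is running.
assert isinstance(out, type(u"")) and isinstance(err, type(u""))
```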
112 | === modified file 'landscape/package/skeleton.py' |
113 | --- landscape/package/skeleton.py 2017-03-10 12:19:57 +0000 |
114 | +++ landscape/package/skeleton.py 2017-03-21 17:01:15 +0000 |
115 | @@ -50,10 +50,14 @@ |
116 | """ |
117 | if self._hash is not None: |
118 | return self._hash |
119 | - digest = sha1("[%d %s %s]" % (self.type, self.name, self.version)) |
120 | + # We use ascii here as encoding for backwards compatibility as it was |
121 | + # default encoding for conversion from unicode to bytes in Python 2.7. |
122 | + package_info = b"[%d %s %s]" % ( |
123 | + self.type, self.name.encode("ascii"), self.version.encode("ascii")) |
124 | + digest = sha1(package_info) |
125 | self.relations.sort() |
126 | for pair in self.relations: |
127 | - digest.update("[%d %s]" % pair) |
128 | + digest.update(b"[%d %s]" % (pair[0], pair[1].encode("ascii"))) |
129 | return digest.digest() |
130 | |
131 | def set_hash(self, package_hash): |
132 | |
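The hash computation above can be exercised standalone. The field values here are hypothetical; the point is that encoding the name and version as ASCII reproduces the byte string Python 2 hashed implicitly, so digests stay stable across the port (note that `%`-formatting of bytes requires Python 3.5+):

```python
from hashlib import sha1

# Hypothetical skeleton fields; ASCII encoding keeps the digest identical
# to the one the old Python 2 code produced from implicit str conversion.
skel_type, name, version = 131074, u"name1", u"version1-release1"
package_info = b"[%d %s %s]" % (
    skel_type, name.encode("ascii"), version.encode("ascii"))
digest = sha1(package_info)
# Relations are folded in the same way, pair by pair.
for pair in [(131074, u"prerequirename1 = prerequireversion1")]:
    digest.update(b"[%d %s]" % (pair[0], pair[1].encode("ascii")))
assert len(digest.digest()) == 20  # a sha1 digest is always 20 bytes
```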
133 | === modified file 'landscape/package/store.py' |
134 | --- landscape/package/store.py 2017-03-17 09:26:45 +0000 |
135 | +++ landscape/package/store.py 2017-03-21 17:01:15 +0000 |
136 | @@ -7,9 +7,7 @@ |
137 | from pysqlite2 import dbapi2 as sqlite3 |
138 | |
139 | from twisted.python.compat import iteritems, long |
140 | -from twisted.python.compat import StringType as basestring |
141 | |
142 | -from landscape.compat import convert_buffer_to_string |
143 | from landscape.lib import bpickle |
144 | from landscape.lib.store import with_cursor |
145 | |
146 | @@ -50,7 +48,10 @@ |
147 | |
148 | @with_cursor |
149 | def get_hash_id(self, cursor, hash): |
150 | - """Return the id associated to C{hash}, or C{None} if not available.""" |
151 | + """Return the id associated to C{hash}, or C{None} if not available. |
152 | + |
153 | + @param hash: a C{bytes} representing a hash. |
154 | + """ |
155 | cursor.execute("SELECT id FROM hash WHERE hash=?", |
156 | (sqlite3.Binary(hash),)) |
157 | value = cursor.fetchone() |
158 | @@ -62,7 +63,7 @@ |
159 | def get_hash_ids(self, cursor): |
160 | """Return a C{dict} holding all the available hash=>id mappings.""" |
161 | cursor.execute("SELECT hash, id FROM hash") |
162 | - return dict([(str(row[0]), row[1]) for row in cursor.fetchall()]) |
163 | + return {bytes(row[0]): row[1] for row in cursor.fetchall()} |
164 | |
165 | @with_cursor |
166 | def get_id_hash(self, cursor, id): |
167 | @@ -71,7 +72,7 @@ |
168 | cursor.execute("SELECT hash FROM hash WHERE id=?", (id,)) |
169 | value = cursor.fetchone() |
170 | if value: |
171 | - return str(value[0]) |
172 | + return bytes(value[0]) |
173 | return None |
174 | |
175 | @with_cursor |
176 | @@ -149,7 +150,7 @@ |
177 | the attached lookaside databases, falling back to the main one, as |
178 | described in L{add_hash_id_db}. |
179 | """ |
180 | - assert isinstance(hash, basestring) |
181 | + assert isinstance(hash, bytes) |
182 | |
183 | # Check if we can find the hash=>id mapping in the lookaside stores |
184 | for store in self._hash_id_stores: |
185 | @@ -332,7 +333,7 @@ |
186 | result = cursor.execute( |
187 | "SELECT id, data FROM message WHERE id IN (%s) " |
188 | "ORDER BY id" % params, tuple(message_ids)).fetchall() |
189 | - return [(row[0], row[1]) for row in result] |
190 | + return [(row[0], bytes(row[1])) for row in result] |
191 | |
192 | |
193 | class HashIDRequest(object): |
194 | @@ -346,7 +347,7 @@ |
195 | def hashes(self, cursor): |
196 | cursor.execute("SELECT hashes FROM hash_id_request WHERE id=?", |
197 | (self.id,)) |
198 | - return bpickle.loads(convert_buffer_to_string(cursor.fetchone()[0])) |
199 | + return bpickle.loads(bytes(cursor.fetchone()[0])) |
200 | |
201 | @with_cursor |
202 | def _get_timestamp(self, cursor): |
203 | @@ -395,7 +396,7 @@ |
204 | |
205 | self.queue = row[0] |
206 | self.timestamp = row[1] |
207 | - self.data = bpickle.loads(convert_buffer_to_string(row[2])) |
208 | + self.data = bpickle.loads(bytes(row[2])) |
209 | |
210 | @with_cursor |
211 | def remove(self, cursor): |
212 | |
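The repeated `bytes(...)` conversions above handle the SQLite driver difference: BLOB columns come back as `buffer` objects on Python 2 and as `bytes` on Python 3, and `bytes(...)` normalizes both, which is what lets `convert_buffer_to_string` be deleted. A self-contained round trip:

```python
import sqlite3

# Store a hash containing an embedded NUL byte as a BLOB, then read it
# back; bytes(...) normalizes the driver's return type on both Pythons.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE hash (id INTEGER PRIMARY KEY, hash BLOB)")
conn.execute("INSERT INTO hash VALUES (?, ?)",
             (1, sqlite3.Binary(b"ha\x00sh1")))
row = conn.execute("SELECT hash FROM hash WHERE id=1").fetchone()
value = bytes(row[0])
# The NUL byte survives the round trip; str() on Python 3 would instead
# produce repr text like "b'ha\\x00sh1'", corrupting the lookup key.
assert value == b"ha\x00sh1"
```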
213 | === modified file 'landscape/package/tests/helpers.py' |
214 | --- landscape/package/tests/helpers.py 2017-03-13 15:15:46 +0000 |
215 | +++ landscape/package/tests/helpers.py 2017-03-21 17:01:15 +0000 |
216 | @@ -6,7 +6,7 @@ |
217 | import apt_inst |
218 | import apt_pkg |
219 | |
220 | -from landscape.lib.fs import append_binary_file, append_text_file |
221 | +from landscape.lib.fs import append_binary_file |
222 | from landscape.lib.fs import create_binary_file |
223 | from landscape.package.facade import AptFacade |
224 | |
225 | @@ -49,11 +49,17 @@ |
226 | """ % { |
227 | "name": name, "version": version, |
228 | "architecture": architecture, |
229 | - "description": description.encode("utf-8")}) |
230 | + "description": description}).encode("utf-8") |
231 | + # We want to re-order the TagSection, but it requires bytes as input. |
232 | + # As we also want to write a binary file, we have to explicitly pass |
233 | + # the hardly documented `bytes=True` to TagSection as it would be |
234 | + # returned as unicode in Python 3 otherwise. In future versions of |
235 | + # apt_pkg there should be a TagSection.write() which is recommended. |
236 | package_stanza = apt_pkg.rewrite_section( |
237 | - apt_pkg.TagSection(package_stanza), apt_pkg.REWRITE_PACKAGE_ORDER, |
238 | - control_fields.items()) |
239 | - append_binary_file(packages_file, "\n" + package_stanza + "\n") |
240 | + apt_pkg.TagSection(package_stanza, bytes=True), |
241 | + apt_pkg.REWRITE_PACKAGE_ORDER, |
242 | + list(control_fields.items())) |
243 | + append_binary_file(packages_file, b"\n" + package_stanza + b"\n") |
244 | |
245 | def _add_system_package(self, name, architecture="all", version="1.0", |
246 | control_fields=None): |
247 | @@ -72,9 +78,9 @@ |
248 | control = deb.control.extractdata("control") |
249 | deb_file.close() |
250 | lines = control.splitlines() |
251 | - lines.insert(1, "Status: install ok installed") |
252 | - status = "\n".join(lines) |
253 | - append_text_file(self.dpkg_status, status + "\n\n") |
254 | + lines.insert(1, b"Status: install ok installed") |
255 | + status = b"\n".join(lines) |
256 | + append_binary_file(self.dpkg_status, status + b"\n\n") |
257 | |
258 | def _add_package_to_deb_dir(self, path, name, architecture="all", |
259 | version="1.0", description="description", |
260 | @@ -161,7 +167,8 @@ |
261 | "Mzepwz6J7y5jpkIOH6sDKssF1rmUqYzBX2piZj9zyFad5RHv8dLoXsqua2spF3v+PQ" |
262 | "ffXIlN8aYepsu3x2u0202VX+QFC10st6vvMfDdacgtdzKtpe5G5tuFYx5elcpXm27O" |
263 | "d8LH7Oj3mqP7VgD8P6dTmJ33dsPnpuBnPO3SvLDNlu6ay9It6yZon0BIZRMApGwSgY" |
264 | - "BaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFo2AUjIJRMApGAUkAADhX8vgAKAAA ") |
265 | + "BaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFo2AUjIJRMApGAUkAADhX8vgAKAAA " |
266 | + ).encode("ascii") |
267 | |
268 | PKGDEB2 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTE2NjExNDUyMiAgMCAgICAgMCAgICAgMT" |
269 | "AwNjQ0ICA0ICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDExNjYxMTQ1MjIg" |
270 | @@ -183,7 +190,8 @@ |
271 | "jR45xB99RGrkMGEq4Pbf0L3UWDL4XIRIk6Hjx7Urzj6SSxS/YTzKbu28sqe/64oPmF" |
272 | "JGPj3lqR1cLMdz12u04rLHp/gM2y0mv3HOc/GqxvCl7PqWh7kbux6VrFk69zlefZsu" |
273 | "v5WPycH/NUv7VgF8N6vfeBcgXp3NlnBFNDw5eZsd1as/aK+JzyvZ0TGEbBKBgFo2AU" |
274 | - "jIJRMApGwSgYBaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFJAEAu4OlKQAoAAAK") |
275 | + "jIJRMApGwSgYBaNgFIyCUTAKRsEoGAWjYBSMglEwCkbBKBgFJAEAu4OlKQAoAAAK" |
276 | + ).encode("ascii") |
277 | |
278 | PKGDEB3 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTE2OTE0ODIwMyAgMCAgICAgMCAgICAgMT" |
279 | "AwNjQ0ICA0ICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDExNjkxNDgyMDMg" |
280 | @@ -206,7 +214,8 @@ |
281 | "bOTd7zh0Xz0y5bdGmDrbLp/dbhNtdpU/EFSt9LKe7/xHgzWn4PWcirYXuVsbrlVMeT" |
282 | "pXaZ4t+zkfi5/zY57qTy3Yw7B+XU7g+8L07rmG7Fe2bVxmyHZLZ+0V8Sl2Xj8mMIyC" |
283 | "UTAKRsEoGAWjYBSMglEwCkbBKBgFo2AUjIJRMApGwSgYBaNgFIyCUTAKSAIAY/FOKA" |
284 | - "AoAAAK") |
285 | + "AoAAAK" |
286 | + ).encode("ascii") |
287 | |
288 | PKGDEB4 = ("ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTI3NjUxMTU3OC41MCAgICAgMCAgICAgNj" |
289 | "Q0ICAgICA0\nICAgICAgICAgYAoyLjAKY29udHJvbC50YXIuZ3ogIDEyNzY1MTE1Nz" |
290 | @@ -221,7 +230,8 @@ |
291 | "ICAgYAofiwgAWgUWTAL/7dFBCsMgEEDRWfcUniCZ\nsU57kJ5ASJdFSOz9K9kULLQr" |
292 | "C4H/NiPqQvnTLMNpc3XfZ9PPfW2W1JOae9s3i5okuPzBc6t5bU9Z\nS6nf7v067z93" |
293 | "ENO8lcd9fP/LZ/d3f4td/6h+lqD0H+7W6ocl13wSAAAAAAAAAAAAAAAAAAfzAqr5\n" |
294 | - "GFYAKAAACg==\n") |
295 | + "GFYAKAAACg==\n" |
296 | + ).encode("ascii") |
297 | |
298 | PKGDEB_MINIMAL = ( |
299 | "ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTMxNzg5MDQ3OSAgMCAgICAgMCAgICAgMTAwNj" |
300 | @@ -234,7 +244,8 @@ |
301 | "AAAAAAAAAAAAAAAAAAAAAMBF70s1/foAKAAAZGF0YS50YXIu Z3ogICAgIDEzMTc4OTA0N" |
302 | "zkgIDAgICAgIDAgICAgIDEwMDY0NCAgMTA3ICAgICAgIGAKH4sIAAAA AAACA+3KsQ3CQB" |
303 | "AEwCvlK4D/N4frMSGBkQz0jwmQiHCEo5lkpd09HOPv6mrMfGcbs37nR7R2Pg01" |
304 | - "ew5r32rvNUrGDp73x7SUEpfrbZl//LZ2AAAAAAAAAAAA2NELx33R7wAoAAAK") |
305 | + "ew5r32rvNUrGDp73x7SUEpfrbZl//LZ2AAAAAAAAAAAA2NELx33R7wAoAAAK" |
306 | +).encode("ascii") |
307 | |
308 | PKGDEB_SIMPLE_RELATIONS = ( |
309 | "ITxhcmNoPgpkZWJpYW4tYmluYXJ5ICAgMTMxODUxNjMyMiAgMCAgICAgMCAgICAgMTAwNj" |
310 | @@ -249,7 +260,8 @@ |
311 | "EKgcHt1gAoAABkYXRhLnRhci5neiAgICAgMTMxODUxNjMyMiAgMCAgICAgMCAg ICAgMTA" |
312 | "wNjQ0ICAxMDcgICAgICAgYAofiwgAAAAAAAID7cqxDcJQEETBK8UVwH2b+64HQgIjGegf " |
313 | "CJCIIMLRTPKC3d0+/i6f5qpX21z52bdorR+m7Fl9imw5jhVDxQbu19txHYY4nS/r8uX3aw" |
314 | - "cAAAAA AAAAAIANPQALnD6FACgAAAo=") |
315 | + "cAAAAA AAAAAIANPQALnD6FACgAAAo=" |
316 | +).encode("ascii") |
317 | |
318 | |
319 | PKGDEB_VERSION_RELATIONS = ( |
320 | @@ -265,7 +277,8 @@ |
321 | "AAAACAy/sAwTtOtwAoAABkYXRhLnRhci5neiAgICAgMTMxODUxNjQ5OCAgMCAg ICAgMCA" |
322 | "gICAgMTAwNjQ0ICAxMDcgICAgICAgYAofiwgAAAAAAAID7cqxEcIwEETRK0UVgCT7UD0Q " |
323 | "EpgZA/0DATNEEOHoveQHu7t9/F19GpmvtpH1s2/R2mGeemYfc9RW+9SjZGzgfr0d11LidL" |
324 | - "6sy5ff rx0AAAAAAAAAAAA29AD/ixlwACgAAAo=") |
325 | + "6sy5ff rx0AAAAAAAAAAAA29AD/ixlwACgAAAo=" |
326 | +).encode("ascii") |
327 | |
328 | |
329 | PKGDEB_MULTIPLE_RELATIONS = ( |
330 | @@ -282,7 +295,8 @@ |
331 | "0YS50YXIuZ3ogICAgIDEzMTg1ODAwNzkgIDAgICAgIDAgICAgIDEwMDY0NCAgMTA3ICAg " |
332 | "ICAgIGAKH4sIAAAAAAACA+3KsRHCMBBE0StFFYBkfFY9EBKYGWP3DwTMEEGEo/eSH+wejv" |
333 | "F39aln vtp61s++RWvTeBpy6tmjtjqMLUrGDrb7el5Kicv1tsxffr92AAAAAAAAAAAA2NE" |
334 | - "Db6L1AQAoAAAK") |
335 | + "Db6L1AQAoAAAK" |
336 | +).encode("ascii") |
337 | |
338 | |
339 | PKGDEB_OR_RELATIONS = ( |
340 | @@ -299,7 +313,8 @@ |
341 | "6ICAgICAxMzE3ODg4ODY5ICAwICAgICAwICAgICAxMDA2NDQgIDEwNyAgICAgICBgCh+L " |
342 | "CAAAAAAAAgPtyrsRwjAURNFXiioAfZBcjwkJzIyB/oGAGSIc4eic5Aa7h2P8XX6Zen+3TD" |
343 | "1/9yNK" |
344 | - "GadWR2ltRC651hGpxw4et/u8phTny3Vdfvy2dgAAAAAAAAAAANjRE6Lr2rEAKAAACg==") |
345 | + "GadWR2ltRC651hGpxw4et/u8phTny3Vdfvy2dgAAAAAAAAAAANjRE6Lr2rEAKAAACg==" |
346 | +).encode("ascii") |
347 | |
348 | |
349 | HASH1 = base64.decodestring(b"/ezv4AefpJJ8DuYFSq4RiEHJYP4=") |
350 | |
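The `.encode("ascii")` appended to each fixture constant above exists because the base64 decoder requires bytes input on Python 3, while the multi-line literals are written as str for readability. A small example with a hypothetical constant (`decodestring` in the diff is the older spelling; `decodebytes` is the Python 3 name):

```python
import base64

# Hypothetical fixture in the same style: a str literal joined across
# lines, encoded to ASCII bytes so the decoder accepts it on Python 3.
PKGDEB_EXAMPLE = ("aGVsbG8s"
                  "IHdvcmxk").encode("ascii")
decoded = base64.decodebytes(PKGDEB_EXAMPLE)
assert decoded == b"hello, world"
```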
351 | === modified file 'landscape/package/tests/test_reporter.py' |
352 | --- landscape/package/tests/test_reporter.py 2017-03-20 09:43:08 +0000 |
353 | +++ landscape/package/tests/test_reporter.py 2017-03-21 17:01:15 +0000 |
354 | @@ -27,7 +27,6 @@ |
355 | LandscapeTest, BrokerServiceHelper, EnvironSaverHelper) |
356 | from landscape.reactor import FakeReactor |
357 | |
358 | -from landscape.compat import convert_buffer_to_string |
359 | |
360 | SAMPLE_LSB_RELEASE = "DISTRIB_CODENAME=codename\n" |
361 | |
362 | @@ -96,21 +95,21 @@ |
363 | os.chmod(self.reporter.apt_update_filename, 0o755) |
364 | |
365 | def test_set_package_ids_with_all_known(self): |
366 | - self.store.add_hash_id_request(["hash1", "hash2"]) |
367 | - request2 = self.store.add_hash_id_request(["hash3", "hash4"]) |
368 | - self.store.add_hash_id_request(["hash5", "hash6"]) |
369 | + self.store.add_hash_id_request([b"hash1", b"hash2"]) |
370 | + request2 = self.store.add_hash_id_request([b"hash3", b"hash4"]) |
371 | + self.store.add_hash_id_request([b"hash5", b"hash6"]) |
372 | |
373 | self.store.add_task("reporter", |
374 | {"type": "package-ids", "ids": [123, 456], |
375 | "request-id": request2.id}) |
376 | |
377 | def got_result(result): |
378 | - self.assertEqual(self.store.get_hash_id("hash1"), None) |
379 | - self.assertEqual(self.store.get_hash_id("hash2"), None) |
380 | - self.assertEqual(self.store.get_hash_id("hash3"), 123) |
381 | - self.assertEqual(self.store.get_hash_id("hash4"), 456) |
382 | - self.assertEqual(self.store.get_hash_id("hash5"), None) |
383 | - self.assertEqual(self.store.get_hash_id("hash6"), None) |
384 | + self.assertEqual(self.store.get_hash_id(b"hash1"), None) |
385 | + self.assertEqual(self.store.get_hash_id(b"hash2"), None) |
386 | + self.assertEqual(self.store.get_hash_id(b"hash3"), 123) |
387 | + self.assertEqual(self.store.get_hash_id(b"hash4"), 456) |
388 | + self.assertEqual(self.store.get_hash_id(b"hash5"), None) |
389 | + self.assertEqual(self.store.get_hash_id(b"hash6"), None) |
390 | |
391 | deferred = self.reporter.handle_tasks() |
392 | return deferred.addCallback(got_result) |
393 | @@ -129,7 +128,7 @@ |
394 | |
395 | message_store.set_accepted_types(["add-packages"]) |
396 | |
397 | - request1 = self.store.add_hash_id_request(["foo", HASH1, "bar"]) |
398 | + request1 = self.store.add_hash_id_request([b"foo", HASH1, b"bar"]) |
399 | |
400 | self.store.add_task("reporter", |
401 | {"type": "package-ids", |
402 | @@ -184,7 +183,7 @@ |
403 | |
404 | message_store.set_accepted_types(["add-packages"]) |
405 | |
406 | - request1 = self.store.add_hash_id_request(["foo", HASH1, "bar"]) |
407 | + request1 = self.store.add_hash_id_request([b"foo", HASH1, b"bar"]) |
408 | |
409 | self.store.add_task("reporter", |
410 | {"type": "package-ids", |
411 | @@ -238,7 +237,7 @@ |
412 | deferred = Deferred() |
413 | deferred.errback(Boom()) |
414 | |
415 | - request_id = self.store.add_hash_id_request(["foo", HASH1, "bar"]).id |
416 | + request_id = self.store.add_hash_id_request([b"foo", HASH1, b"bar"]).id |
417 | |
418 | self.store.add_task("reporter", {"type": "package-ids", |
419 | "ids": [123, None, 456], |
420 | @@ -259,7 +258,7 @@ |
421 | return result.addCallback(got_result, send_mock) |
422 | |
423 | def test_set_package_ids_removes_request_id_when_done(self): |
424 | - request = self.store.add_hash_id_request(["hash1"]) |
425 | + request = self.store.add_hash_id_request([b"hash1"]) |
426 | self.store.add_task("reporter", {"type": "package-ids", "ids": [123], |
427 | "request-id": request.id}) |
428 | |
429 | @@ -562,7 +561,7 @@ |
430 | self.assertTrue(self.reporter._apt_sources_have_changed()) |
431 | |
432 | def test_remove_expired_hash_id_request(self): |
433 | - request = self.store.add_hash_id_request(["hash1"]) |
434 | + request = self.store.add_hash_id_request([b"hash1"]) |
435 | request.message_id = 9999 |
436 | |
437 | request.timestamp -= HASH_ID_REQUEST_TIMEOUT |
438 | @@ -575,7 +574,7 @@ |
439 | return result.addCallback(got_result) |
440 | |
441 | def test_remove_expired_hash_id_request_wont_remove_before_timeout(self): |
442 | - request1 = self.store.add_hash_id_request(["hash1"]) |
443 | + request1 = self.store.add_hash_id_request([b"hash1"]) |
444 | request1.message_id = 9999 |
445 | request1.timestamp -= HASH_ID_REQUEST_TIMEOUT / 2 |
446 | |
447 | @@ -592,7 +591,7 @@ |
448 | return result.addCallback(got_result) |
449 | |
450 | def test_remove_expired_hash_id_request_updates_timestamps(self): |
451 | - request = self.store.add_hash_id_request(["hash1"]) |
452 | + request = self.store.add_hash_id_request([b"hash1"]) |
453 | message_store = self.broker_service.message_store |
454 | message_id = message_store.add({"type": "add-packages", |
455 | "packages": [], |
456 | @@ -607,7 +606,7 @@ |
457 | return result.addCallback(got_result) |
458 | |
459 | def test_remove_expired_hash_id_request_removes_when_no_message_id(self): |
460 | - request = self.store.add_hash_id_request(["hash1"]) |
461 | + request = self.store.add_hash_id_request([b"hash1"]) |
462 | |
463 | def got_result(result): |
464 | self.assertRaises(UnknownHashIDRequest, |
465 | @@ -1305,9 +1304,9 @@ |
466 | spawn_patcher = mock.patch.object(reporter, "spawn_process", |
467 | side_effect=[ |
468 | # Simulate series of failures to acquire the apt lock. |
469 | - succeed(('', '', 100)), |
470 | - succeed(('', '', 100)), |
471 | - succeed(('', '', 100))]) |
472 | + succeed((b'', b'', 100)), |
473 | + succeed((b'', b'', 100)), |
474 | + succeed((b'', b'', 100))]) |
475 | spawn_patcher.start() |
476 | self.addCleanup(spawn_patcher.stop) |
477 | |
478 | @@ -1343,8 +1342,8 @@ |
479 | spawn_patcher = mock.patch.object(reporter, "spawn_process", |
480 | side_effect=[ |
481 | # Simulate a failed apt lock grab then a successful one. |
482 | - succeed(('', '', 100)), |
483 | - succeed(('output', 'error', 0))]) |
484 | + succeed((b'', b'', 100)), |
485 | + succeed((b'output', b'error', 0))]) |
486 | spawn_patcher.start() |
487 | self.addCleanup(spawn_patcher.stop) |
488 | |
489 | @@ -1628,7 +1627,7 @@ |
490 | return deferred |
491 | |
492 | @mock.patch("landscape.package.reporter.spawn_process", |
493 | - return_value=succeed(("", "", 0))) |
494 | + return_value=succeed((b"", b"", 0))) |
495 | def test_run_apt_update_honors_http_proxy(self, mock_spawn_process): |
496 | """ |
497 | The PackageReporter.run_apt_update method honors the http_proxy |
498 | @@ -1647,7 +1646,7 @@ |
499 | env={"http_proxy": "http://proxy_server:8080"}) |
500 | |
501 | @mock.patch("landscape.package.reporter.spawn_process", |
502 | - return_value=succeed(("", "", 0))) |
503 | + return_value=succeed((b"", b"", 0))) |
504 | def test_run_apt_update_honors_https_proxy(self, mock_spawn_process): |
505 | """ |
506 | The PackageReporter.run_apt_update method honors the https_proxy |
507 | @@ -1869,8 +1868,7 @@ |
508 | "SELECT id, data FROM message").fetchall()) |
509 | self.assertEqual(1, len(stored)) |
510 | self.assertEqual(1, stored[0][0]) |
511 | - self.assertEqual(message, |
512 | - bpickle.loads(convert_buffer_to_string(stored[0][1]))) |
513 | + self.assertEqual(message, bpickle.loads(bytes(stored[0][1]))) |
514 | result.addCallback(callback) |
515 | result.chainDeferred(deferred) |
516 | |
517 | |
518 | === modified file 'landscape/package/tests/test_store.py' |
519 | --- landscape/package/tests/test_store.py 2017-01-09 14:29:54 +0000 |
520 | +++ landscape/package/tests/test_store.py 2017-03-21 17:01:15 +0000 |
521 | @@ -19,12 +19,12 @@ |
522 | self.store2 = HashIdStore(self.filename) |
523 | |
524 | def test_set_and_get_hash_id(self): |
525 | - self.store1.set_hash_ids({"ha\x00sh1": 123, "ha\x00sh2": 456}) |
526 | - self.assertEqual(self.store1.get_hash_id("ha\x00sh1"), 123) |
527 | - self.assertEqual(self.store1.get_hash_id("ha\x00sh2"), 456) |
528 | + self.store1.set_hash_ids({b"ha\x00sh1": 123, b"ha\x00sh2": 456}) |
529 | + self.assertEqual(self.store1.get_hash_id(b"ha\x00sh1"), 123) |
530 | + self.assertEqual(self.store1.get_hash_id(b"ha\x00sh2"), 456) |
531 | |
532 | def test_get_hash_ids(self): |
533 | - hash_ids = {"hash1": 123, "hash2": 456} |
534 | + hash_ids = {b"hash1": 123, b"hash2": 456} |
535 | self.store1.set_hash_ids(hash_ids) |
536 | self.assertEqual(self.store1.get_hash_ids(), hash_ids) |
537 | |
538 | @@ -80,33 +80,33 @@ |
539 | self.assertEqual([None], rollbacks) |
540 | |
541 | def test_get_id_hash(self): |
542 | - self.store1.set_hash_ids({"hash1": 123, "hash2": 456}) |
543 | - self.assertEqual(self.store2.get_id_hash(123), "hash1") |
544 | - self.assertEqual(self.store2.get_id_hash(456), "hash2") |
545 | + self.store1.set_hash_ids({b"hash1": 123, b"hash2": 456}) |
546 | + self.assertEqual(self.store2.get_id_hash(123), b"hash1") |
547 | + self.assertEqual(self.store2.get_id_hash(456), b"hash2") |
548 | |
549 | def test_clear_hash_ids(self): |
550 | - self.store1.set_hash_ids({"ha\x00sh1": 123, "ha\x00sh2": 456}) |
551 | + self.store1.set_hash_ids({b"ha\x00sh1": 123, b"ha\x00sh2": 456}) |
552 | self.store1.clear_hash_ids() |
553 | - self.assertEqual(self.store2.get_hash_id("ha\x00sh1"), None) |
554 | - self.assertEqual(self.store2.get_hash_id("ha\x00sh2"), None) |
555 | + self.assertEqual(self.store2.get_hash_id(b"ha\x00sh1"), None) |
556 | + self.assertEqual(self.store2.get_hash_id(b"ha\x00sh2"), None) |
557 | |
558 | def test_get_unexistent_hash(self): |
559 | - self.assertEqual(self.store1.get_hash_id("hash1"), None) |
560 | + self.assertEqual(self.store1.get_hash_id(b"hash1"), None) |
561 | |
562 | def test_get_unexistent_id(self): |
563 | self.assertEqual(self.store1.get_id_hash(123), None) |
564 | |
565 | def test_overwrite_id_hash(self): |
566 | - self.store1.set_hash_ids({"hash1": 123}) |
567 | - self.store2.set_hash_ids({"hash2": 123}) |
568 | - self.assertEqual(self.store1.get_hash_id("hash1"), None) |
569 | - self.assertEqual(self.store1.get_hash_id("hash2"), 123) |
570 | + self.store1.set_hash_ids({b"hash1": 123}) |
571 | + self.store2.set_hash_ids({b"hash2": 123}) |
572 | + self.assertEqual(self.store1.get_hash_id(b"hash1"), None) |
573 | + self.assertEqual(self.store1.get_hash_id(b"hash2"), 123) |
574 | |
575 | def test_overwrite_hash_id(self): |
576 | - self.store1.set_hash_ids({"hash1": 123}) |
577 | - self.store2.set_hash_ids({"hash1": 456}) |
578 | + self.store1.set_hash_ids({b"hash1": 123}) |
579 | + self.store2.set_hash_ids({b"hash1": 456}) |
580 | self.assertEqual(self.store1.get_id_hash(123), None) |
581 | - self.assertEqual(self.store1.get_id_hash(456), "hash1") |
582 | + self.assertEqual(self.store1.get_id_hash(456), b"hash1") |
583 | |
584 | def test_check_sanity(self): |
585 | |
586 | @@ -184,22 +184,22 @@ |
587 | |
588 | def test_get_hash_id_using_hash_id_dbs(self): |
589 | # Without hash=>id dbs |
590 | - self.assertEqual(self.store1.get_hash_id("hash1"), None) |
591 | - self.assertEqual(self.store1.get_hash_id("hash2"), None) |
592 | + self.assertEqual(self.store1.get_hash_id(b"hash1"), None) |
593 | + self.assertEqual(self.store1.get_hash_id(b"hash2"), None) |
594 | |
595 | # This hash=>id will be overriden |
596 | - self.store1.set_hash_ids({"hash1": 1}) |
597 | + self.store1.set_hash_ids({b"hash1": 1}) |
598 | |
599 | # Add a couple of hash=>id dbs |
600 | - self.store1.add_hash_id_db(self.hash_id_db_factory({"hash1": 2, |
601 | - "hash2": 3})) |
602 | - self.store1.add_hash_id_db(self.hash_id_db_factory({"hash2": 4, |
603 | - "ha\x00sh1": 5})) |
604 | + self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash1": 2, |
605 | + b"hash2": 3})) |
606 | + self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash2": 4, |
607 | + b"ha\x00sh1": 5})) |
608 | |
609 | # Check look-up priorities and binary hashes |
610 | - self.assertEqual(self.store1.get_hash_id("hash1"), 2) |
611 | - self.assertEqual(self.store1.get_hash_id("hash2"), 3) |
612 | - self.assertEqual(self.store1.get_hash_id("ha\x00sh1"), 5) |
613 | + self.assertEqual(self.store1.get_hash_id(b"hash1"), 2) |
614 | + self.assertEqual(self.store1.get_hash_id(b"hash2"), 3) |
615 | + self.assertEqual(self.store1.get_hash_id(b"ha\x00sh1"), 5) |
616 | |
617 | def test_get_id_hash_using_hash_id_db(self): |
618 | """ |
619 | @@ -207,13 +207,13 @@ |
620 | to query them first, falling back to the regular db in case |
621 | the desired mapping is not found. |
622 | """ |
623 | - self.store1.add_hash_id_db(self.hash_id_db_factory({"hash1": 123})) |
624 | - self.store1.add_hash_id_db(self.hash_id_db_factory({"hash1": 999, |
625 | - "hash2": 456})) |
626 | - self.store1.set_hash_ids({"hash3": 789}) |
627 | - self.assertEqual(self.store1.get_id_hash(123), "hash1") |
628 | - self.assertEqual(self.store1.get_id_hash(456), "hash2") |
629 | - self.assertEqual(self.store1.get_id_hash(789), "hash3") |
630 | + self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash1": 123})) |
631 | + self.store1.add_hash_id_db(self.hash_id_db_factory({b"hash1": 999, |
632 | + b"hash2": 456})) |
633 | + self.store1.set_hash_ids({b"hash3": 789}) |
634 | + self.assertEqual(self.store1.get_id_hash(123), b"hash1") |
635 | + self.assertEqual(self.store1.get_id_hash(456), b"hash2") |
636 | + self.assertEqual(self.store1.get_id_hash(789), b"hash3") |
637 | |
638 | def test_add_and_get_available_packages(self): |
639 | self.store1.add_available([1, 2]) |
640 | |
641 | === modified file 'landscape/schema.py' |
642 | --- landscape/schema.py 2017-03-10 12:40:17 +0000 |
643 | +++ landscape/schema.py 2017-03-21 17:01:15 +0000 |
644 | @@ -67,8 +67,8 @@ |
645 | class Bytes(object): |
646 | """A binary string.""" |
647 | def coerce(self, value): |
648 | - if not isinstance(value, str): |
649 | - raise InvalidError("%r isn't a str" % (value,)) |
650 | + if not isinstance(value, bytes): |
651 | + raise InvalidError("%r isn't a bytestring" % (value,)) |
652 | return value |
653 | |
654 | |
655 | |
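The `schema.Bytes` change above is behavioral on Python 3: `str` no longer passes the check, so text accidentally fed into a binary field fails loudly. A standalone mirror of the class from the diff (with `InvalidError` stubbed in for self-containment):

```python
class InvalidError(Exception):
    """Stub of the schema module's validation error."""


class Bytes(object):
    """A binary string (mirrors the schema class in the diff)."""
    def coerce(self, value):
        if not isinstance(value, bytes):
            raise InvalidError("%r isn't a bytestring" % (value,))
        return value


# Binary data passes through unchanged...
assert Bytes().coerce(b"\x00\x01") == b"\x00\x01"
# ...while text is rejected on Python 3 instead of slipping through.
rejected = False
try:
    Bytes().coerce(u"text")
except InvalidError:
    rejected = True
assert rejected
```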
656 | === modified file 'py3_ready_tests' |
657 | --- py3_ready_tests 2017-03-15 08:40:11 +0000 |
658 | +++ py3_ready_tests 2017-03-21 17:01:15 +0000 |
659 | @@ -1,2 +1,5 @@ |
660 | landscape.lib.tests |
661 | landscape.sysinfo.tests |
662 | +landscape.package.tests.test_store |
663 | +landscape.package.tests.test_reporter |
664 | + |
Command: TRIAL_ARGS=-j4 make check
Result: Success
Revno: 981
Branch: lp:~gocept/landscape-client/py3-package-store-reporter
Jenkins: https://ci.lscape.net/job/latch-test-xenial/3632/