Merge lp:~xnox/ubuntu-release-upgrader/lp1409555 into lp:ubuntu-release-upgrader
Status: Merged
Merged at revision: 2859
Proposed branch: lp:~xnox/ubuntu-release-upgrader/lp1409555
Merge into: lp:ubuntu-release-upgrader
Diff against target: 854 lines (+212/-120), 22 files modified:
- DistUpgrade/DistUpgradeAptCdrom.py (+36/-35)
- DistUpgrade/DistUpgradeAufs.py (+19/-15)
- DistUpgrade/DistUpgradeCache.py (+4/-4)
- DistUpgrade/DistUpgradeConfigParser.py (+2/-1)
- DistUpgrade/DistUpgradeController.py (+27/-12)
- DistUpgrade/DistUpgradeFetcherCore.py (+2/-2)
- DistUpgrade/DistUpgradeMain.py (+2/-1)
- DistUpgrade/DistUpgradePatcher.py (+7/-3)
- DistUpgrade/DistUpgradeQuirks.py (+17/-10)
- DistUpgrade/DistUpgradeView.py (+3/-2)
- DistUpgrade/DistUpgradeViewKDE.py (+3/-1)
- DistUpgrade/DistUpgradeViewNonInteractive.py (+5/-2)
- check-new-release-gtk (+2/-2)
- data/mirrors.cfg (+0/-2)
- debian/changelog (+7/-0)
- tests/data-sources-list-test/sources.list.extras (+8/-0)
- tests/test_cdrom.py (+8/-6)
- tests/test_prerequists.py (+11/-6)
- tests/test_quirks.py (+7/-4)
- tests/test_sources_list.py (+25/-1)
- tests/test_xorg_fix_intrepid.py (+9/-4)
- utils/update_mirrors.py (+8/-7)
To merge this branch: bzr merge lp:~xnox/ubuntu-release-upgrader/lp1409555
Related bugs:
Reviewers:
- Michael Vogt (community): Approve
- Brian Murray: Pending
- Ubuntu Core Development Team: Pending

Review via email: mp+246233@code.launchpad.net
Commit message
Description of the change
Not sure how to test this "for-real".
An upgrade from utopic -> vivid should work with this upgrader, and after the upgrade sources.list should have neither the comment nor the extras.ubuntu.com repository.
I do not know whether the comment in sources.list is translated into other languages; if it is, I might need a few more changes to completely remove all traces of extras.ubuntu.com.
Brian Murray (brian-murray) wrote:
To test it "for-real" we could build the package, copy the vivid.tar.gz over to a utopic system, and try upgrading.
Brian Murray (brian-murray) wrote:
There was a failure with the test that I've commented on in-line.
Dimitri John Ledkov (xnox) wrote:
On 13 January 2015 at 22:50, Brian Murray <email address hidden> wrote:
> There was a failure with the test that I've commented on in-line.
>
>> +
>> + sources_file = apt_pkg.
>
> The test was failing for me and it seems to because the sources_file isn't saved when updateSourcesList is called. This is why the other tests use self._verifySou
>
Hm, this could be due to environment changes. I'm developing this on Trusty.
_verifySources() is not sufficient for this test. _verifySources() only
checks that "these lines are in the file"; I, however, need to test that
"extras.ubuntu.com lines _are removed from the file_". Thus
_verifySources() was passing the test for me with or without the
patch. I don't believe we have ever done outright removal before
(only commenting things out).
I'll check the branch and test on utopic/vivid in a chroot to see what
is going on.
>> + self.assertEqua
>> +deb http://
>> +deb http://
>> +
>> +""")
>> +
>> def test_powerpc_
>> """
>> test transition of powerpc to ports.ubuntu.com
>>
--
Regards,
Dimitri.
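A removal-oriented assertion along these lines might look like the following sketch. The helper name and its file handling are illustrative only, not taken from the branch; it simply inverts what _verifySources() checks:

```python
# Sketch of a "lines are removed" check, complementing _verifySources(),
# which only asserts that expected lines ARE present. The helper name
# assert_no_line_mentions() is made up for illustration.

def assert_no_line_mentions(sources_path, fragment):
    """Fail if any line of the file still mentions the given fragment."""
    with open(sources_path) as f:
        for lineno, line in enumerate(f, 1):
            assert fragment not in line, (
                "line %d still mentions %r: %r" % (lineno, fragment, line))
```

After the upgrader has rewritten sources.list, something like `assert_no_line_mentions(sources_file, "extras.ubuntu.com")` would then fail on any surviving extras line, commented or not.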
Brian Murray (brian-murray) wrote:
In the process of tracking down the test failure I also found a method for removing entries from sources.list, sources.
Dimitri John Ledkov (xnox) wrote:
On 13 January 2015 at 23:00, Brian Murray <email address hidden> wrote:
> In the process of tracking down the test failure I also found a method for removing entries from sources.list, sources.
That was my original implementation, but it's no good. It iterates
the list and, for each removal, invokes comparators to remove the
first item with a matching value from the list. Given that the list holds
"objects" rather than strings, the value for the "commented" string
did not match. Furthermore, in Python one is not allowed to modify a
list whilst iterating over it, so a new list (or a copy) would be
required. The single pass is cumbersome, but it is the most
straightforward way to implement this. Or, e.g., python3 on trusty has
some subtle object-comparison error.
If it works and passes the upgrade test with both trusty and utopic
python3, I'm happy to use whichever algorithm gives the correct
end result.
--
Regards,
Dimitri.
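The single-pass approach described above (and visible in the DistUpgradeController.py hunk in the preview diff) can be sketched as follows. SourceEntry here is a minimal stand-in for the aptsources entry objects; only the .uri and .line attributes matter for this illustration:

```python
# Minimal stand-in for an aptsources SourceEntry; real entries are
# objects, which is exactly why list.remove() with string values fails.
class SourceEntry:
    def __init__(self, line, uri=""):
        self.line = line
        self.uri = uri


def drop_extras(entries):
    # One pass building a new list: no mutation while iterating, and
    # no reliance on object comparators, unlike entries.remove(entry).
    new_list = []
    for entry in entries:
        if "/extras.ubuntu.com" in entry.uri:
            continue
        if entry.line.startswith(
                "## This software is not part of Ubuntu"):
            continue
        new_list.append(entry)
    return new_list
```

The list is then reassigned wholesale (`self.sources.list = new_list` in the branch), which sidesteps both the comparator and the modify-while-iterating issues.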
Dimitri John Ledkov (xnox) wrote:
I've now pushed fixes for all open() calls and logging.warn.
The build ends up using the system installation of the ubuntu-
...
$ sudo rm /usr/lib/
$ bzr bd
The two commands above make the build complete, including the full test suite, in the current branch.
Note that in the pre-build.sh:
"(cd utils && ./demotions.py utopic vivid > demoted.cfg)"
passes for me on Trusty, but fails on Vivid, which may be related to changes/regressions in vivid's python-apt. However, I believe that is not related to my changes.
Michael Vogt (mvo) wrote:
Hi Dimitri, thanks a lot for this work. The branch looks good; what's puzzling is that I see a test failure when running bzr-buildpackage. It looks like nosetests3 is not using the current directory and not honoring PYTHONPATH here, which is a bit confusing. Only when python3-distupgrade is not installed do the tests work for me when run with nosetests3. Besides this, the diff looks fine and I think we should upload it.
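The shadowing mvo describes can be checked with a quick diagnostic snippet (not part of the branch): if it prints a path under /usr/lib, nosetests3 would import the system python3-distupgrade rather than the branch checkout.

```python
# Print which copy of the DistUpgrade package would be imported first,
# given the current sys.path/PYTHONPATH. A /usr/lib path means the
# system python3-distupgrade shadows the branch checkout.
import importlib.util

spec = importlib.util.find_spec("DistUpgrade")
if spec is None:
    print("DistUpgrade not importable; only a branch checkout "
          "(with PYTHONPATH set) would provide it")
else:
    print("DistUpgrade would load from:", spec.origin)
```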
Preview Diff
1 | === modified file 'DistUpgrade/DistUpgradeAptCdrom.py' |
2 | --- DistUpgrade/DistUpgradeAptCdrom.py 2014-06-26 06:43:50 +0000 |
3 | +++ DistUpgrade/DistUpgradeAptCdrom.py 2015-01-20 22:44:23 +0000 |
4 | @@ -63,9 +63,11 @@ |
5 | diskname = self._readDiskName() |
6 | pentry = self._generateSourcesListLine(diskname, self.packages) |
7 | sourceslist = apt_pkg.config.find_file("Dir::Etc::sourcelist") |
8 | - content = open(sourceslist).read() |
9 | + with open(sourceslist) as f: |
10 | + content = f.read() |
11 | content = content.replace(pentry, "# %s" % pentry) |
12 | - open(sourceslist, "w").write(content) |
13 | + with open(sourceslist, "w") as f: |
14 | + f.write(content) |
15 | |
16 | def _scanCD(self): |
17 | """ |
18 | @@ -106,9 +108,9 @@ |
19 | cdrom = apt_pkg.Cdrom() |
20 | id = cdrom.ident(apt.progress.base.CdromProgress()) |
21 | label = self._readDiskName() |
22 | - out = open(dbfile, "a") |
23 | - out.write('CD::%s "%s";\n' % (id, label)) |
24 | - out.write('CD::%s::Label "%s";\n' % (id, label)) |
25 | + with open(dbfile, "a") as out: |
26 | + out.write('CD::%s "%s";\n' % (id, label)) |
27 | + out.write('CD::%s::Label "%s";\n' % (id, label)) |
28 | |
29 | def _dropArch(self, packages): |
30 | """ drop architectures that are not ours """ |
31 | @@ -127,7 +129,8 @@ |
32 | diskname = self.cdrompath |
33 | info = os.path.join(self.cdrompath, ".disk", "info") |
34 | if os.path.exists(info): |
35 | - diskname = open(info).read() |
36 | + with open(info) as f: |
37 | + diskname = f.read() |
38 | for special in ('"', ']', '[', '_'): |
39 | diskname = diskname.replace(special, '_') |
40 | return diskname |
41 | @@ -162,17 +165,13 @@ |
42 | "cdrom:[%s]/%s" % (diskname, f[f.find("dists"):])) |
43 | outf = os.path.join(targetdir, os.path.splitext(fname)[0]) |
44 | if f.endswith(".gz"): |
45 | - g = gzip.open(f) |
46 | - try: |
47 | - with open(outf, "wb") as out: |
48 | - # uncompress in 64k chunks |
49 | - while True: |
50 | - s = g.read(64000) |
51 | - out.write(s) |
52 | - if s == b"": |
53 | - break |
54 | - finally: |
55 | - g.close() |
56 | + with gzip.open(f) as g, open(outf, "wb") as out: |
57 | + # uncompress in 64k chunks |
58 | + while True: |
59 | + s = g.read(64000) |
60 | + out.write(s) |
61 | + if s == b"": |
62 | + break |
63 | else: |
64 | shutil.copy(f, outf) |
65 | return True |
66 | @@ -187,17 +186,13 @@ |
67 | "cdrom:[%s]/%s" % (diskname, f[f.find("dists"):])) |
68 | outf = os.path.join(targetdir, os.path.splitext(fname)[0]) |
69 | if f.endswith(".gz"): |
70 | - g = gzip.open(f) |
71 | - try: |
72 | - with open(outf, "wb") as out: |
73 | - # uncompress in 64k chunks |
74 | - while True: |
75 | - s = g.read(64000) |
76 | - out.write(s) |
77 | - if s == b"": |
78 | - break |
79 | - finally: |
80 | - g.close() |
81 | + with gzip.open(f) as g, open(outf, "wb") as out: |
82 | + # uncompress in 64k chunks |
83 | + while True: |
84 | + s = g.read(64000) |
85 | + out.write(s) |
86 | + if s == b"": |
87 | + break |
88 | else: |
89 | shutil.copy(f, outf) |
90 | return True |
91 | @@ -218,15 +213,18 @@ |
92 | if not (ret == 0): |
93 | return False |
94 | # now do the hash sum checks |
95 | - t = apt_pkg.TagFile(open(releasef)) |
96 | - t.step() |
97 | - for entry in t.section["SHA256"].split("\n"): |
98 | + with open(releasef) as f: |
99 | + t = apt_pkg.TagFile(f) |
100 | + t.step() |
101 | + sha256_section = t.section["SHA256"] |
102 | + for entry in sha256_section.split("\n"): |
103 | (hash, size, name) = entry.split() |
104 | f = os.path.join(basepath, name) |
105 | if not os.path.exists(f): |
106 | logging.info("ignoring missing '%s'" % f) |
107 | continue |
108 | - sum = apt_pkg.sha256sum(open(f)) |
109 | + with open(f) as fp: |
110 | + sum = apt_pkg.sha256sum(fp) |
111 | if not (sum == hash): |
112 | logging.error( |
113 | "hash sum mismatch expected %s but got %s" % ( |
114 | @@ -279,9 +277,12 @@ |
115 | |
116 | # prepend to the sources.list |
117 | sourceslist = apt_pkg.config.find_file("Dir::Etc::sourcelist") |
118 | - content = open(sourceslist).read() |
119 | - open(sourceslist, "w").write( |
120 | - "# added by the release upgrader\n%s\n%s" % (debline, content)) |
121 | + with open(sourceslist) as f: |
122 | + content = f.read() |
123 | + with open(sourceslist, "w") as f: |
124 | + f.write( |
125 | + "# added by the release upgrader\n%s\n%s" % |
126 | + (debline, content)) |
127 | self._writeDatabase() |
128 | |
129 | return True |
130 | |
131 | === modified file 'DistUpgrade/DistUpgradeAufs.py' |
132 | --- DistUpgrade/DistUpgradeAufs.py 2014-06-26 06:43:50 +0000 |
133 | +++ DistUpgrade/DistUpgradeAufs.py 2015-01-20 22:44:23 +0000 |
134 | @@ -117,10 +117,11 @@ |
135 | |
136 | def is_aufs_mount(dir): |
137 | " test if the given dir is already mounted with aufs overlay " |
138 | - for line in open("/proc/mounts"): |
139 | - (device, mountpoint, fstype, options, a, b) = line.split() |
140 | - if device == "none" and fstype == "aufs" and mountpoint == dir: |
141 | - return True |
142 | + with open("/proc/mounts") as f: |
143 | + for line in f: |
144 | + (device, mountpoint, fstype, options, a, b) = line.split() |
145 | + if device == "none" and fstype == "aufs" and mountpoint == dir: |
146 | + return True |
147 | return False |
148 | |
149 | |
150 | @@ -182,7 +183,8 @@ |
151 | # create something vaguely rollbackable |
152 | |
153 | # get the mount points before the aufs buisiness starts |
154 | - mounts = open("/proc/mounts").read() |
155 | + with open("/proc/mounts") as f: |
156 | + mounts = f.read() |
157 | from .DistUpgradeMain import SYSTEM_DIRS |
158 | systemdirs = SYSTEM_DIRS |
159 | |
160 | @@ -234,16 +236,18 @@ |
161 | # include sub mounts) |
162 | needs_bind_mount = set() |
163 | needs_bind_mount.add("/var/cache/apt/archives") |
164 | - for line in open("/proc/mounts"): |
165 | - (device, mountpoint, fstype, options, a, b) = line.split() |
166 | - if is_real_fs(fstype) and is_submount(mountpoint, systemdirs): |
167 | - logging.warning("mountpoint %s submount of systemdir" % mountpoint) |
168 | - return False |
169 | - if (fstype != "aufs" and |
170 | - not is_real_fs(fstype) and |
171 | - is_submount(mountpoint, systemdirs)): |
172 | - logging.debug("found %s that needs bind mount", mountpoint) |
173 | - needs_bind_mount.add(mountpoint) |
174 | + with open("/proc/mounts") as f: |
175 | + for line in f: |
176 | + (device, mountpoint, fstype, options, a, b) = line.split() |
177 | + if is_real_fs(fstype) and is_submount(mountpoint, systemdirs): |
178 | + logging.warning("mountpoint %s submount of systemdir" % |
179 | + mountpoint) |
180 | + return False |
181 | + if (fstype != "aufs" and |
182 | + not is_real_fs(fstype) and |
183 | + is_submount(mountpoint, systemdirs)): |
184 | + logging.debug("found %s that needs bind mount", mountpoint) |
185 | + needs_bind_mount.add(mountpoint) |
186 | |
187 | # aufs mounts do not support stacked filesystems, so |
188 | # if we mount /var we will loose the tmpfs stuff |
189 | |
190 | === modified file 'DistUpgrade/DistUpgradeCache.py' |
191 | --- DistUpgrade/DistUpgradeCache.py 2014-05-02 13:07:42 +0000 |
192 | +++ DistUpgrade/DistUpgradeCache.py 2015-01-20 22:44:23 +0000 |
193 | @@ -54,7 +54,7 @@ |
194 | def _set_kernel_initrd_size(): |
195 | size = estimate_kernel_size_in_boot() |
196 | if size == 0: |
197 | - logging.warn("estimate_kernel_size_in_boot() returned '0'?") |
198 | + logging.warning("estimate_kernel_size_in_boot() returned '0'?") |
199 | size = 28*1024*1024 |
200 | # add small safety buffer |
201 | size += 1*1024*1024 |
202 | @@ -613,7 +613,7 @@ |
203 | # check if we got a new kernel (if we are not inside a |
204 | # chroot) |
205 | if inside_chroot(): |
206 | - logging.warn("skipping kernel checks because we run inside a chroot") |
207 | + logging.warning("skipping kernel checks because we run inside a chroot") |
208 | else: |
209 | self.checkForKernel() |
210 | |
211 | @@ -849,7 +849,7 @@ |
212 | except SystemError as e: |
213 | # warn here, but don't fail, its possible that meta-packages |
214 | # conflict (like ubuntu-desktop vs xubuntu-desktop) LP: #775411 |
215 | - logging.warn("Can't mark '%s' for upgrade (%s)" % (key, e)) |
216 | + logging.warning("Can't mark '%s' for upgrade (%s)" % (key, e)) |
217 | |
218 | # check if we have a meta-pkg, if not, try to guess which one to pick |
219 | if not metaPkgInstalled(): |
220 | @@ -1082,7 +1082,7 @@ |
221 | st = os.statvfs(d) |
222 | free = st.f_bavail * st.f_frsize |
223 | else: |
224 | - logging.warn("directory '%s' does not exists" % d) |
225 | + logging.warning("directory '%s' does not exists" % d) |
226 | free = 0 |
227 | if fs_id in mnt_map: |
228 | logging.debug("Dir %s mounted on %s" % |
229 | |
230 | === modified file 'DistUpgrade/DistUpgradeConfigParser.py' |
231 | --- DistUpgrade/DistUpgradeConfigParser.py 2014-06-26 06:30:22 +0000 |
232 | +++ DistUpgrade/DistUpgradeConfigParser.py 2015-01-20 22:44:23 +0000 |
233 | @@ -86,7 +86,8 @@ |
234 | p = os.path.join(self.datadir, filename) |
235 | if not os.path.exists(p): |
236 | logging.error("getListFromFile: no '%s' found" % p) |
237 | - items = [x.strip() for x in open(p)] |
238 | + with open(p) as f: |
239 | + items = [x.strip() for x in f] |
240 | return [s for s in items if not s.startswith("#") and not s == ""] |
241 | |
242 | |
243 | |
244 | === modified file 'DistUpgrade/DistUpgradeController.py' |
245 | --- DistUpgrade/DistUpgradeController.py 2014-11-10 13:14:32 +0000 |
246 | +++ DistUpgrade/DistUpgradeController.py 2015-01-20 22:44:23 +0000 |
247 | @@ -363,7 +363,8 @@ |
248 | logging.debug("_pythonSymlinkCheck run") |
249 | if os.path.exists('/usr/share/python/debian_defaults'): |
250 | config = SafeConfigParser() |
251 | - config.readfp(open('/usr/share/python/debian_defaults')) |
252 | + with open('/usr/share/python/debian_defaults') as f: |
253 | + config.readfp(f) |
254 | try: |
255 | expected_default = config.get('DEFAULT', 'default-version') |
256 | except NoOptionError: |
257 | @@ -552,7 +553,8 @@ |
258 | s += "deb http://archive.ubuntu.com/ubuntu %s main restricted" % self.toDist |
259 | s += "deb http://archive.ubuntu.com/ubuntu %s-updates main restricted" % self.toDist |
260 | s += "deb http://security.ubuntu.com/ubuntu %s-security main restricted" % self.toDist |
261 | - open("/etc/apt/sources.list","w").write(s) |
262 | + with open("/etc/apt/sources.list","w") as f: |
263 | + f.write(s) |
264 | break |
265 | |
266 | # this must map, i.e. second in "from" must be the second in "to" |
267 | @@ -565,6 +567,20 @@ |
268 | for x in pockets] |
269 | self.sources_disabled = False |
270 | |
271 | + # Special quirk to remove extras.ubuntu.com |
272 | + new_list = [] |
273 | + for entry in self.sources.list[:]: |
274 | + if "/extras.ubuntu.com" in entry.uri: |
275 | + continue |
276 | + if entry.line.startswith( |
277 | + "## This software is not part of Ubuntu, but is offered by third-party"): |
278 | + continue |
279 | + if entry.line.startswith( |
280 | + "## developers who want to ship their latest software."): |
281 | + continue |
282 | + new_list.append(entry) |
283 | + self.sources.list = new_list |
284 | + |
285 | # look over the stuff we have |
286 | foundToDist = False |
287 | # collect information on what components (main,universe) are enabled for what distro (sub)version |
288 | @@ -657,8 +673,7 @@ |
289 | "/security.ubuntu.com" in entry.uri or |
290 | "%s-security" % self.fromDist in entry.dist or |
291 | "%s-backports" % self.fromDist in entry.dist or |
292 | - "/archive.canonical.com" in entry.uri or |
293 | - "/extras.ubuntu.com" in entry.uri): |
294 | + "/archive.canonical.com" in entry.uri): |
295 | validTo = False |
296 | if entry.dist in toDists: |
297 | # so the self.sources.list is already set to the new |
298 | @@ -1127,7 +1142,8 @@ |
299 | # the previous release, no packages have been installed |
300 | # yet (LP: #328655, #356781) |
301 | if os.path.exists("/var/run/ubuntu-release-upgrader-apt-exception"): |
302 | - e = open("/var/run/ubuntu-release-upgrader-apt-exception").read() |
303 | + with open("/var/run/ubuntu-release-upgrader-apt-exception") as f: |
304 | + e = f.read() |
305 | logging.error("found exception: '%s'" % e) |
306 | # if its a ordering bug we can cleanly revert but we need to write |
307 | # a marker for the parent process to know its this kind of error |
308 | @@ -1473,14 +1489,13 @@ |
309 | # go over the sources.list and try to find a valid mirror |
310 | # that we can use to add the backports dir |
311 | logging.debug("writing prerequists sources.list at: '%s' " % out) |
312 | - outfile = open(out, "w") |
313 | mirrorlines = self._getPreReqMirrorLines(dumb) |
314 | - for line in open(template): |
315 | - template = Template(line) |
316 | - outline = template.safe_substitute(mirror=mirrorlines) |
317 | - outfile.write(outline) |
318 | - logging.debug("adding '%s' prerequists" % outline) |
319 | - outfile.close() |
320 | + with open(out, "w") as outfile, open(template) as infile: |
321 | + for line in infile: |
322 | + template = Template(line) |
323 | + outline = template.safe_substitute(mirror=mirrorlines) |
324 | + outfile.write(outline) |
325 | + logging.debug("adding '%s' prerequists" % outline) |
326 | return True |
327 | |
328 | def getRequiredBackports(self): |
329 | |
330 | === modified file 'DistUpgrade/DistUpgradeFetcherCore.py' |
331 | --- DistUpgrade/DistUpgradeFetcherCore.py 2014-05-02 13:07:42 +0000 |
332 | +++ DistUpgrade/DistUpgradeFetcherCore.py 2015-01-20 22:44:23 +0000 |
333 | @@ -246,13 +246,13 @@ |
334 | af2 |
335 | result = fetcher.run() |
336 | if result != fetcher.RESULT_CONTINUE: |
337 | - logging.warn("fetch result != continue (%s)" % result) |
338 | + logging.warning("fetch result != continue (%s)" % result) |
339 | return False |
340 | # check that both files are really there and non-null |
341 | for f in [os.path.basename(self.new_dist.upgradeToolSig), |
342 | os.path.basename(self.new_dist.upgradeTool)]: |
343 | if not (os.path.exists(f) and os.path.getsize(f) > 0): |
344 | - logging.warn("file '%s' missing" % f) |
345 | + logging.warning("file '%s' missing" % f) |
346 | return False |
347 | return True |
348 | return False |
349 | |
350 | === modified file 'DistUpgrade/DistUpgradeMain.py' |
351 | --- DistUpgrade/DistUpgradeMain.py 2014-05-02 13:07:42 +0000 |
352 | +++ DistUpgrade/DistUpgradeMain.py 2015-01-20 22:44:23 +0000 |
353 | @@ -133,7 +133,8 @@ |
354 | try: |
355 | s=subprocess.Popen(["lspci","-nn"], stdout=subprocess.PIPE, |
356 | universal_newlines=True).communicate()[0] |
357 | - open(os.path.join(logdir, "lspci.txt"), "w").write(s) |
358 | + with open(os.path.join(logdir, "lspci.txt"), "w") as f: |
359 | + f.write(s) |
360 | except OSError as e: |
361 | logging.debug("lspci failed: %s" % e) |
362 | |
363 | |
364 | === modified file 'DistUpgrade/DistUpgradePatcher.py' |
365 | --- DistUpgrade/DistUpgradePatcher.py 2014-06-26 06:30:22 +0000 |
366 | +++ DistUpgrade/DistUpgradePatcher.py 2015-01-20 22:44:23 +0000 |
367 | @@ -39,12 +39,15 @@ |
368 | STATE_EXPECT_DATA) = range(2) |
369 | |
370 | # this is inefficient for big files |
371 | - orig_lines = open(orig, encoding="UTF-8").readlines() |
372 | + with open(orig, encoding="UTF-8") as f: |
373 | + orig_lines = f.readlines() |
374 | start = end = 0 |
375 | |
376 | # we start in wait-for-commend state |
377 | state = STATE_EXPECT_COMMAND |
378 | - for line in open(edpatch, encoding="UTF-8"): |
379 | + with open(edpatch, encoding="UTF-8") as f: |
380 | + lines = f.readlines() |
381 | + for line in lines: |
382 | if state == STATE_EXPECT_COMMAND: |
383 | # in commands get rid of whitespace, |
384 | line = line.strip() |
385 | @@ -102,5 +105,6 @@ |
386 | md5.update(result.encode("UTF-8")) |
387 | if md5.hexdigest() != result_md5sum: |
388 | raise PatchError("the md5sum after patching is not correct") |
389 | - open(orig, "w", encoding="UTF-8").write(result) |
390 | + with open(orig, "w", encoding="UTF-8") as f: |
391 | + f.write(result) |
392 | return True |
393 | |
394 | === modified file 'DistUpgrade/DistUpgradeQuirks.py' |
395 | --- DistUpgrade/DistUpgradeQuirks.py 2014-12-12 08:10:50 +0000 |
396 | +++ DistUpgrade/DistUpgradeQuirks.py 2015-01-20 22:44:23 +0000 |
397 | @@ -263,7 +263,8 @@ |
398 | if not os.path.exists(cpuinfo_path): |
399 | logging.error("cannot open %s ?!?" % cpuinfo_path) |
400 | return True |
401 | - cpuinfo = open(cpuinfo_path).read() |
402 | + with open(cpuinfo_path) as f: |
403 | + cpuinfo = f.read() |
404 | # check family |
405 | if re.search("^cpu family\s*:\s*[345]\s*", cpuinfo, re.MULTILINE): |
406 | logging.debug("found cpu family [345], no i686+") |
407 | @@ -304,7 +305,7 @@ |
408 | try: |
409 | os.kill(1, 0) |
410 | except: |
411 | - logging.warn("no init found") |
412 | + logging.warning("no init found") |
413 | res = self._view.askYesNoQuestion( |
414 | _("No init available"), |
415 | _("Your system appears to be a virtualised environment " |
416 | @@ -325,8 +326,9 @@ |
417 | if not os.path.exists("/proc/cpuinfo"): |
418 | logging.error("cannot open /proc/cpuinfo ?!?") |
419 | return False |
420 | - cpuinfo = open("/proc/cpuinfo") |
421 | - if re.search("^Processor\s*:\s*ARMv[45]", cpuinfo.read(), |
422 | + with open("/proc/cpuinfo") as f: |
423 | + cpuinfo = f.read() |
424 | + if re.search("^Processor\s*:\s*ARMv[45]", cpuinfo, |
425 | re.MULTILINE): |
426 | return False |
427 | return True |
428 | @@ -401,7 +403,8 @@ |
429 | # upgrade from Precise will fail if PAE is not in cpu flags |
430 | logging.debug("_checkPae") |
431 | pae = 0 |
432 | - cpuinfo = open('/proc/cpuinfo').read() |
433 | + with open('/proc/cpuinfo') as f: |
434 | + cpuinfo = f.read() |
435 | if re.search("^flags\s+:.* pae ", cpuinfo, re.MULTILINE): |
436 | pae = 1 |
437 | if not pae: |
438 | @@ -420,7 +423,9 @@ |
439 | XORG = "/etc/X11/xorg.conf" |
440 | if not os.path.exists(XORG): |
441 | return False |
442 | - for line in open(XORG): |
443 | + with open(XORG) as f: |
444 | + lines = f.readlines() |
445 | + for line in lines: |
446 | s = line.split("#")[0].strip() |
447 | # check for fglrx driver entry |
448 | if (s.lower().startswith("driver") and |
449 | @@ -464,7 +469,8 @@ |
450 | logging.debug("already at target hash, skipping '%s'" % path) |
451 | continue |
452 | elif md5.hexdigest() != md5sum: |
453 | - logging.warn("unexpected target md5sum, skipping: '%s'" % path) |
454 | + logging.warning("unexpected target md5sum, skipping: '%s'" |
455 | + % path) |
456 | continue |
457 | # patchable, do it |
458 | from .DistUpgradePatcher import patch |
459 | @@ -488,7 +494,7 @@ |
460 | # get pkg |
461 | if (pkgname not in self.controller.cache or |
462 | not self.controller.cache[pkgname].candidate): |
463 | - logging.warn("can not find '%s' in cache") |
464 | + logging.warning("can not find '%s' in cache") |
465 | return False |
466 | pkg = self.controller.cache[pkgname] |
467 | for (module, pciid_list) in \ |
468 | @@ -570,7 +576,8 @@ |
469 | os.makedirs("/etc/dpkg/dpkg.cfg.d/") |
470 | except OSError: |
471 | pass |
472 | - open(cfg, "w").write("foreign-architecture %s\n" % foreign_arch) |
473 | + with open(cfg, "w") as f: |
474 | + f.write("foreign-architecture %s\n" % foreign_arch) |
475 | |
476 | def ensure_recommends_are_installed_on_desktops(self): |
477 | """ ensure that on a desktop install recommends are installed |
478 | @@ -580,5 +587,5 @@ |
479 | if not apt.apt_pkg.config.find_b("Apt::Install-Recommends"): |
480 | msg = "Apt::Install-Recommends was disabled," |
481 | msg += " enabling it just for the upgrade" |
482 | - logging.warn(msg) |
483 | + logging.warning(msg) |
484 | apt.apt_pkg.config.set("Apt::Install-Recommends", "1") |
485 | |
486 | === modified file 'DistUpgrade/DistUpgradeView.py' |
487 | --- DistUpgrade/DistUpgradeView.py 2014-05-02 13:07:42 +0000 |
488 | +++ DistUpgrade/DistUpgradeView.py 2015-01-20 22:44:23 +0000 |
489 | @@ -139,7 +139,7 @@ |
490 | # FIXME: workaround issue in libapt/python-apt that does not |
491 | # raise a exception if *all* files fails to download |
492 | if status == apt_pkg.STAT_FAILED: |
493 | - logging.warn("update_status: dlFailed on '%s' " % uri) |
494 | + logging.warning("update_status: dlFailed on '%s' " % uri) |
495 | if uri.endswith("Release.gpg") or uri.endswith("Release"): |
496 | # only care about failures from network, not gpg, bzip, those |
497 | # are different issues |
498 | @@ -226,7 +226,8 @@ |
499 | except Exception as e: |
500 | print("Exception during pm.DoInstall(): ", e) |
501 | logging.exception("Exception during pm.DoInstall()") |
502 | - open("/var/run/ubuntu-release-upgrader-apt-exception","w").write(str(e)) |
503 | + with open("/var/run/ubuntu-release-upgrader-apt-exception","w") as f: |
504 | + f.write(str(e)) |
505 | os._exit(pm.ResultFailed) |
506 | os._exit(res) |
507 | self.child_pid = pid |
508 | |
509 | === modified file 'DistUpgrade/DistUpgradeViewKDE.py' |
510 | --- DistUpgrade/DistUpgradeViewKDE.py 2014-10-20 10:24:33 +0000 |
511 | +++ DistUpgrade/DistUpgradeViewKDE.py 2015-01-20 22:44:23 +0000 |
512 | @@ -864,7 +864,9 @@ |
513 | time.sleep(0.01) |
514 | |
515 | if sys.argv[1] == "--show-in-terminal": |
516 | - for c in open(sys.argv[2]).read(): |
517 | + with open(sys.argv[2]) as f: |
518 | + chars = f.read() |
519 | + for c in chars: |
520 | view.terminal_text.insertWithTermCodes( c ) |
521 | #print(c, ord(c)) |
522 | QApplication.processEvents() |
523 | |
524 | === modified file 'DistUpgrade/DistUpgradeViewNonInteractive.py' |
525 | --- DistUpgrade/DistUpgradeViewNonInteractive.py 2014-05-02 13:07:42 +0000 |
526 | +++ DistUpgrade/DistUpgradeViewNonInteractive.py 2015-01-20 22:44:23 +0000 |
527 | @@ -133,7 +133,8 @@ |
528 | if not os.path.exists(maintainer_script): |
529 | logging.error("can not find failed maintainer script '%s' " % maintainer_script) |
530 | return |
531 | - interp = open(maintainer_script).readline()[2:].strip().split()[0] |
532 | + with open(maintainer_script) as f: |
533 | + interp = f.readline()[2:].strip().split()[0] |
534 | if ("bash" in interp) or ("/bin/sh" in interp): |
535 | debug_opts = ["-ex"] |
536 | elif ("perl" in interp): |
537 | @@ -143,7 +144,9 @@ |
538 | logging.warning("unknown interpreter: '%s'" % interp) |
539 | |
540 | # check if debconf is used and fiddle a bit more if it is |
541 | - if ". /usr/share/debconf/confmodule" in open(maintainer_script).read(): |
542 | + with open(maintainer_script) as f: |
543 | + maintainer_script_text = f.read() |
544 | + if ". /usr/share/debconf/confmodule" in maintainer_script_text: |
545 | environ["DEBCONF_DEBUG"] = "developer" |
546 | environ["DEBIAN_HAS_FRONTEND"] = "1" |
547 | interp = "/usr/share/debconf/frontend" |
548 | |
549 | === modified file 'check-new-release-gtk' |
550 | --- check-new-release-gtk 2014-08-05 06:33:47 +0000 |
551 | +++ check-new-release-gtk 2015-01-20 22:44:23 +0000 |
552 | @@ -101,7 +101,7 @@ |
553 | # go into nag mode |
554 | if (ignore_dist == new_dist.name and |
555 | meta_release.no_longer_supported is None): |
556 | - logging.warn("found new dist '%s' but it is on the ignore list" % new_dist.name) |
557 | + logging.warning("found new dist '%s' but it is on the ignore list" % new_dist.name) |
558 | sys.exit() |
559 | |
560 | # show alert on unsupported distros |
561 | @@ -157,7 +157,7 @@ |
562 | |
563 | def timeout(self, user_data): |
564 | if self.new_dist is None: |
565 | - logging.warn("timeout reached, exiting") |
566 | + logging.warning("timeout reached, exiting") |
567 | Gtk.main_quit() |
568 | |
569 | if __name__ == "__main__": |
570 | |
571 | === modified file 'data/mirrors.cfg' |
572 | --- data/mirrors.cfg 2013-04-30 18:17:22 +0000 |
573 | +++ data/mirrors.cfg 2015-01-20 22:44:23 +0000 |
574 | @@ -11,8 +11,6 @@ |
575 | ftp://ports.ubuntu.com/ubuntu-ports/ |
576 | http://old-releases.ubuntu.com/ |
577 | ftp://old-releases.ubuntu.com/ |
578 | -http://extras.ubuntu.com/ubuntu |
579 | -ftp://extras.ubuntu.com/ubuntu |
580 | |
581 | #commercial (both urls are valid) |
582 | http://archive.canonical.com |
583 | |
584 | === modified file 'debian/changelog' |
585 | --- debian/changelog 2014-12-12 08:15:22 +0000 |
586 | +++ debian/changelog 2015-01-20 22:44:23 +0000 |
587 | @@ -1,3 +1,10 @@ |
588 | +ubuntu-release-upgrader (1:15.04.4) UNRELEASED; urgency=medium |
589 | + |
590 | + * Remove extras.ubuntu.com (LP: #1409555) |
591 | + * Fix resource warnings & logging.warn deprecation warning. |
592 | + |
593 | + -- Dimitri John Ledkov <dimitri.j.ledkov@linux.intel.com> Mon, 12 Jan 2015 22:52:47 +0000 |
594 | + |
595 | ubuntu-release-upgrader (1:15.04.3) vivid; urgency=low |
596 | |
597 | * add compatbility for vte 2.90 (and we need to keep that until |
598 | |
599 | === added file 'tests/data-sources-list-test/sources.list.extras' |
600 | --- tests/data-sources-list-test/sources.list.extras 1970-01-01 00:00:00 +0000 |
601 | +++ tests/data-sources-list-test/sources.list.extras 2015-01-20 22:44:23 +0000 |
602 | @@ -0,0 +1,8 @@ |
603 | +deb http://archive.ubuntu.com/ubuntu feisty main restricted |
604 | +deb http://archive.ubuntu.com/ubuntu feisty-updates main restricted |
605 | +deb http://security.ubuntu.com/ubuntu/ feisty-security main restricted |
606 | + |
607 | +## This software is not part of Ubuntu, but is offered by third-party |
608 | +## developers who want to ship their latest software. |
609 | +deb http://extras.ubuntu.com/ubuntu/ feisty main |
610 | +deb-src http://extras.ubuntu.com/ubuntu/ feisty main |
611 | |
612 | === modified file 'tests/test_cdrom.py' |
613 | --- tests/test_cdrom.py 2013-08-05 17:22:35 +0000 |
614 | +++ tests/test_cdrom.py 2015-01-20 22:44:23 +0000 |
615 | @@ -26,9 +26,9 @@ |
616 | |
617 | def testWriteDatabase(self): |
618 | expect = \ |
619 | - "CD::36e3f69081b7d10081d167b137886a71-2 " \ |
620 | + "CD::0380987599d9f666b749fbfe29d5b440-2 " \ |
621 | "\"Ubuntu 8.10 _Intrepid Ibex_ - Beta amd64 (20080930.4)\";\n" \ |
622 | - "CD::36e3f69081b7d10081d167b137886a71-2::Label " \ |
623 | + "CD::0380987599d9f666b749fbfe29d5b440-2::Label " \ |
624 | "\"Ubuntu 8.10 _Intrepid Ibex_ - Beta amd64 (20080930.4)\";\n" |
625 | p = CURDIR + "/test-data-cdrom/" |
626 | database = CURDIR + "/test-data-cdrom/cdrom.list" |
627 | @@ -39,7 +39,8 @@ |
628 | os.unlink(database) |
629 | cdrom = AptCdrom(None, p) |
630 | cdrom._writeDatabase() |
631 | - self.assertEqual(expect, open(database).read()) |
632 | + with open(database) as f: |
633 | + self.assertEqual(expect, f.read()) |
634 | |
635 | def testScanCD(self): |
636 | p = CURDIR + "/test-data-cdrom" |
637 | @@ -134,16 +135,17 @@ |
638 | def test_comment_out(self): |
639 | tmpdir = tempfile.mkdtemp() |
640 | sourceslist = os.path.join(tmpdir, "sources.list") |
641 | - open(sourceslist, "w") |
642 | apt_pkg.config.set("dir::etc::sourcelist", sourceslist) |
643 | apt_pkg.config.set("dir::state::lists", tmpdir) |
644 | view = Mock() |
645 | cdrom = AptCdrom(view, CURDIR + "/test-data-cdrom") |
646 | cdrom.add() |
647 | cdrom.comment_out_cdrom_entry() |
648 | - for line in open(sourceslist): |
649 | + with open(sourceslist) as f: |
650 | + sourceslines = f.readlines() |
651 | + for line in sourceslines: |
652 | self.assertTrue(line.startswith("#")) |
653 | - self.assertEqual(len(open(sourceslist).readlines()), 2) |
654 | + self.assertEqual(len(sourceslines), 2) |
655 | |
656 | |
657 | if __name__ == "__main__": |
658 | |
659 | === modified file 'tests/test_prerequists.py' |
660 | --- tests/test_prerequists.py 2013-08-05 17:22:35 +0000 |
661 | +++ tests/test_prerequists.py 2015-01-20 22:44:23 +0000 |
662 | @@ -89,7 +89,8 @@ |
663 | # unset sourceparts |
664 | apt_pkg.config.set("Dir::Etc::sourceparts", tmpdir) |
665 | # write empty status file |
666 | - open(tmpdir + "/status", "w") |
667 | + with open(tmpdir + "/status", "w") as f: |
668 | + f |
669 | os.makedirs(tmpdir + "/lists/partial") |
670 | apt_pkg.config.set("Dir::State", tmpdir) |
671 | apt_pkg.config.set("Dir::State::status", tmpdir + "/status") |
672 | @@ -117,7 +118,8 @@ |
673 | template = os.path.join(self.testdir, "prerequists-sources.list.in") |
674 | out = os.path.join(tmpdir, "prerequists-sources.list") |
675 | # write empty status file |
676 | - open(tmpdir + "/status", "w") |
677 | + with open(tmpdir + "/status", "w") as f: |
678 | + f |
679 | os.makedirs(tmpdir + "/lists/partial") |
680 | apt_pkg.config.set("Dir::State", tmpdir) |
681 | apt_pkg.config.set("Dir::State::status", tmpdir + "/status") |
682 | @@ -143,7 +145,8 @@ |
683 | "prerequists-sources.list.in.no_archive_falllback") |
684 | out = os.path.join(tmpdir, "prerequists-sources.list") |
685 | # write empty status file |
686 | - open(tmpdir + "/status", "w") |
687 | + with open(tmpdir + "/status", "w") as f: |
688 | + f |
689 | os.makedirs(tmpdir + "/lists/partial") |
690 | apt_pkg.config.set("Dir::State", tmpdir) |
691 | apt_pkg.config.set("Dir::State::status", tmpdir + "/status") |
692 | @@ -168,7 +171,8 @@ |
693 | "prerequists-sources.list.in.broken") |
694 | out = os.path.join(tmpdir, "prerequists-sources.list") |
695 | # write empty status file |
696 | - open(tmpdir + "/status", "w") |
697 | + with open(tmpdir + "/status", "w") as f: |
698 | + f |
699 | os.makedirs(tmpdir + "/lists/partial") |
700 | apt_pkg.config.set("Dir::State", tmpdir) |
701 | apt_pkg.config.set("Dir::State::status", tmpdir + "/status") |
702 | @@ -181,13 +185,14 @@ |
703 | self.assertTrue(exp) |
704 | |
705 | def _verifySources(self, filename, expected): |
706 | - sources_list = open(filename).read() |
707 | + with open(filename) as f: |
708 | + sources_list = f.read() |
709 | for l in expected.split("\n"): |
710 | if l: |
711 | self.assertTrue(l in sources_list, |
712 | "expected entry '%s' in '%s' missing, " |
713 | "got:\n%s" % (l, filename, |
714 | - open(filename).read())) |
715 | + sources_list)) |
716 | |
717 | if __name__ == "__main__": |
718 | unittest.main() |
719 | |
720 | === modified file 'tests/test_quirks.py' |
721 | --- tests/test_quirks.py 2014-12-12 08:10:50 +0000 |
722 | +++ tests/test_quirks.py 2015-01-20 22:44:23 +0000 |
723 | @@ -40,8 +40,10 @@ |
724 | """ helper for test_patch to verify that we get the expected result """ |
725 | # simple case is foo |
726 | patchdir = CURDIR + "/patchdir/" |
727 | - self.assertFalse("Hello" in open(patchdir + "foo").read()) |
728 | - self.assertTrue("Hello" in open(patchdir + "foo_orig").read()) |
729 | + with open(patchdir + "foo") as f: |
730 | + self.assertFalse("Hello" in f.read()) |
731 | + with open(patchdir + "foo_orig") as f: |
732 | + self.assertTrue("Hello" in f.read()) |
733 | md5 = hashlib.md5() |
734 | with open(patchdir + "foo", "rb") as patch: |
735 | md5.update(patch.read()) |
736 | @@ -62,8 +64,9 @@ |
737 | md5.update(patch.read()) |
738 | self.assertEqual(md5.hexdigest(), "cddc4be46bedd91db15ddb9f7ddfa804") |
739 | # test that incorrect md5sum after patching rejects the patch |
740 | - self.assertEqual(open(patchdir + "fail").read(), |
741 | - open(patchdir + "fail_orig").read()) |
742 | + with open(patchdir + "fail") as f1, open(patchdir + "fail_orig") as f2: |
743 | + self.assertEqual(f1.read(), |
744 | + f2.read()) |
745 | |
746 | def test_patch(self): |
747 | q = DistUpgradeQuirks(MockController(), MockConfig) |
748 | |
749 | === modified file 'tests/test_sources_list.py' |
750 | --- tests/test_sources_list.py 2014-05-02 12:37:57 +0000 |
751 | +++ tests/test_sources_list.py 2015-01-20 22:44:23 +0000 |
752 | @@ -137,6 +137,29 @@ |
753 | deb http://archive.canonical.com/ubuntu gutsy partner |
754 | """) |
755 | |
756 | + def test_extras_removal(self): |
757 | + """ |
758 | + test removal of extras.ubuntu.com archives |
759 | + """ |
760 | + original = os.path.join(self.testdir, |
761 | + "sources.list.extras") |
762 | + shutil.copy(original, |
763 | + os.path.join(self.testdir, "sources.list")) |
764 | + apt_pkg.config.set("Dir::Etc::sourceparts", |
765 | + os.path.join(self.testdir, "sources.list.d")) |
766 | + v = DistUpgradeViewNonInteractive() |
767 | + d = DistUpgradeController(v, datadir=self.testdir) |
768 | + d.openCache(lock=False) |
769 | + res = d.updateSourcesList() |
770 | + self.assertTrue(res) |
771 | + |
772 | + sources_file = apt_pkg.config.find_file("Dir::Etc::sourcelist") |
773 | + self.assertEqual(open(sources_file).read(),"""deb http://archive.ubuntu.com/ubuntu gutsy main restricted |
774 | +deb http://archive.ubuntu.com/ubuntu gutsy-updates main restricted |
775 | +deb http://security.ubuntu.com/ubuntu/ gutsy-security main restricted |
776 | + |
777 | +""") |
778 | + |
779 | def test_powerpc_transition(self): |
780 | """ |
781 | test transition of powerpc to ports.ubuntu.com |
782 | @@ -431,7 +454,8 @@ |
783 | |
784 | def _verifySources(self, expected): |
785 | sources_file = apt_pkg.config.find_file("Dir::Etc::sourcelist") |
786 | - sources_list = open(sources_file).read() |
787 | + with open(sources_file) as f: |
788 | + sources_list = f.read() |
789 | for l in expected.split("\n"): |
790 | self.assertTrue( |
791 | l in sources_list.split("\n"), |
792 | |
793 | === modified file 'tests/test_xorg_fix_intrepid.py' |
794 | --- tests/test_xorg_fix_intrepid.py 2014-05-03 06:37:43 +0000 |
795 | +++ tests/test_xorg_fix_intrepid.py 2015-01-20 22:44:23 +0000 |
796 | @@ -24,18 +24,23 @@ |
797 | def test_simple(self): |
798 | shutil.copy(self.ORIG, self.NEW) |
799 | replace_driver_from_xorg("fglrx", "ati", self.NEW) |
800 | - self.assertEqual(open(self.NEW).read(), open(self.ORIG).read()) |
801 | + with open(self.NEW) as n, open(self.ORIG) as o: |
802 | + self.assertEqual(n.read(), o.read()) |
803 | |
804 | def test_remove(self): |
805 | shutil.copy(self.FGLRX, self.NEW) |
806 | - self.assertTrue("fglrx" in open(self.NEW).read()) |
807 | + with open(self.NEW) as f: |
808 | + self.assertTrue("fglrx" in f.read()) |
809 | replace_driver_from_xorg("fglrx", "ati", self.NEW) |
810 | - self.assertFalse("fglrx" in open(self.NEW).read()) |
811 | + with open(self.NEW) as f: |
812 | + self.assertFalse("fglrx" in f.read()) |
813 | |
814 | def test_omment(self): |
815 | shutil.copy(self.FGLRX, self.NEW) |
816 | comment_out_driver_from_xorg("fglrx", self.NEW) |
817 | - for line in open(self.NEW): |
818 | + with open(self.NEW) as n: |
819 | + lines = n.readlines() |
820 | + for line in lines: |
821 | if re.match('^#.*Driver.*fglrx', line): |
822 | import logging |
823 | logging.info("commented out line found") |
824 | |
825 | === modified file 'utils/update_mirrors.py' |
826 | --- utils/update_mirrors.py 2014-06-26 06:24:06 +0000 |
827 | +++ utils/update_mirrors.py 2015-01-20 22:44:23 +0000 |
828 | @@ -5,18 +5,19 @@ |
829 | |
830 | # read what we have |
831 | current_mirrors = set() |
832 | -for line in open(sys.argv[1], "r"): |
833 | - current_mirrors.add(line.strip()) |
834 | +with open(sys.argv[1], "r") as f: |
835 | + for line in f: |
836 | + current_mirrors.add(line.strip()) |
837 | |
838 | |
839 | -outfile = open(sys.argv[1], "a") |
840 | d = feedparser.parse("https://launchpad.net/ubuntu/+archivemirrors-rss") |
841 | |
842 | #import pprint |
843 | #pp = pprint.PrettyPrinter(indent=4) |
844 | #pp.pprint(d) |
845 | |
846 | -for entry in d.entries: |
847 | - for link in entry.links: |
848 | - if link.href not in current_mirrors: |
849 | - outfile.write(link.href + "\n") |
850 | +with open(sys.argv[1], "a") as outfile: |
851 | + for entry in d.entries: |
852 | + for link in entry.links: |
853 | + if link.href not in current_mirrors: |
854 | + outfile.write(link.href + "\n") |
I've searched my local archive of bug attachments, specifically sources.list files and VarLogDistupgradeAptclonesystemstate.tar.gz (which contains sources.list), and couldn't find the comments you are looking for translated at all, so that's good.
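The search described in the comment above can be sketched as a small script, assuming a directory tree of extracted sources.list attachments (the directory layout and the marker comment text are illustrative; the marker matches the English header shipped for the extras.ubuntu.com entry):

```python
import os
import sys

# Strings the upgrader removes along with the repository entries.
EXTRAS_HOST = "extras.ubuntu.com"
MARKER = "## This software is not part of Ubuntu"


def scan_sources(root):
    """Yield (path, line_no, line) for every extras.ubuntu.com entry or
    marker comment found in sources.list files under `root`."""
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if not name.startswith("sources.list"):
                continue
            path = os.path.join(dirpath, name)
            # Attachments may have odd encodings; don't crash on them.
            with open(path, errors="replace") as f:
                for i, line in enumerate(f, 1):
                    if EXTRAS_HOST in line or line.startswith(MARKER):
                        yield path, i, line.rstrip()


if __name__ == "__main__" and len(sys.argv) > 1:
    for path, i, line in scan_sources(sys.argv[1]):
        print("%s:%d: %s" % (path, i, line))
```

A run over the extracted attachments that prints no marker-comment hits outside English would confirm the comment is not translated, so no further string changes are needed.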