Merge lp:~brian-murray/apport/use-launchpad into lp:~apport-hackers/apport/trunk
Status: | Merged |
---|---|
Merged at revision: | 2971 |
Proposed branch: | lp:~brian-murray/apport/use-launchpad |
Merge into: | lp:~apport-hackers/apport/trunk |
Diff against target: | 690 lines (+421/-62), 2 files modified: backends/packaging-apt-dpkg.py (+265/-39), test/test_backend_apt_dpkg.py (+156/-23) |
To merge this branch: | bzr merge lp:~brian-murray/apport/use-launchpad |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Martin Pitt (community) | Approve | ||
Review via email: mp+259839@code.launchpad.net |
Commit message
Description of the change
This adds functionality to the install_packages routine in packaging-apt-dpkg.py to download packages from Launchpad.
Some details:
This change was necessary because the package name is passed to Launchpad:
- return out
+ return out.replace("\n", "")
I also modified setup_foonux_config in the tests to accept a release, because as far as I know only wily has ddebs in Launchpad.
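The branch's Launchpad access boils down to fetching JSON from the API and unwrapping collection responses. A minimal sketch of the parsing half (modelled on the diff's json_request helper, with the urlopen and error handling trimmed; the helper name here is illustrative):

```python
import json

def parse_lp(content, entries=False):
    # Launchpad API responses are JSON; collection endpoints such as
    # getPublishedBinaries wrap their results in an "entries" key
    if isinstance(content, bytes):
        content = content.decode('utf-8')
    data = json.loads(content)
    return data['entries'] if entries else data
```

The real json_request also wraps urlopen and turns connection or read failures into a warning plus None, as added later in r2987/r2988.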
- 2960. By Brian Murray
  restore blank line that was removed
Brian Murray (brian-murray) wrote:
- 2961. By Brian Murray
  Add in a function to check Launchpad for a source package if it doesn't exist in the archive.
Martin Pitt (pitti) wrote:
> I also modified setup_foonux_config in the tests to accept a release, because as far as I know only wily has ddebs in Launchpad.
That's not too bad, but for the short-lived releases we'll have to update that test case very often. But note that ddebs have been enabled in Launchpad globally; there is no "per release" enablement. See
https:/
Thanks for working on this!
Brian Murray (brian-murray):
- 2962. By Brian Murray
  Switch tests to using packages from Trusty instead of Wily.
- 2963. By Brian Murray
  switch to communicating with Launchpad directly and not using python-launchpadlib.
- 2964. By Brian Murray
  Stop using a hard-coded ubuntu when communicating with Launchpad.
- 2965. By Brian Murray
  remove release information from communication with Launchpad when it isn't needed.
- 2966. By Brian Murray
  Use get_os_version to determine the distro_id, since that checks /etc/os-release and will return the right information for 'Ubuntu RTM'.
- 2967. By Brian Murray
  change install_packages_armhf to request the version of libc6 from the trusty release
- 2968. By Brian Murray
  Prefer the specified package version only if there is one, and fall back to the candidate.version if the specified version isn't found in LP or isn't provided.
- 2969. By Brian Murray
  remove release and architecture arguments from get_source_tree.
Martin Pitt (pitti):
- 2970. By Brian Murray
  Better manage AcquireFile references, handle unicode in SourceFileUpload data returned from Launchpad (python2 only), improve selection of which packages are marked for install.
- 2971. By Brian Murray
  Ensure install_packages_armhf produces an obsolete message, add a test for installing a LP package version over one from the archive.
- 2972. By Brian Murray
  use strip instead of replace when determining package version.
- 2973. By Brian Murray
  split out launchpad check from has_internet check.
- 2974. By Brian Murray
  Use apt_pkg.config.find_dir to find the location to download packages from Launchpad.
- 2975. By Brian Murray
  use python-apt's AcquireFile and dpkg-source instead of dget to download and extract source packages from Launchpad.
- 2976. By Brian Murray
  fix typo when calling get_system_architecture()
Brian Murray (brian-murray) wrote:
I think this is ready for a second review.
- 2977. By Brian Murray
  stop using quote_plus, because it's not needed, and unquote source files so that downloaded files will have + in them instead of %2B, which screws up dpkg-source.
- 2978. By Brian Murray
  Also unquote the binary file url.
- 2979. By Brian Murray
  add in an arch all package to install from Launchpad
- 2980. By Brian Murray
  properly return None when no published sources are found, handle arch all binary packages on Launchpad
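The quoting problem fixed in r2977 is easy to reproduce: Debian versions often contain '+', which quote() turns into %2B, and the resulting downloaded file names then confuse dpkg-source. A small illustration (python3 names; python2 uses urllib directly):

```python
from urllib.parse import quote, unquote

version = '2.0.0+dfsg-2ubuntu1.11'
encoded = quote(version)      # the '+' is percent-encoded as '%2B'
restored = unquote(encoded)   # back to the original, safe for dpkg-source
```

So quote() is still needed when building the API query URL, but the file URLs Launchpad hands back have to be unquoted before downloading.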
Martin Pitt (pitti) wrote:
Whee, this already looks much better. I still have a few nitpicks and bugs, but this looks by and large great. I really like using the LP REST API directly, avoids all the trouble with py3-lplib.
Thank you!
- 2981. By Brian Murray
  refactor how we determine which packages to extract.
- 2982. By Brian Murray
  Fix handling of Launchpad returning unicode in some cases.
- 2983. By Brian Murray
  return the first published binary link
- 2984. By Brian Murray
  address some reviewer feedback
Brian Murray (brian-murray) wrote:
I've addressed some of the issues you've raised in r2984 and asked a question or two.
Martin Pitt (pitti) wrote:
Hey Brian,
Brian Murray [2015-06-11 0:11 -0000]:
> > + def get_distro_id(self):
> > + '''Get osname and cache the result.'''
> > +
> > + if self._distro_id is None:
> > + self._distro_id = self.get_os_version()[0].lower()
> > + if ' ' in self._distro_id:
> > + self._distro_id = self._distro_id.replace(' ', '-')
>
> I renamed distro_id to distro_name and modified the doc string. I hope that's sufficient. It could be "ubuntu" or "ubuntu-rtm".
Ah, so this is for RTM, not for concatenating distro and version or
so. Mind adding that as a comment?
> > + # packages get installed
> > + self.assertTrue
> > + 'usr/lib/
>
> It's really challenging to find any packages to test that have been SRU'ed to Trusty two times since the beginning of May, which have binaries in main, and don't have arch-specific paths. Do you have any suggestions on how we can proceed?
>
You could also just check for the package's /usr/share/ … which will always be there.
Cheers!
Martin
--
Martin Pitt | http://
Ubuntu Developer (www.ubuntu.com) | Debian Developer (www.debian.org)
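The RTM handling Martin asks to have commented ends up, per the final diff's get_distro_name, as a simple normalization of the os-release name. A standalone sketch (the function name here is illustrative; the real code also caches the result):

```python
def normalize_distro_name(osname):
    # /etc/os-release reports e.g. "Ubuntu" or "Ubuntu RTM";
    # Launchpad's API paths expect "ubuntu" or "ubuntu-rtm"
    name = osname.lower()
    if ' ' in name:
        name = name.replace(' ', '-')
    return name
```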
- 2985. By Brian Murray
  change the file we check for to one which is arch-independent.
- 2986. By Brian Murray
  add some clarifying comments.
- 2987. By Brian Murray
  Handle the case where urllib raises a URLError.
- 2988. By Brian Murray
  Also handle IOErrors reading data from Launchpad.
- 2989. By Brian Murray
  Add in a dbgsym package for testing on Launchpad.
Brian Murray (brian-murray) wrote:
Martin - I believe I have now addressed the final outstanding issues, as I've added handling for failures to connect to Launchpad and found a package in Trusty with -dbgsym for testing. I'd appreciate it if you could review it. Thanks!
Martin Pitt (pitti) wrote:
Still two inline comments which are simple to fix.
I now get three test failures from "test/run backend_apt_dpkg":
======================================================================
ERROR: test_install_old_packages
sandbox will install older package versions from launchpad
----------------------------------------------------------------------
Traceback (most recent call last):
  File "backends/…
    cache.…
  File "/usr/lib/…
    apt_…
  File "/usr/lib/…
    return self._run_…
  File "/usr/lib/…
    raise FetchFailedException(…
apt.cache.FetchFailedException:
Failed to fetch https:/…
======================================================================
ERROR: test_install_packages_from_launchpad
install_packages() using packages only available on Launchpad
----------------------------------------------------------------------
Traceback (most recent call last):
  File "backends/…
    cache.…
  File "/usr/lib/…
    apt_…
  File "/usr/lib/…
    return self._run_…
  File "/usr/lib/…
    raise FetchFailedException(…
apt.cache.FetchFailedException:
Failed to fetch https:/…
Failed to fetch https:/…
Failed to fetch https:/…
Failed to fetch https:/…
test_get_source_tree_sandbox:
E: Can not find version '20101020ubuntu…
E: Unable to find a sou...
Martin Pitt (pitti) wrote:
For the record, this happens if there is a proxy set in apt. Presumably we shouldn't use python-apt to fetch raw debs from Launchpad, as that wouldn't work through proxies?
- 2990. By Brian Murray
  Resolve possible failures when there is a proxy set in apt, by temporarily removing it when downloading source packages, and by downloading binary packages with urlopen.
Brian Murray (brian-murray) wrote:
On Tue, Jun 16, 2015 at 02:25:26PM -0000, Martin Pitt wrote:
> For the record, this happens if there is a proxy set in apt.
> Presumably we shouldn't use python-apt to fetch raw debs from
> Launchpad, as that wouldn't work through proxies?
I've resolved this by temporarily overriding the proxy configuration
when downloading source packages. For binary packages I download the
deb via urlopen and then pass the AcquireFile call a file url to the
package. This ensures that it still gets installed with all the other
packages and the file is verified via its sha1sum.
--
Brian Murray
- 2991. By Brian Murray
  Consolidate network connectivity testing so there is only one test.
Brian Murray (brian-murray) wrote:
I've fixed the two in-line comments and handled apt proxy settings as previously mentioned.
Martin Pitt (pitti) wrote:
Thanks! This looks mostly fine now. The only bug that I see now is that downloading Launchpad debs seems to create permanent symlinks in the current directory:
~/ubuntu/
lrwxrwxrwx 1 martin martin 100 Jun 17 08:34 distro-
lrwxrwxrwx 1 martin martin 109 Jun 17 08:34 oxideqt-
lrwxrwxrwx 1 martin martin 109 Jun 17 08:32 oxideqt-
lrwxrwxrwx 1 martin martin 113 Jun 17 08:34 oxideqt-
lrwxrwxrwx 1 martin martin 113 Jun 17 08:32 oxideqt-
lrwxrwxrwx 1 martin martin 105 Jun 17 08:34 qemu-utils_
The temp dirs in /tmp/ are gone, but these symlinks look like they'd cause problems if apport cannot write into the current directory, might cause problems with multiple runs if the symlinks already exist, and are clutter.
- 2992. By Brian Murray
  Fix test_backend_apt_dpkg.py to work with python2 and python3.
- 2993. By Brian Murray
  instead of trying to fetch debs from launchpad manually, just set an apt config option so that connections to launchpad bypass any apt proxies.
Martin Pitt (pitti) wrote:
Fantastic! I added the missing "direct" cache config for "launchpad.net" (plus "staging.
Great work, many thanks!
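The final fix relies on apt's per-host proxy override: a specific Acquire::http::Proxy::&lt;host&gt; entry beats the generic proxy setting, and the special value "direct" disables proxying for that host. A simplified sketch of that lookup (a plain dict stands in for apt_pkg.config; the function is illustrative):

```python
def proxy_for(host, config):
    # mimic apt's proxy selection: a host-specific entry wins over the
    # generic Acquire::http::Proxy; "direct" means no proxy at all
    specific = config.get('Acquire::http::Proxy::%s' % host)
    if specific == 'direct':
        return None
    return specific or config.get('Acquire::http::Proxy')
```

This is why the one-line `apt.apt_pkg.config.set('Acquire::http::Proxy::api.launchpad.net', 'direct')` in the diff below is enough: the rest of the fetcher machinery stays unchanged.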
Preview Diff
1 | === modified file 'backends/packaging-apt-dpkg.py' |
2 | --- backends/packaging-apt-dpkg.py 2014-12-17 22:41:30 +0000 |
3 | +++ backends/packaging-apt-dpkg.py 2015-06-17 14:38:27 +0000 |
4 | @@ -14,17 +14,21 @@ |
5 | |
6 | import subprocess, os, glob, stat, sys, tempfile, shutil, time |
7 | import hashlib |
8 | +import json |
9 | |
10 | import warnings |
11 | warnings.filterwarnings('ignore', 'apt API not stable yet', FutureWarning) |
12 | import apt |
13 | try: |
14 | import cPickle as pickle |
15 | - from urllib import urlopen |
16 | - (pickle, urlopen) # pyflakes |
17 | + from urllib import urlopen, quote, unquote |
18 | + (pickle, urlopen, quote, unquote) # pyflakes |
19 | + URLError = IOError |
20 | except ImportError: |
21 | # python 3 |
22 | + from urllib.error import URLError |
23 | from urllib.request import urlopen |
24 | + from urllib.parse import quote, unquote |
25 | import pickle |
26 | |
27 | import apport |
28 | @@ -41,6 +45,8 @@ |
29 | self._contents_dir = None |
30 | self._mirror = None |
31 | self._virtual_mapping_obj = None |
32 | + self._launchpad_base = 'https://api.launchpad.net/devel' |
33 | + self._archive_url = self._launchpad_base + '/%s/main_archive' |
34 | |
35 | def __del__(self): |
36 | try: |
37 | @@ -192,10 +198,98 @@ |
38 | return True |
39 | return False |
40 | |
41 | + def get_lp_binary_package(self, distro_id, package, version, arch): |
42 | + package = quote(package) |
43 | + version = quote(version) |
44 | + ma = self.json_request(self._archive_url % distro_id) |
45 | + if not ma: |
46 | + return (None, None) |
47 | + ma_link = ma['self_link'] |
48 | + pb_url = ma_link + ('/?ws.op=getPublishedBinaries&binary_name=%s&version=%s&exact_match=true' % |
49 | + (package, version)) |
50 | + bpub_url = '' |
51 | + try: |
52 | + pbs = self.json_request(pb_url, entries=True) |
53 | + if not pbs: |
54 | + return (None, None) |
55 | + for pb in pbs: |
56 | + if pb['architecture_specific'] == 'false': |
57 | + bpub_url = pb['self_link'] |
58 | + break |
59 | + else: |
60 | + if pb['distro_arch_series_link'].endswith(arch): |
61 | + bpub_url = pb['self_link'] |
62 | + break |
63 | + except IndexError: |
64 | + return (None, None) |
65 | + if not bpub_url: |
66 | + return (None, None) |
67 | + bf_urls = bpub_url + '?ws.op=binaryFileUrls&include_meta=true' |
68 | + bfs = self.json_request(bf_urls) |
69 | + if not bfs: |
70 | + return (None, None) |
71 | + for bf in bfs: |
72 | + # return the first binary file url since there being more than one |
73 | + # is theoretical |
74 | + return (unquote(bf['url']), bf['sha1']) |
75 | + |
76 | + def json_request(self, url, entries=False): |
77 | + '''Open, read and parse the json of a url |
78 | + |
79 | + Set entries to True when the json data returned by Launchpad |
80 | + has a dictionary with an entries key which contains the data |
81 | + desired. |
82 | + ''' |
83 | + |
84 | + try: |
85 | + response = urlopen(url) |
86 | + except URLError: |
87 | + apport.warning('cannot connect to: %s' % url) |
88 | + return None |
89 | + try: |
90 | + content = response.read() |
91 | + except IOError: |
92 | + apport.warning('failure reading data at: %s' % url) |
93 | + return None |
94 | + if isinstance(content, bytes): |
95 | + content = content.decode('utf-8') |
96 | + if entries: |
97 | + return json.loads(content)['entries'] |
98 | + else: |
99 | + return json.loads(content) |
100 | + |
101 | + def get_lp_source_package(self, distro_id, package, version): |
102 | + package = quote(package) |
103 | + version = quote(version) |
104 | + ma = self.json_request(self._archive_url % distro_id) |
105 | + if not ma: |
106 | + return None |
107 | + ma_link = ma['self_link'] |
108 | + ps_url = ma_link + ('/?ws.op=getPublishedSources&exact_match=true&source_name=%s&version=%s' % |
109 | + (package, version)) |
110 | + # use the first entry as they are sorted chronologically |
111 | + try: |
112 | + ps = self.json_request(ps_url, entries=True)[0]['self_link'] |
113 | + except IndexError: |
114 | + return None |
115 | + sf_urls = ps + '?ws.op=sourceFileUrls' |
116 | + sfus = self.json_request(sf_urls) |
117 | + if not sfus: |
118 | + return None |
119 | + |
120 | + source_files = [] |
121 | + for sfu in sfus: |
122 | + if sys.version_info.major == 2 and isinstance(sfu, unicode): |
123 | + sfu = sfu.encode('utf-8') |
124 | + sfu = unquote(sfu) |
125 | + source_files.append(sfu) |
126 | + |
127 | + return source_files |
128 | + |
129 | def get_architecture(self, package): |
130 | '''Return the architecture of a package. |
131 | |
132 | - This might differ on multiarch architectures (e. g. an i386 Firefox |
133 | + This might differ on multiarch architectures (e. g. an i386 Firefox |
134 | package on a x86_64 system)''' |
135 | |
136 | if self._apt_pkg(package).installed: |
137 | @@ -450,7 +544,32 @@ |
138 | argv[-1] += '=' + version |
139 | try: |
140 | if subprocess.call(argv, cwd=dir, env=env) != 0: |
141 | - return None |
142 | + if not version: |
143 | + return None |
144 | + sf_urls = self.get_lp_source_package(self.get_distro_name(), |
145 | + srcpackage, version) |
146 | + if sf_urls: |
147 | + proxy = '' |
148 | + if apt.apt_pkg.config.find('Acquire::http::Proxy') != '': |
149 | + proxy = apt.apt_pkg.config.find('Acquire::http::Proxy') |
150 | + apt.apt_pkg.config.set('Acquire::http::Proxy', '') |
151 | + fetchProgress = apt.progress.base.AcquireProgress() |
152 | + fetcher = apt.apt_pkg.Acquire(fetchProgress) |
153 | + af_queue = [] |
154 | + for sf in sf_urls: |
155 | + af_queue.append(apt.apt_pkg.AcquireFile(fetcher, |
156 | + sf, destdir=dir)) |
157 | + result = fetcher.run() |
158 | + if result != fetcher.RESULT_CONTINUE: |
159 | + return None |
160 | + if proxy: |
161 | + apt.apt_pkg.config.set('Acquire::http::Proxy', proxy) |
162 | + for dsc in glob.glob(os.path.join(dir, '*.dsc')): |
163 | + subprocess.call(['dpkg-source', '-sn', |
164 | + '-x', dsc], stdout=subprocess.PIPE, |
165 | + cwd=dir) |
166 | + else: |
167 | + return None |
168 | except OSError: |
169 | return None |
170 | |
171 | @@ -512,6 +631,7 @@ |
172 | break |
173 | out.write(block) |
174 | out.flush() |
175 | + out.close() |
176 | ret = subprocess.call(['dpkg', '-i', os.path.join(target_dir, deb)]) |
177 | if ret == 0: |
178 | installed.append(deb.split('_')[0]) |
179 | @@ -555,13 +675,15 @@ |
180 | package servers down, etc.), this should raise a SystemError with a |
181 | meaningful error message. |
182 | ''' |
183 | + if not architecture: |
184 | + architecture = self.get_system_architecture() |
185 | if not configdir: |
186 | apt_sources = '/etc/apt/sources.list' |
187 | self.current_release_codename = self.get_distro_codename() |
188 | else: |
189 | # support architecture specific config, fall back to global config |
190 | apt_sources = os.path.join(configdir, release, 'sources.list') |
191 | - if architecture: |
192 | + if architecture != self.get_system_architecture(): |
193 | arch_apt_sources = os.path.join(configdir, release, |
194 | architecture, 'sources.list') |
195 | if os.path.exists(arch_apt_sources): |
196 | @@ -593,11 +715,10 @@ |
197 | tmp_aptroot = True |
198 | aptroot = tempfile.mkdtemp() |
199 | |
200 | - if architecture: |
201 | - apt.apt_pkg.config.set('APT::Architecture', architecture) |
202 | - else: |
203 | - apt.apt_pkg.config.set('APT::Architecture', self.get_system_architecture()) |
204 | + apt.apt_pkg.config.set('APT::Architecture', architecture) |
205 | apt.apt_pkg.config.set('Acquire::Languages', 'none') |
206 | + # directly connect to Launchpad when downloading deb files |
207 | + apt.apt_pkg.config.set('Acquire::http::Proxy::api.launchpad.net', 'direct') |
208 | |
209 | if verbose: |
210 | fetchProgress = apt.progress.text.AcquireProgress() |
211 | @@ -614,6 +735,8 @@ |
212 | raise SystemError(str(e)) |
213 | cache.open() |
214 | |
215 | + archivedir = apt.apt_pkg.config.find_dir("Dir::Cache::archives") |
216 | + |
217 | obsolete = '' |
218 | |
219 | src_records = apt.apt_pkg.SourceRecords() |
220 | @@ -632,6 +755,10 @@ |
221 | |
222 | # mark packages for installation |
223 | real_pkgs = set() |
224 | + lp_cache = {} |
225 | + fetcher = apt.apt_pkg.Acquire(fetchProgress) |
226 | + # need to keep AcquireFile references |
227 | + acquire_queue = [] |
228 | for (pkg, ver) in packages: |
229 | try: |
230 | cache_pkg = cache[pkg] |
231 | @@ -646,7 +773,17 @@ |
232 | if ver: |
233 | cache_pkg.candidate = cache_pkg.versions[ver] |
234 | except KeyError: |
235 | - obsolete += '%s version %s required, but %s is available\n' % (pkg, ver, cache_pkg.candidate.version) |
236 | + (lp_url, sha1sum) = self.get_lp_binary_package(self.get_distro_name(), |
237 | + pkg, ver, architecture) |
238 | + if lp_url: |
239 | + acquire_queue.append(apt.apt_pkg.AcquireFile(fetcher, |
240 | + lp_url, |
241 | + md5="sha1:%s" % sha1sum, |
242 | + destdir=archivedir)) |
243 | + lp_cache[pkg] = ver |
244 | + else: |
245 | + obsolete += '%s version %s required, but %s is available\n' % (pkg, ver, cache_pkg.candidate.version) |
246 | + |
247 | candidate = cache_pkg.candidate |
248 | real_pkgs.add(pkg) |
249 | |
250 | @@ -702,14 +839,31 @@ |
251 | |
252 | if candidate.architecture != 'all': |
253 | try: |
254 | - dbg = cache[pkg + '-dbg'] |
255 | + dbg_pkg = pkg + '-dbg' |
256 | + dbg = cache[dbg_pkg] |
257 | + pkg_found = False |
258 | # try to get the same version as pkg |
259 | - try: |
260 | - dbg.candidate = dbg.versions[candidate.version] |
261 | - except KeyError: |
262 | - obsolete += 'outdated -dbg package for %s: package version %s -dbg version %s\n' % ( |
263 | - pkg, candidate.version, dbg.candidate.version) |
264 | - real_pkgs.add(pkg + '-dbg') |
265 | + if ver: |
266 | + try: |
267 | + dbg.candidate = dbg.versions[ver] |
268 | + pkg_found = True |
269 | + except KeyError: |
270 | + (lp_url, sha1sum) = self.get_lp_binary_package(self.get_distro_name(), |
271 | + dbg_pkg, ver, architecture) |
272 | + if lp_url: |
273 | + acquire_queue.append(apt.apt_pkg.AcquireFile(fetcher, |
274 | + lp_url, |
275 | + md5="sha1:%s" % sha1sum, |
276 | + destdir=archivedir)) |
277 | + lp_cache[dbg_pkg] = ver |
278 | + pkg_found = True |
279 | + if not pkg_found: |
280 | + try: |
281 | + dbg.candidate = dbg.versions[candidate.version] |
282 | + except KeyError: |
283 | + obsolete += 'outdated -dbg package for %s: package version %s -dbg version %s\n' % ( |
284 | + pkg, ver, dbg.candidate.version) |
285 | + real_pkgs.add(dbg_pkg) |
286 | except KeyError: |
287 | # install all -dbg from the source package |
288 | if src_records.lookup(candidate.source_name): |
289 | @@ -718,37 +872,82 @@ |
290 | dbgs = [] |
291 | if dbgs: |
292 | for p in dbgs: |
293 | - # try to get the same version as pkg |
294 | - try: |
295 | - cache[p].candidate = cache[p].versions[candidate.version] |
296 | - except KeyError: |
297 | - # we don't really expect that, but it's possible that |
298 | - # other binaries have a different version |
299 | - pass |
300 | + # if the package has already been added to |
301 | + # real_pkgs don't search for it again |
302 | + if p in real_pkgs: |
303 | + continue |
304 | + pkg_found = False |
305 | + # prefer the version requested |
306 | + if ver: |
307 | + try: |
308 | + cache[p].candidate = cache[p].versions[ver] |
309 | + pkg_found = True |
310 | + except KeyError: |
311 | + (lp_url, sha1sum) = self.get_lp_binary_package(self.get_distro_name(), |
312 | + p, ver, architecture) |
313 | + if lp_url: |
314 | + acquire_queue.append(apt.apt_pkg.AcquireFile(fetcher, |
315 | + lp_url, |
316 | + md5="sha1:%s" % sha1sum, |
317 | + destdir=archivedir)) |
318 | + lp_cache[p] = ver |
319 | + pkg_found = True |
320 | + if not pkg_found: |
321 | + try: |
322 | + cache[p].candidate = cache[p].versions[candidate.version] |
323 | + except KeyError: |
324 | + # we don't really expect that, but it's possible that |
325 | + # other binaries have a different version |
326 | + pass |
327 | real_pkgs.add(p) |
328 | else: |
329 | try: |
330 | - dbgsym = cache[pkg + '-dbgsym'] |
331 | - real_pkgs.add(pkg + '-dbgsym') |
332 | - try: |
333 | - dbgsym.candidate = dbgsym.versions[candidate.version] |
334 | - except KeyError: |
335 | - obsolete += 'outdated debug symbol package for %s: package version %s dbgsym version %s\n' % ( |
336 | - pkg, candidate.version, dbgsym.candidate.version) |
337 | + dbgsym_pkg = pkg + '-dbgsym' |
338 | + dbgsym = cache[dbgsym_pkg] |
339 | + real_pkgs.add(dbgsym_pkg) |
340 | + pkg_found = False |
341 | + # prefer the version requested |
342 | + if ver: |
343 | + try: |
344 | + dbgsym.candidate = dbgsym.versions[ver] |
345 | + pkg_found = True |
346 | + except KeyError: |
347 | + (lp_url, sha1sum) = self.get_lp_binary_package(self.get_distro_name(), |
348 | + dbgsym_pkg, ver, architecture) |
349 | + if lp_url: |
350 | + acquire_queue.append(apt.apt_pkg.AcquireFile(fetcher, |
351 | + lp_url, |
352 | + md5="sha1:%s" % sha1sum, |
353 | + destdir=archivedir)) |
354 | + lp_cache[dbgsym_pkg] = ver |
355 | + pkg_found = True |
356 | + if not pkg_found: |
357 | + try: |
358 | + dbgsym.candidate = dbgsym.versions[candidate.version] |
359 | + except KeyError: |
360 | + obsolete += 'outdated debug symbol package for %s: package version %s dbgsym version %s\n' % ( |
361 | + pkg, candidate.version, dbgsym.candidate.version) |
362 | except KeyError: |
363 | obsolete += 'no debug symbol package found for %s\n' % pkg |
364 | |
365 | # unpack packages, weed out the ones that are already installed (for |
366 | # permanent sandboxes) |
367 | for p in real_pkgs.copy(): |
368 | - if pkg_versions.get(p) != cache[p].candidate.version: |
369 | - cache[p].mark_install(False, False) |
370 | + if ver: |
371 | + if pkg_versions.get(p) != ver: |
372 | + cache[p].mark_install(False, False) |
373 | + elif pkg_versions.get(p) != cache[p].candidate.version: |
374 | + cache[p].mark_install(False, False) |
375 | + else: |
376 | + real_pkgs.remove(p) |
377 | else: |
378 | - real_pkgs.remove(p) |
379 | + if pkg_versions.get(p) != cache[p].candidate.version: |
380 | + cache[p].mark_install(False, False) |
381 | + else: |
382 | + real_pkgs.remove(p) |
383 | |
384 | last_written = time.time() |
385 | # fetch packages |
386 | - fetcher = apt.apt_pkg.Acquire(fetchProgress) |
387 | try: |
388 | cache.fetch_archives(fetcher=fetcher) |
389 | except apt.cache.FetchFailedException as e: |
390 | @@ -761,9 +960,22 @@ |
391 | if not permanent_rootdir or os.path.getctime(i.destfile) > last_written: |
392 | out = subprocess.check_output(['dpkg-deb', '--show', i.destfile]).decode() |
393 | (p, v) = out.strip().split() |
394 | - pkg_versions[p] = v |
395 | - subprocess.check_call(['dpkg', '-x', i.destfile, rootdir]) |
396 | - real_pkgs.remove(os.path.basename(i.destfile).split('_', 1)[0]) |
397 | + # don't extract the same version of the package if it is |
398 | + # already extracted |
399 | + if pkg_versions.get(p) == v: |
400 | + pass |
401 | + # don't extract the package if it is a different version than |
402 | + # the one we want to extract from Launchpad |
403 | + elif p in lp_cache and lp_cache[p] != v: |
404 | + pass |
405 | + else: |
406 | + subprocess.check_call(['dpkg', '-x', i.destfile, rootdir]) |
407 | + pkg_versions[p] = v |
408 | + pkg_name = os.path.basename(i.destfile).split('_', 1)[0] |
409 | + # because a package may exist multiple times in the fetcher it may |
410 | + # have already been removed |
411 | + if pkg_name in real_pkgs: |
412 | + real_pkgs.remove(pkg_name) |
413 | |
414 | # update package list |
415 | pkgs = list(pkg_versions.keys()) |
416 | @@ -1013,7 +1225,7 @@ |
417 | '''Return the version of a .deb file''' |
418 | |
419 | dpkg = subprocess.Popen(['dpkg-deb', '-f', pkg, 'Version'], stdout=subprocess.PIPE) |
420 | - out = dpkg.communicate(input)[0].decode('UTF-8') |
421 | + out = dpkg.communicate(input)[0].decode('UTF-8').strip() |
422 | assert dpkg.returncode == 0 |
423 | assert out |
424 | return out |
425 | @@ -1038,4 +1250,18 @@ |
426 | |
427 | return self._distro_codename |
428 | |
429 | + _distro_name = None |
430 | + |
431 | + def get_distro_name(self): |
432 | + '''Get osname from /etc/os-release, or if that doesn't exist, |
433 | + 'lsb_release -sir' output and cache the result.''' |
434 | + |
435 | + if self._distro_name is None: |
436 | + self._distro_name = self.get_os_version()[0].lower() |
437 | + if ' ' in self._distro_name: |
438 | + # concatenate distro name e.g. ubuntu-rtm |
439 | + self._distro_name = self._distro_name.replace(' ', '-') |
440 | + |
441 | + return self._distro_name |
442 | + |
443 | impl = __AptDpkgPackageInfo() |
444 | |
445 | === modified file 'test/test_backend_apt_dpkg.py' |
446 | --- test/test_backend_apt_dpkg.py 2015-05-28 14:01:10 +0000 |
447 | +++ test/test_backend_apt_dpkg.py 2015-06-17 14:38:27 +0000 |
448 | @@ -1,7 +1,16 @@ |
449 | import unittest, gzip, imp, subprocess, tempfile, shutil, os, os.path, time |
450 | -import glob, urllib |
451 | +import glob, sys |
452 | from apt import apt_pkg |
453 | |
454 | +try: |
455 | + from urllib import urlopen |
456 | + URLError = IOError |
457 | + (urlopen) # pyflakes |
458 | +except ImportError: |
459 | + # python3 |
460 | + from urllib.request import urlopen |
461 | + from urllib.error import URLError |
462 | + |
463 | if os.environ.get('APPORT_TEST_LOCAL'): |
464 | impl = imp.load_source('', 'backends/packaging-apt-dpkg.py').impl |
465 | else: |
466 | @@ -11,18 +20,21 @@ |
467 | def _has_internet(): |
468 | '''Return if there is sufficient network connection for the tests. |
469 | |
470 | - This checks if http://ddebs.ubuntu.com/ can be downloaded from, to check if |
471 | - we can run the online tests. |
472 | + This checks if https://api.launchpad.net/devel/ubuntu/ can be downloaded |
473 | + from, to check if we can run the online tests. |
474 | ''' |
475 | if os.environ.get('SKIP_ONLINE_TESTS'): |
476 | return False |
477 | if _has_internet.cache is None: |
478 | _has_internet.cache = False |
479 | try: |
480 | - f = urllib.request.urlopen('http://ddebs.ubuntu.com/dbgsym-release-key.asc', timeout=30) |
481 | - if f.readline().startswith(b'-----BEGIN PGP'): |
482 | + if sys.version > '3': |
483 | + f = urlopen('https://api.launchpad.net/devel/ubuntu/', timeout=30) |
484 | + else: |
485 | + f = urlopen('https://api.launchpad.net/devel/ubuntu/') |
486 | + if f.readline().startswith(b'{"all_specifications'): |
487 | _has_internet.cache = True |
488 | - except (IOError, urllib.error.URLError): |
489 | + except URLError: |
490 | pass |
491 | return _has_internet.cache |
492 | |
493 | @@ -807,11 +819,11 @@ |
494 | self._setup_foonux_config() |
495 | obsolete = impl.install_packages(self.rootdir, self.configdir, 'Foonux 1.2', |
496 | [('coreutils', '8.21-1ubuntu5'), |
497 | - ('libc6', '2.19-0ubuntu5'), |
498 | + ('libc6', '2.19-0ubuntu0'), |
499 | ], False, self.cachedir, |
500 | architecture='armhf') |
501 | |
502 | - self.assertEqual(obsolete, 'libc6 version 2.19-0ubuntu5 required, but 2.19-0ubuntu6 is available\n') |
503 | + self.assertEqual(obsolete, 'libc6 version 2.19-0ubuntu0 required, but 2.19-0ubuntu6 is available\n') |
504 | |
505 | self.assertTrue(os.path.exists(os.path.join(self.rootdir, |
506 | 'usr/bin/stat'))) |
507 | @@ -826,6 +838,113 @@ |
508 | self.assertTrue('libc6_2.19-0ubuntu6_armhf.deb' in cache, cache) |
509 | |
510 | @unittest.skipUnless(_has_internet(), 'online test') |
511 | + def test_install_packages_from_launchpad(self): |
512 | + '''install_packages() using packages only available on Launchpad''' |
513 | + |
514 | + self._setup_foonux_config() |
515 | + obsolete = impl.install_packages(self.rootdir, self.configdir, 'Foonux 1.2', |
516 | + [('oxideqt-codecs', |
517 | + '1.6.6-0ubuntu0.14.04.1'), |
518 | + ('distro-info-data', |
519 | + '0.18ubuntu0.2'), |
520 | + ('qemu-utils', |
521 | + '2.0.0+dfsg-2ubuntu1.11') |
522 | + ], False, self.cachedir) |
523 | + |
524 | + def sandbox_ver(pkg, debian=True): |
525 | + if debian: |
526 | + changelog = 'changelog.Debian.gz' |
527 | + else: |
528 | + changelog = 'changelog.gz' |
529 | + with gzip.open(os.path.join(self.rootdir, 'usr/share/doc', pkg, |
530 | + changelog)) as f: |
531 | + return f.readline().decode().split()[1][1:-1] |
532 | + |
533 | + self.assertEqual(obsolete, '') |
534 | + |
535 | + # packages get installed |
536 | + self.assertTrue(os.path.exists(os.path.join(self.rootdir, |
537 | + 'usr/share/doc/oxideqt-codecs/copyright'))) |
538 | + self.assertTrue(os.path.exists(os.path.join(self.rootdir, |
539 | + 'usr/share/distro-info/ubuntu.csv'))) |
540 | + |
541 | + # their versions are as expected |
542 | + self.assertEqual(sandbox_ver('oxideqt-codecs'), |
543 | + '1.6.6-0ubuntu0.14.04.1') |
544 | + self.assertEqual(sandbox_ver('oxideqt-codecs-dbg'), |
545 | + '1.6.6-0ubuntu0.14.04.1') |
546 | + self.assertEqual(sandbox_ver('distro-info-data', debian=False), |
547 | + '0.18ubuntu0.2') |
548 | + |
549 | + # keeps track of package versions |
550 | + with open(os.path.join(self.rootdir, 'packages.txt')) as f: |
551 | + pkglist = f.read().splitlines() |
552 | + self.assertIn('oxideqt-codecs 1.6.6-0ubuntu0.14.04.1', pkglist) |
553 | + self.assertIn('oxideqt-codecs-dbg 1.6.6-0ubuntu0.14.04.1', pkglist) |
554 | + self.assertIn('distro-info-data 0.18ubuntu0.2', pkglist) |
555 | + self.assertIn('qemu-utils-dbgsym 2.0.0+dfsg-2ubuntu1.11', |
556 | + pkglist) |
557 | + |
558 | + # caches packages, and their versions are as expected |
559 | + cache = os.listdir(os.path.join(self.cachedir, 'Foonux 1.2', 'apt', |
560 | + 'var', 'cache', 'apt', 'archives')) |
561 | + |
562 | + # archive and launchpad versions of packages exist in the cache, so use a list |
563 | + cache_versions = [] |
564 | + for p in cache: |
565 | + try: |
566 | + (name, ver) = p.split('_')[:2] |
567 | + cache_versions.append((name, ver)) |
568 | + except ValueError: |
569 | + pass # not a .deb, ignore |
570 | + self.assertIn(('oxideqt-codecs', '1.6.6-0ubuntu0.14.04.1'), cache_versions) |
571 | + self.assertIn(('oxideqt-codecs-dbg', '1.6.6-0ubuntu0.14.04.1'), cache_versions) |
572 | + self.assertIn(('distro-info-data', '0.18ubuntu0.2'), cache_versions) |
573 | + self.assertIn(('qemu-utils-dbgsym', '2.0.0+dfsg-2ubuntu1.11'), cache_versions) |
574 | + |
575 | + @unittest.skipUnless(_has_internet(), 'online test') |
576 | + def test_install_old_packages(self): |
577 | + '''sandbox will install older package versions from launchpad''' |
578 | + |
579 | + self._setup_foonux_config() |
580 | + obsolete = impl.install_packages(self.rootdir, self.configdir, 'Foonux 1.2', |
581 | + [('oxideqt-codecs', |
582 | + '1.7.8-0ubuntu0.14.04.1'), |
583 | + ], False, self.cachedir) |
584 | + |
585 | + self.assertEqual(obsolete, '') |
586 | + |
587 | + def sandbox_ver(pkg): |
588 | + with gzip.open(os.path.join(self.rootdir, 'usr/share/doc', pkg, |
589 | + 'changelog.Debian.gz')) as f: |
590 | + return f.readline().decode().split()[1][1:-1] |
591 | + |
592 | + # the version is as expected |
593 | + self.assertEqual(sandbox_ver('oxideqt-codecs'), |
594 | + '1.7.8-0ubuntu0.14.04.1') |
595 | + |
596 | + # keeps track of package version |
597 | + with open(os.path.join(self.rootdir, 'packages.txt')) as f: |
598 | + pkglist = f.read().splitlines() |
599 | + self.assertIn('oxideqt-codecs 1.7.8-0ubuntu0.14.04.1', pkglist) |
600 | + |
601 | + obsolete = impl.install_packages(self.rootdir, self.configdir, 'Foonux 1.2', |
602 | + [('oxideqt-codecs', |
603 | + '1.6.6-0ubuntu0.14.04.1'), |
604 | + ], False, self.cachedir) |
605 | + |
606 | + self.assertEqual(obsolete, '') |
607 | + |
608 | + # the old version is installed |
609 | + self.assertEqual(sandbox_ver('oxideqt-codecs'), |
610 | + '1.6.6-0ubuntu0.14.04.1') |
611 | + |
612 | + # the old version is tracked |
613 | + with open(os.path.join(self.rootdir, 'packages.txt')) as f: |
614 | + pkglist = f.read().splitlines() |
615 | + self.assertIn('oxideqt-codecs 1.6.6-0ubuntu0.14.04.1', pkglist) |
616 | + |
617 | + @unittest.skipUnless(_has_internet(), 'online test') |
618 | def test_get_source_tree_sandbox(self): |
619 | self._setup_foonux_config() |
620 | out_dir = os.path.join(self.workdir, 'out') |
621 | @@ -839,7 +958,21 @@ |
622 | self.assertTrue(res.endswith('/base-files-7.2ubuntu5'), |
623 | 'unexpected version: ' + res.split('/')[-1]) |
624 | |
625 | - def _setup_foonux_config(self, updates=False): |
626 | + @unittest.skipUnless(_has_internet(), 'online test') |
627 | + def test_get_source_tree_lp_sandbox(self): |
628 | + self._setup_foonux_config() |
629 | + out_dir = os.path.join(self.workdir, 'out') |
630 | + os.mkdir(out_dir) |
631 | + impl._build_apt_sandbox(self.rootdir, os.path.join(self.configdir, 'Foonux 1.2', 'sources.list')) |
632 | + res = impl.get_source_tree('debian-installer', out_dir, version='20101020ubuntu318.16', |
633 | + sandbox=self.rootdir, apt_update=True) |
634 | + self.assertTrue(os.path.isdir(os.path.join(res, 'debian'))) |
635 | + # this needs to be updated when the release in _setup_foonux_config |
636 | + # changes |
637 | + self.assertTrue(res.endswith('/debian-installer-20101020ubuntu318.16'), |
638 | + 'unexpected version: ' + res.split('/')[-1]) |
639 | + |
640 | + def _setup_foonux_config(self, updates=False, release='trusty'): |
641 | '''Set up directories and configuration for install_packages()''' |
642 | |
643 | self.cachedir = os.path.join(self.workdir, 'cache') |
644 | @@ -850,24 +983,24 @@ |
645 | os.mkdir(self.configdir) |
646 | os.mkdir(os.path.join(self.configdir, 'Foonux 1.2')) |
647 | with open(os.path.join(self.configdir, 'Foonux 1.2', 'sources.list'), 'w') as f: |
648 | - f.write('deb http://archive.ubuntu.com/ubuntu/ trusty main\n') |
649 | - f.write('deb-src http://archive.ubuntu.com/ubuntu/ trusty main\n') |
650 | - f.write('deb http://ddebs.ubuntu.com/ trusty main\n') |
651 | + f.write('deb http://archive.ubuntu.com/ubuntu/ %s main\n' % release) |
652 | + f.write('deb-src http://archive.ubuntu.com/ubuntu/ %s main\n' % release) |
653 | + f.write('deb http://ddebs.ubuntu.com/ %s main\n' % release) |
654 | if updates: |
655 | - f.write('deb http://archive.ubuntu.com/ubuntu/ trusty-updates main\n') |
656 | - f.write('deb-src http://archive.ubuntu.com/ubuntu/ trusty-updates main\n') |
657 | - f.write('deb http://ddebs.ubuntu.com/ trusty-updates main\n') |
658 | + f.write('deb http://archive.ubuntu.com/ubuntu/ %s-updates main\n' % release) |
659 | + f.write('deb-src http://archive.ubuntu.com/ubuntu/ %s-updates main\n' % release) |
660 | + f.write('deb http://ddebs.ubuntu.com/ %s-updates main\n' % release) |
661 | os.mkdir(os.path.join(self.configdir, 'Foonux 1.2', 'armhf')) |
662 | with open(os.path.join(self.configdir, 'Foonux 1.2', 'armhf', 'sources.list'), 'w') as f: |
663 | - f.write('deb http://ports.ubuntu.com/ trusty main\n') |
664 | - f.write('deb-src http://ports.ubuntu.com/ trusty main\n') |
665 | - f.write('deb http://ddebs.ubuntu.com/ trusty main\n') |
666 | + f.write('deb http://ports.ubuntu.com/ %s main\n' % release) |
667 | + f.write('deb-src http://ports.ubuntu.com/ %s main\n' % release) |
668 | + f.write('deb http://ddebs.ubuntu.com/ %s main\n' % release) |
669 | if updates: |
670 | - f.write('deb http://ports.ubuntu.com/ trusty-updates main\n') |
671 | - f.write('deb-src http://ports.ubuntu.com/ trusty-updates main\n') |
672 | - f.write('deb http://ddebs.ubuntu.com/ trusty-updates main\n') |
673 | + f.write('deb http://ports.ubuntu.com/ %s-updates main\n' % release) |
674 | + f.write('deb-src http://ports.ubuntu.com/ %s-updates main\n' % release) |
675 | + f.write('deb http://ddebs.ubuntu.com/ %s-updates main\n' % release) |
676 | with open(os.path.join(self.configdir, 'Foonux 1.2', 'codename'), 'w') as f: |
677 | - f.write('trusty') |
678 | + f.write('%s' % release) |
679 | |
680 | def assert_elf_arch(self, path, expected): |
681 | '''Assert that an ELF file is for an expected machine type. |
682 | @@ -891,7 +1024,7 @@ |
683 | machine = line.split(maxsplit=1)[1] |
684 | break |
685 | else: |
686 | - self.fail('could not fine Machine: in readelf output') |
687 | + self.fail('could not find Machine: in readelf output') |
688 | |
689 | self.assertTrue(archmap[expected] in machine, |
690 | '%s has unexpected machine type "%s" for architecture %s' % ( |
Testing this on some apport retracers in canonistack, it looks like get_source_tree() also needs to support fetching the source package from Launchpad. I received the following error when trying to retrace an xeyes crash with version 7.7+4:
Installing extra package x11-apps to get ExecutablePath
E: Ignore unavailable version '7.7+4' of package 'x11-apps'
E: Unable to find a source package for x11-apps
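For reference, here is a minimal sketch of one way get_source_tree() could fall back to Launchpad when apt no longer carries the requested source version. The helper names (lp_source_query_url, fetch_source_file_urls) are hypothetical and not part of this branch; the getPublishedSources and sourceFileUrls web-service operations are part of Launchpad's public API, but the exact integration point into get_source_tree() would still need to be worked out.

```python
# Sketch: look up a superseded source publication on Launchpad.
# Helper names are illustrative only; getPublishedSources/sourceFileUrls
# are Launchpad API operations on an archive / source publication.
import json
import urllib.request
from urllib.parse import urlencode

LP_API = 'https://api.launchpad.net/1.0'


def lp_source_query_url(srcpackage, version, distro='ubuntu'):
    '''Build the Launchpad API URL for an exact source publication match.'''
    params = urlencode({
        'ws.op': 'getPublishedSources',
        'source_name': srcpackage,
        'version': version,
        'exact_match': 'true',
    })
    return '%s/%s/+archive/primary?%s' % (LP_API, distro, params)


def fetch_source_file_urls(srcpackage, version):
    '''Return the .dsc/tarball URLs for a published source version,
    or [] if Launchpad has no record of it.'''
    with urllib.request.urlopen(lp_source_query_url(srcpackage, version)) as f:
        entries = json.load(f)['entries']
    if not entries:
        return []
    # the publication's self_link supports the sourceFileUrls operation
    url = entries[0]['self_link'] + '?ws.op=sourceFileUrls'
    with urllib.request.urlopen(url) as f:
        return json.load(f)
```

The returned file URLs could then be downloaded and unpacked with `dpkg-source -x`, mirroring what the apt path already does for versions still in the archive.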