Merge ppa-dev-tools:add_rdepends_argument into ppa-dev-tools:main
Status: Merged
Merge reported by: Bryce Harrington
Merged at revision: 565c3c6342b9f5c57808b7d527daeb1b555070c2
Proposed branch: ppa-dev-tools:add_rdepends_argument
Merge into: ppa-dev-tools:main
Diff against target: 797 lines (+318/-87), 11 files modified:
NEWS.md (+16/-0), ppa/constants.py (+17/-0), ppa/ppa.py (+32/-0), ppa/repository.py (+9/-2), ppa/suite.py (+78/-49), ppa/trigger.py (+11/-3), scripts/ppa (+65/-22), tests/helpers.py (+17/-0), tests/test_scripts_ppa.py (+50/-1), tests/test_suite.py (+8/-7), tests/test_trigger.py (+15/-3)
Related bugs:
Reviewer | Review Type | Date Requested | Status
---|---|---|---
Sergio Durigan Junior (community) | | | Approve
Canonical Server | | | Pending
Canonical Server Reporter | | | Pending

Review via email: mp+441377@code.launchpad.net
Commit message
Description of the change
This adds the --show-rdepends feature to the tests command, making it list trigger URLs for the reverse dependencies of package(s) in a given PPA. This relies on a local mirror of the apt repository being stored on your system at /tmp/ubuntu, which can be created like this:
$ mkdir /tmp/ubuntu
$ rsync -va \
    --exclude={'*/installer*','*/i18n/*','*/uefi/*','*/Contents*','*/by-hash/*','*tar.gz'} \
    rsync://archive.ubuntu.com/ubuntu/dists /tmp/ubuntu
This takes a few minutes to run, and generates a sparse mirror (i.e. you can't use it for other Apt needs). A full mirror (via apt-mirror, or rsync without all the excludes) would also work just fine.
(The future plan is to integrate "lazy mirroring" into the tool itself, so the above steps would become unnecessary, but for this branch's initial implementation of the feature a manual mirror is required.)
With the mirror in place, the actual running of the command just requires adding the option, for example:
$ ./scripts/ppa tests --show-rdepends --architectures amd64 https:/
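The reverse-dependency computation this branch adds (see ppa/suite.py in the preview diff) boils down to inverting Build-Depends fields into an rdepends table, registering a source package under every alternative of an "a | b" dependency. A minimal sketch of that inversion, with hypothetical package names and hand-written stanza strings standing in for the real apt_pkg.TagFile parsing:

```python
def build_rdepends_table(sources: dict[str, str]) -> dict[str, list[str]]:
    """Map each binary package name to the source packages that
    build-depend on it.  An alternative dependency ("awk | mawk")
    registers the source under every alternative, as the branch does."""
    table: dict[str, list[str]] = {}
    for source_name, build_depends in sources.items():
        for clause in build_depends.split(','):
            # Each comma-separated clause may hold '|'-separated alternatives.
            for alt in clause.split('|'):
                parts = alt.strip().split()  # drop version constraints like (>= 13)
                if parts:
                    table.setdefault(parts[0], []).append(source_name)
    return table

# Hypothetical stanzas for illustration (not real archive data):
example_sources = {
    'mod-example': 'debhelper (>= 13), websrv-dev',
    'awk-consumer': 'awk | mawk',
}
table = build_rdepends_table(example_sources)
print(table['websrv-dev'])        # sources that build-depend on websrv-dev
print(table['awk'], table['mawk'])  # the alternative registers under both
```
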
Testing is done as usual:
$ make check
$ pytest-3
$ python3 -m ppa.repository
$ python3 -m ppa.suite
$ python3 -m ppa.source_package
$ python3 -m ppa.binary_package
$ python3 -m ppa.trigger
Bryce Harrington (bryce) wrote:
Again, after tests have finished, and with --show-urls added:
$ ./scripts/ppa tests --show-urls --show-rdepends --architectures amd64 https:/
* Triggers:
- Source apache2/
+ apache2@amd64: https:/
+ libembperl-
+ passenger@amd64: https:/
+ libapache2-
+ libsoup2.4@amd64: https:/
+ libsoup3@amd64: https:/
+ gnome-user-
+ libapache2-
+ libapache2-
+ mod-gnutls@amd64: https:/
+ golang-
+ mathjax-
+ pycsw@amd64: https:/
+ apache-
+ apache2-
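Each line in the listing above is an autopkgtest request URL for testing one reverse dependency with the PPA's package as the trigger. A sketch of how such a URL can be assembled: the release/package/arch parameter names appear in the diff's trigger.py, while the request.cgi path and the trigger/ppa parameters are assumptions based on the autopkgtest.ubuntu.com request form, and the version and PPA values below are made up:

```python
from urllib.parse import urlencode

URL_AUTOPKGTEST = "https://autopkgtest.ubuntu.com"

def action_url(series, arch, test_package, trigger_pkg, trigger_ver, ppa=None):
    """Build a request URL that runs test_package's autopkgtests with
    trigger_pkg/trigger_ver as the trigger (shape assumed, see lead-in)."""
    params = [
        ("release", series),
        ("package", test_package),
        ("arch", arch),
        ("trigger", f"{trigger_pkg}/{trigger_ver}"),
    ]
    if ppa:
        params.append(("ppa", ppa))
    return f"{URL_AUTOPKGTEST}/request.cgi?{urlencode(params)}"

# e.g. test the rdepend "passenger" against a hypothetical apache2 upload:
url = action_url("jammy", "amd64", "passenger", "apache2", "2.4.55-1",
                 ppa="bryce/example-ppa")
print(url)
```
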
Sergio Durigan Junior (sergiodj) wrote:
I looked very briefly and have a few comments, but I'll take a better look tomorrow.
Sergio Durigan Junior (sergiodj) wrote:
First of all, thank you very much for implementing this feature. It is indeed one of the most useful parts of bileto, and I'd love to be able to use it from inside my Emacs instead of having to open a browser ;-).
Without getting into the merits of the implementation itself, I'd like to discuss the approach you've taken to grab a copy of the archive metadata needed for the rdep processing. I don't mind having to run rsync manually if needed, but I'm wondering if there are better, even simpler options out there.
For example, I know that Ubuntu Wire maintains http://
Another option I've used in the past (and implemented here: https:/
The main point here is that I believe this metadata fetching could be further automated, which would be even more awesome. WDYT?
Bryce Harrington (bryce) wrote:
Hi Sergio, thanks for reviewing.
Indeed, this is one of the next items on the todo list after this MP is landed. This branch should be considered the proof-of-concept for the basic functionality, to be followed with some convenience improvements and optimizations.
I don't know if you remember, but over the last few weeks I've researched some caching and mirroring options and tools. In particular I was looking at caching at the urllib/http level with Python modules, though I also looked at apt-mirror and other Debian tools. I found one option for the former that I want to explore more, but the latter options looked like overkill and were dependency-heavy. I didn't look at chdist though, so I'll take a peek just in case. The basic idea I have as a solution is to integrate "lazy mirroring": instead of directly loading Sources.xz and Packages.xz from disk, if what's on disk is older than N hours, the tool automatically re-downloads and re-caches the file (and ONLY the requested file) on the fly. From my testing so far, on a decent network connection that should be quite performant.
I was hoping to slip that feature work in before Prague (and have at least started sketching it out on another branch), but I'm realizing there are some caveats I need to account for, and time is getting short, so I decided to just go with the (reliable) rsync for now and make this a next-release enhancement. There are a couple of other convenience improvements I plan to make too, and maybe other folks will have ideas to add as well.
I hadn't looked at chdist, but I did look into apt-mirror and some other Debian and Python tools for mirroring apt, and into general HTTP client-side cache management. There's one cache manager module I want to look into a bit more, but the tools I played with were a bit too heavyweight for what I need and added dependencies.
Anyway, yes, I agree that more automation of the metadata fetching would indeed be awesome, and I plan to do it, but it may need to wait for the next release.
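The lazy-mirroring idea discussed above — re-download an index file only when the on-disk copy is older than N hours — could start with a staleness check like the following. This is a minimal sketch: the function names, the 24-hour default, and the plain urlretrieve fetch are assumptions, not code from this branch.

```python
import os
import time
import urllib.request

def is_stale(path: str, max_age_hours: float) -> bool:
    """True if the cached file is missing or older than max_age_hours."""
    try:
        age = time.time() - os.path.getmtime(path)
    except OSError:
        return True  # missing file counts as stale
    return age > max_age_hours * 3600

def fetch_if_stale(url: str, path: str, max_age_hours: float = 24) -> str:
    """Re-download ONLY the requested index file when the cache is stale."""
    if is_stale(path, max_age_hours):
        os.makedirs(os.path.dirname(path) or '.', exist_ok=True)
        urllib.request.urlretrieve(url, path)  # hypothetical fetch step
    return path
```
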
Sergio Durigan Junior (sergiodj) wrote:
Thanks for the detailed explanation, Bryce.
Yes, absolutely no problem to leave this further automation for the next release. I'm glad that we're on the same page on this topic!
I'm leaving a few cosmetic comments below, but otherwise the code looks good to me, so I'm approving the MP. But I have a few other general comments.
- I set up a container in order to fully test the package (including installing it), and I stumbled upon a few problems with missing requirements from the requirements-
- When I tried running the "tests --show-rdepends" command against a public PPA, it asked me to log into LP. This shouldn't really be necessary, unless dealing with private PPAs and such. It's not something really important to address right now (I understand that you have other things on your plate), but it's something to keep in mind IMHO.
Thanks again for working on this.
Bryce Harrington (bryce) wrote:
> Thanks for the detailed explanation, Bryce.
>
> Yes, absolutely no problem to leave this further automation for the next
> release. I'm glad that we're on the same page on this topic!
>
> I'm leaving a few cosmetic comments below, but otherwise the code looks good
> to me, so I'm approving the MP. But I have a few other general comments.
>
> - I set up a container in order to fully test the package (including
> installing it), and I stumbled upon a few problems with missing requirements
> from the requirements-
If you can shoot me either the steps you followed or the changed requirements, it'd be appreciated as I prepare the release. I posted an MP the other day with some packaging work that includes adding a missing dependency or two, but there could be more. I also found that running tox on the codebase scared up some requirements issues that I will try to look at, if not for this release then for a future one. Getting tox to pass without issue would be very nice, and would probably take care of the issues you hit here.
> - When I tried running the "tests --show-rdepends" command against a public
> PPA, it asked me to log into LP. This shouldn't really be necessary, unless
> dealing with private PPAs and such. It's not something really important to
> address right now (I understand that you have other things on your plate), but
> it's something to keep in mind IMHO.
Noted, I'll put this on the list to work on later.
I know exactly what the problem is: when constructing the Launchpad service you specify what level of permissions you need, and I currently ask for read/write privileges across the board regardless of what the individual command actually needs. Fixing it so that commands only request the privilege level they need will take a bit of plumbing; however, as an outgrowth of some of the experimentation I've done with qastaging, I'm realizing I need a more sophisticated system for selecting Launchpad service API versions, permissions, and so on.
> Thanks again for working on this.
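The fix described above — requesting only the privilege level a command needs — might start with a per-command mapping like the following. The command split and the helper are hypothetical sketches, with the corresponding launchpadlib calls shown only in comments:

```python
# Hypothetical split between commands that only read public data and
# commands that modify PPAs (the real tool's command set may differ).
READ_ONLY_COMMANDS = {'tests', 'status', 'wait'}

def login_mode(command: str) -> str:
    """Pick the Launchpad login mode appropriate for a command."""
    return 'anonymous' if command in READ_ONLY_COMMANDS else 'authenticated'

# With launchpadlib this would map to (not executed here):
#   Launchpad.login_anonymously('ppa-dev-tools', 'production', version='devel')
#   Launchpad.login_with('ppa-dev-tools', 'production', version='devel')

print(login_mode('tests'))  # a read-only command needs no LP login
```
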
Bryce Harrington (bryce):
Bryce Harrington (bryce) wrote:
Enumerating objects: 86, done.
Counting objects: 100% (86/86), done.
Delta compression using up to 12 threads
Compressing objects: 100% (67/67), done.
Writing objects: 100% (71/71), 13.37 KiB | 4.46 MiB/s, done.
Total 71 (delta 54), reused 0 (delta 0), pack-reused 0
To git+ssh:
d71ef8d..daf0adc main -> main
Preview Diff
1 | diff --git a/NEWS.md b/NEWS.md |
2 | index f40bac0..32948bd 100644 |
3 | --- a/NEWS.md |
4 | +++ b/NEWS.md |
5 | @@ -1,3 +1,19 @@ |
6 | +# Unreleased # |
7 | + |
8 | +Reverse dependencies, build dependencies, and installation dependencies |
9 | +can be identified for a given source package using cached APT |
10 | +information. This list of packages will be used to generate lists of |
11 | +autopkgtest triggers, which when run should help identify issues that |
12 | +could get flagged in Britney2 runs. While similar to functionality |
13 | +provided by Bileto+Britney2, it is a lighterweight facsimile which |
14 | +doesn't handle special cases so should not be considered an equivalent, |
15 | +just as a preliminary screen to catch basic issues. |
16 | + |
17 | +The location of the cache can be set in a config file or via a cli |
18 | +argument, but by default will store under the user's ~/.config/ |
19 | +directory. |
20 | + |
21 | + |
22 | # 0.3.0 Release # |
23 | |
24 | Autopkgtest trigger action URLs are printed for packages in the PPA when |
25 | diff --git a/ppa/constants.py b/ppa/constants.py |
26 | index 4270392..2d37539 100644 |
27 | --- a/ppa/constants.py |
28 | +++ b/ppa/constants.py |
29 | @@ -19,3 +19,20 @@ ARCHES_AUTOPKGTEST = ["amd64", "arm64", "armhf", "i386", "ppc64el", "s390x"] |
30 | |
31 | URL_LPAPI = "https://api.launchpad.net/devel" |
32 | URL_AUTOPKGTEST = "https://autopkgtest.ubuntu.com" |
33 | + |
34 | +DISTRO_UBUNTU_COMPONENTS = ['main', 'restricted', 'universe', 'multiverse'] |
35 | + |
36 | +DISTRO_UBUNTU_POCKETS = ['release', 'security', 'proposed', 'updates', 'backports'] |
37 | +DISTRO_UBUNTU_POCKETS_UPDATES = ['release', 'security', 'updates'] |
38 | + |
39 | +LOCAL_REPOSITORY_PATH = "/tmp/ubuntu" |
40 | +LOCAL_REPOSITORY_MIRRORING_DIRECTIONS = f""" |
41 | +Tip: You can generate (and refresh) a dists-only mirror thusly:") |
42 | + $ mkdir {LOCAL_REPOSITORY_PATH} |
43 | + $ rsync -va \\ |
44 | + --exclude={{'*/installer*','*/i18n/*','*/uefi/*','*/Contents*','*/by-hash/*','*tar.gz'}} \\ |
45 | + rsync://archive.ubuntu.com/ubuntu/dists {LOCAL_REPOSITORY_PATH} |
46 | + |
47 | +It's recommended to run the rsync command as a cronjob to keep your |
48 | +repository up to date as often as desired. |
49 | +""" |
50 | diff --git a/ppa/ppa.py b/ppa/ppa.py |
51 | index 1536b7d..aa4fd7c 100755 |
52 | --- a/ppa/ppa.py |
53 | +++ b/ppa/ppa.py |
54 | @@ -16,6 +16,10 @@ import sys |
55 | from functools import lru_cache |
56 | from lazr.restfulclient.errors import BadRequest, NotFound |
57 | |
58 | +from .constants import URL_AUTOPKGTEST |
59 | +from .io import open_url |
60 | +from .job import (get_waiting, get_running) |
61 | + |
62 | |
63 | class PpaDoesNotExist(BaseException): |
64 | """Exception indicating a requested PPA could not be found.""" |
65 | @@ -376,6 +380,34 @@ class Ppa: |
66 | print("Successfully published all builds for all architectures") |
67 | return retval |
68 | |
69 | + def get_autopkgtest_waiting(self, releases): |
70 | + """Returns iterator of queued autopkgtests for this PPA. |
71 | + |
72 | + See get_waiting() for details |
73 | + |
74 | + :param list[str] releases: The Ubuntu series codename(s), or None. |
75 | + :rtype: Iterator[Job] |
76 | + :returns: Currently waiting jobs, if any, or an empty list on error |
77 | + """ |
78 | + response = open_url(f"{URL_AUTOPKGTEST}/queues.json", "waiting autopkgtests") |
79 | + if response: |
80 | + return get_waiting(response, releases=releases, ppa=str(self)) |
81 | + return [] |
82 | + |
83 | + def get_autopkgtest_running(self, releases): |
84 | + """Returns iterator of queued autopkgtests for this PPA. |
85 | + |
86 | + See get_running() for details |
87 | + |
88 | + :param list[str] releases: The Ubuntu series codename(s), or None. |
89 | + :rtype: Iterator[Job] |
90 | + :returns: Currently running jobs, if any, or an empty list on error |
91 | + """ |
92 | + response = open_url(f"{URL_AUTOPKGTEST}/static/running.json", "running autopkgtests") |
93 | + if response: |
94 | + return get_running(response, releases=releases, ppa=str(self)) |
95 | + return [] |
96 | + |
97 | |
98 | def ppa_address_split(ppa_address, default_team=None): |
99 | """Parse an address for a PPA into its team and name components. |
100 | diff --git a/ppa/repository.py b/ppa/repository.py |
101 | index ccd5211..f71b06c 100644 |
102 | --- a/ppa/repository.py |
103 | +++ b/ppa/repository.py |
104 | @@ -15,6 +15,10 @@ import os.path |
105 | from functools import lru_cache |
106 | |
107 | from .suite import Suite |
108 | +from .constants import ( |
109 | + LOCAL_REPOSITORY_PATH, |
110 | + LOCAL_REPOSITORY_MIRRORING_DIRECTIONS |
111 | +) |
112 | |
113 | |
114 | class Repository: |
115 | @@ -31,6 +35,8 @@ class Repository: |
116 | """ |
117 | if not cache_dir: |
118 | raise ValueError("undefined cache_dir.") |
119 | + if not os.path.exists(cache_dir): |
120 | + raise FileNotFoundError(f"could not find cache dir '{cache_dir}'") |
121 | |
122 | self.cache_dir = cache_dir |
123 | |
124 | @@ -82,14 +88,15 @@ if __name__ == "__main__": |
125 | from pprint import PrettyPrinter |
126 | pp = PrettyPrinter(indent=4) |
127 | |
128 | + from .debug import error |
129 | + |
130 | print('#########################') |
131 | print('## PpaGroup smoke test ##') |
132 | print('#########################') |
133 | |
134 | - LOCAL_REPOSITORY_PATH = "/var/spool/apt-mirror/skel/archive.ubuntu.com/ubuntu" |
135 | local_dists_path = os.path.join(LOCAL_REPOSITORY_PATH, "dists") |
136 | if not os.path.exists(local_dists_path): |
137 | - print("Error: Missing checkout") |
138 | + error(f'Missing checkout for smoketest\n{LOCAL_REPOSITORY_MIRRORING_DIRECTIONS}') |
139 | sys.exit(1) |
140 | repository = Repository(cache_dir=local_dists_path) |
141 | for suite in repository.suites.values(): |
142 | diff --git a/ppa/suite.py b/ppa/suite.py |
143 | index cbded7a..6e00006 100644 |
144 | --- a/ppa/suite.py |
145 | +++ b/ppa/suite.py |
146 | @@ -19,6 +19,12 @@ import apt_pkg |
147 | |
148 | from .source_package import SourcePackage |
149 | from .binary_package import BinaryPackage |
150 | +from .constants import ( |
151 | + DISTRO_UBUNTU_COMPONENTS, |
152 | + DISTRO_UBUNTU_POCKETS, |
153 | + LOCAL_REPOSITORY_PATH, |
154 | + LOCAL_REPOSITORY_MIRRORING_DIRECTIONS |
155 | +) |
156 | |
157 | |
158 | class Suite: |
159 | @@ -40,6 +46,8 @@ class Suite: |
160 | raise ValueError('undefined suite_name.') |
161 | if not cache_dir: |
162 | raise ValueError('undefined cache_dir.') |
163 | + if not os.path.exists(cache_dir): |
164 | + raise FileNotFoundError(f"could not find cache dir '{cache_dir}'") |
165 | |
166 | self._suite_name = suite_name |
167 | self._cache_dir = cache_dir |
168 | @@ -69,15 +77,28 @@ class Suite: |
169 | def _rebuild_lookup_tables(self) -> bool: |
170 | """Regenerates the provides and rdepends lookup tables. |
171 | |
172 | + Some packages have build dependence that can be satisfied by one |
173 | + of several packages. For example, a package may require either |
174 | + awk or mawk to build. In these cases, the package will be |
175 | + registered in the table as an rdepend for BOTH awk and mawk. |
176 | + |
177 | :rtype: bool |
178 | :returns: True if tables were rebuilt, False otherwise""" |
179 | self._provides_table = {} |
180 | self._rdepends_table = {} |
181 | for source_name, source in self.sources.items(): |
182 | - print(source_name, source) |
183 | - for build_dep_binary_name in source.build_dependencies.keys(): |
184 | - self._rdepends_table.setdefault(build_dep_binary_name, []) |
185 | - self._rdepends_table[build_dep_binary_name].append(source) |
186 | + for build_dep_binary_names in source.build_dependencies.keys(): |
187 | + # This needs to deal with two different kinds of keys. |
188 | + # Basic dependencies are just simple str's, while alternate |
189 | + # dependencies are modeled as tuples. |
190 | + # |
191 | + # So, convert simple str's into single-element lists, so |
192 | + # both cases can be handled via iteration in a for loop. |
193 | + if isinstance(build_dep_binary_names, str): |
194 | + build_dep_binary_names = [build_dep_binary_names] |
195 | + for build_dep_binary_name in build_dep_binary_names: |
196 | + self._rdepends_table.setdefault(build_dep_binary_name, []) |
197 | + self._rdepends_table[build_dep_binary_name].append(source) |
198 | |
199 | for provided_binary_name in source.provides_binaries.keys(): |
200 | self._provides_table[provided_binary_name] = source |
201 | @@ -124,7 +145,10 @@ class Suite: |
202 | """ |
203 | if '-' not in self.name: |
204 | return 'release' |
205 | - return self.name.split('-')[1] |
206 | + pocket = self.name.split('-')[1] |
207 | + if pocket not in DISTRO_UBUNTU_POCKETS: |
208 | + raise RuntimeError(f'Unrecognized pocket "{pocket}"') |
209 | + return pocket |
210 | |
211 | @property |
212 | def architectures(self) -> list[str]: |
213 | @@ -148,6 +172,7 @@ class Suite: |
214 | component |
215 | for component in os.listdir(self._cache_dir) |
216 | if os.path.isdir(os.path.join(self._cache_dir, component)) |
217 | + and component in DISTRO_UBUNTU_COMPONENTS |
218 | ] |
219 | if not components: |
220 | raise RuntimeError(f'Could not load components from {self._cache_dir}') |
221 | @@ -163,16 +188,22 @@ class Suite: |
222 | |
223 | :rtype: dict[str, SourcePackage] |
224 | """ |
225 | - sources = {} |
226 | - for comp in self.components: |
227 | - source_packages_dir = f'{self._cache_dir}/{comp}/source' |
228 | - with apt_pkg.TagFile(f'{source_packages_dir}/Sources.xz') as pkgs: |
229 | - for pkg in pkgs: |
230 | - name = pkg['Package'] |
231 | - sources[name] = SourcePackage(dict(pkg)) |
232 | - if not sources: |
233 | - raise RuntimeError(f'Could not load {source_packages_dir}/Sources.xz') |
234 | - return sources |
235 | + sources = None |
236 | + for sources_file in ['Sources.xz', 'Sources.gz']: |
237 | + for comp in self.components: |
238 | + source_packages_dir = f'{self._cache_dir}/{comp}/source' |
239 | + try: |
240 | + with apt_pkg.TagFile(f'{source_packages_dir}/{sources_file}') as pkgs: |
241 | + if sources is None: |
242 | + sources = {} |
243 | + for pkg in pkgs: |
244 | + name = pkg['Package'] |
245 | + sources[name] = SourcePackage(dict(pkg)) |
246 | + except apt_pkg.Error: |
247 | + pass |
248 | + if sources is not None: |
249 | + return sources |
250 | + raise RuntimeError(f'Could not load {source_packages_dir}/Sources.[xz|gz]') |
251 | |
252 | @property |
253 | @lru_cache |
254 | @@ -184,38 +215,45 @@ class Suite: |
255 | |
256 | :rtype: dict[str, BinaryPackage] |
257 | """ |
258 | - binaries = {} |
259 | - for comp in self.components: |
260 | - for arch in self.architectures: |
261 | - binary_packages_dir = f'{self._cache_dir}/{comp}/binary-{arch}' |
262 | - try: |
263 | - with apt_pkg.TagFile(f'{binary_packages_dir}/Packages.xz') as pkgs: |
264 | - for pkg in pkgs: |
265 | - name = f'{pkg["Package"]}:{arch}' |
266 | - binaries[name] = BinaryPackage(pkg) |
267 | - except apt_pkg.Error: |
268 | - # If an Apt repository is incomplete, such as if |
269 | - # a given architecture was not mirrored, still |
270 | - # note the binaries exist but mark their records |
271 | - # as missing. |
272 | - binaries[name] = None |
273 | - if not binaries: |
274 | - raise ValueError(f'Could not load {binary_packages_dir}/Packages.xz') |
275 | - return binaries |
276 | + binaries = None |
277 | + for packages_file in ["Packages.xz", "Packages.gz"]: |
278 | + for comp in self.components: |
279 | + for arch in self.architectures: |
280 | + binary_packages_dir = f'{self._cache_dir}/{comp}/binary-{arch}' |
281 | + try: |
282 | + with apt_pkg.TagFile(f'{binary_packages_dir}/{packages_file}') as pkgs: |
283 | + if binaries is None: |
284 | + binaries = {} |
285 | + for pkg in pkgs: |
286 | + name = f'{pkg["Package"]}:{arch}' |
287 | + binaries[name] = BinaryPackage(pkg) |
288 | + except apt_pkg.Error: |
289 | + pass |
290 | + if binaries is not None: |
291 | + return binaries |
292 | + raise ValueError(f'Could not load {binary_packages_dir}/Packages.[xz|gz]') |
293 | |
294 | def dependent_packages(self, source_package: SourcePackage) -> dict[str, SourcePackage]: |
295 | """Relevant packages to run autotests against for a given source package. |
296 | |
297 | - Calculates the collection of build and reverse dependencies for |
298 | - a given source package, that would be appropriate to re-run autopkgtests |
299 | + Calculates the collection of reverse dependencies for a given |
300 | + source package that would be appropriate to re-run autopkgtests |
301 | on, using the given @param source_package's name as a trigger. |
302 | |
303 | + For leaf packages (that nothing else depends on as a build |
304 | + requirement), this routine returns an empty dict. |
305 | + |
306 | + For packages that can serve as an alternative dependency of some |
307 | + packages, this will include all such packages as if they were |
308 | + hard dependencies. For example, when examining postgresql-12, this |
309 | + would include all packages dependent on any database. |
310 | + |
311 | :param str source_package_name: The archive name of the source package. |
312 | :rtype: dict[str, SourcePackage] |
313 | :returns: Collection of source packages, keyed by name. |
314 | """ |
315 | # Build the lookup table for provides and rdepends |
316 | - if not self._provides_table or not self._rdepends_table: |
317 | + if not self._rdepends_table: |
318 | if not self._rebuild_lookup_tables(): |
319 | raise RuntimeError("Could not regenerate provides/rdepends lookup tables") |
320 | |
321 | @@ -228,35 +266,26 @@ class Suite: |
322 | for rdep_source in rdeps: |
323 | dependencies[rdep_source.name] = rdep_source |
324 | |
325 | - # Get source packages that provide our build dependencies |
326 | - for build_dependency_name in source_package.build_dependencies.keys(): |
327 | - bdep_source = self._provides_table.get(build_dependency_name) |
328 | - if not bdep_source: |
329 | - raise RuntimeError(f'Could not get source object for bdep {build_dependency_name}') |
330 | - dependencies[bdep_source.name] = bdep_source |
331 | - |
332 | - if not dependencies: |
333 | - raise RuntimeError(f'Could not calculate dependencies for {source_package.name}') |
334 | return dependencies |
335 | |
336 | |
337 | if __name__ == '__main__': |
338 | # pylint: disable=invalid-name |
339 | import sys |
340 | - from .repository import Repository |
341 | - |
342 | from pprint import PrettyPrinter |
343 | pp = PrettyPrinter(indent=4) |
344 | |
345 | + from .repository import Repository |
346 | + from .debug import error |
347 | + |
348 | print('############################') |
349 | print('## Suite class smoke test ##') |
350 | print('############################') |
351 | print() |
352 | |
353 | - LOCAL_REPOSITORY_PATH = '/var/spool/apt-mirror/skel/archive.ubuntu.com/ubuntu' |
354 | local_dists_path = os.path.join(LOCAL_REPOSITORY_PATH, 'dists') |
355 | if not os.path.exists(local_dists_path): |
356 | - print('Error: Missing checkout') |
357 | + error(f'Missing checkout for suite smoketest\n{LOCAL_REPOSITORY_MIRRORING_DIRECTIONS}') |
358 | sys.exit(1) |
359 | |
360 | repository = Repository(cache_dir=local_dists_path) |
361 | diff --git a/ppa/trigger.py b/ppa/trigger.py |
362 | index 234e14a..077dcfc 100755 |
363 | --- a/ppa/trigger.py |
364 | +++ b/ppa/trigger.py |
365 | @@ -27,7 +27,7 @@ class Trigger: |
366 | package and/or architectures, but all such Triggers must be against |
367 | the same series as the Job itself. |
368 | """ |
369 | - def __init__(self, package, version, arch, series, ppa=None): |
370 | + def __init__(self, package, version, arch, series, ppa=None, test_package=None): |
371 | """Initializes a new Trigger for a given package and version. |
372 | |
373 | :param str package: The source package name. |
374 | @@ -35,12 +35,17 @@ class Trigger: |
375 | :param str arch: The architecture for the trigger. |
376 | :param str series: The distro release series codename. |
377 | :param Ppa ppa: (Optional) PPA wrapper object to run tests against. |
378 | + :param str test_package: The package to run autopkgtests from. |
379 | """ |
380 | self.package = package |
381 | self.version = version |
382 | self.arch = arch |
383 | self.series = series |
384 | self.ppa = ppa |
385 | + if test_package: |
386 | + self.test_package = test_package |
387 | + else: |
388 | + self.test_package = package |
389 | |
390 | def __repr__(self) -> str: |
391 | """Machine-parsable unique representation of object. |
392 | @@ -50,7 +55,8 @@ class Trigger: |
393 | """ |
394 | return (f'{self.__class__.__name__}(' |
395 | f'package={self.package!r}, version={self.version!r}, ' |
396 | - f'arch={self.arch!r}, series={self.series!r}, ppa={self.ppa!r})') |
397 | + f'arch={self.arch!r}, series={self.series!r}, ppa={self.ppa!r}, ' |
398 | + f'test_package={self.test_package!r})') |
399 | |
400 | def __str__(self) -> str: |
401 | """Human-readable summary of the object. |
402 | @@ -58,6 +64,8 @@ class Trigger: |
403 | :rtype: str |
404 | :returns: Printable summary of the object. |
405 | """ |
406 | + if self.test_package != self.package: |
407 | + return f"({self.test_package}) {self.package}/{self.version}" |
408 | return f"{self.package}/{self.version}" |
409 | |
410 | @property |
411 | @@ -85,7 +93,7 @@ class Trigger: |
412 | """ |
413 | params = [ |
414 | ("release", self.series), |
415 | - ("package", self.package), |
416 | + ("package", self.test_package), |
417 | ("arch", self.arch), |
418 | ] |
419 | |
420 | diff --git a/scripts/ppa b/scripts/ppa |
421 | index 9097004..7200c22 100755 |
422 | --- a/scripts/ppa |
423 | +++ b/scripts/ppa |
424 | @@ -70,15 +70,12 @@ from ppa.constants import ( |
425 | ARCHES_PPA_DEFAULT, |
426 | ARCHES_AUTOPKGTEST, |
427 | URL_AUTOPKGTEST, |
428 | + LOCAL_REPOSITORY_PATH, |
429 | + LOCAL_REPOSITORY_MIRRORING_DIRECTIONS, |
430 | ) |
431 | from ppa.dict import unpack_to_dict |
432 | from ppa.io import open_url |
433 | -from ppa.job import ( |
434 | - get_waiting, |
435 | - show_waiting, |
436 | - get_running, |
437 | - show_running |
438 | -) |
439 | +from ppa.job import show_waiting, show_running |
440 | from ppa.lp import Lp |
441 | from ppa.ppa import ( |
442 | get_ppa, |
443 | @@ -87,6 +84,7 @@ from ppa.ppa import ( |
444 | PpaDoesNotExist |
445 | ) |
446 | from ppa.ppa_group import PpaGroup, PpaAlreadyExists |
447 | +from ppa.repository import Repository |
448 | from ppa.result import get_results |
449 | from ppa.text import o2str, ansi_hyperlink |
450 | from ppa.trigger import Trigger |
451 | @@ -365,6 +363,10 @@ def create_arg_parser() -> argparse.ArgumentParser: |
452 | dest='show_urls', action='store_true', |
453 | default=False, |
454 | help="Display unformatted trigger action URLs") |
455 | + tests_parser.add_argument('--show-rdepends', |
456 | + dest='show_rdepends', action='store_true', |
457 | + default=False, |
458 | + help="Display test triggers for reverse dependencies") |
459 | |
460 | # Wait Command |
461 | wait_parser = subparser.add_parser( |
462 | @@ -772,6 +774,15 @@ def command_tests(lp, config): |
463 | if not lp: |
464 | return 1 |
465 | |
466 | + apt_repository = None |
467 | + if config.get("show_rdepends"): |
468 | + local_dists_path = os.path.join(LOCAL_REPOSITORY_PATH, "dists") |
469 | + try: |
470 | + apt_repository = Repository(cache_dir=local_dists_path) |
471 | + except FileNotFoundError as e: |
472 | + error(f'Missing checkout\n{LOCAL_REPOSITORY_MIRRORING_DIRECTIONS}') |
473 | + return 1 |
474 | + |
475 | releases = config.get('releases', None) |
476 | if releases is None: |
477 | udi = UbuntuDistroInfo() |
478 | @@ -792,8 +803,8 @@ def command_tests(lp, config): |
479 | # Triggers |
480 | print("* Triggers:") |
481 | for source_pub in the_ppa.get_source_publications(): |
482 | - series = source_pub.distro_series.name |
483 | - if series not in releases: |
484 | + series_codename = source_pub.distro_series.name |
485 | + if series_codename not in releases: |
486 | continue |
487 | pkg = source_pub.source_package_name |
488 | if packages and (pkg not in packages): |
489 | @@ -802,28 +813,64 @@ def command_tests(lp, config): |
490 | url = f"https://launchpad.net/ubuntu/+source/{pkg}/{ver}" |
491 | source_hyperlink = ansi_hyperlink(url, f"{pkg}/{ver}") |
492 | print(f" - Source {source_hyperlink}: {source_pub.status}") |
493 | - triggers = [Trigger(pkg, ver, arch, series, the_ppa) for arch in architectures] |
494 | + triggers = [Trigger(pkg, ver, arch, series_codename, the_ppa) for arch in architectures] |
495 | + |
496 | + rdepends = None |
497 | + if config.get("show_rdepends"): |
498 | + # Construct suite object from repository. |
499 | + # NOTE: If a package has been freshly added to 'proposed' it |
500 | + # will be missed since we consider only packages present |
501 | + # in the release pocket. |
502 | + suite = apt_repository.get_suite(series_codename, 'release') |
503 | + if not suite: |
504 | + raise RuntimeError(f'Could not find suite for "{series_codename}" in the local Apt cache') |
505 | + |
506 | + # Lookup rdepends for the package |
507 | + source_package = suite.sources.get(pkg) |
508 | + if not source_package: |
509 | + raise RuntimeError(f'Could not find source package "{pkg}" in the local Apt cache for "{suite}"') |
510 | + |
511 | + rdepends_source_package_names = suite.dependent_packages(source_package) |
512 | + for rdep_name in rdepends_source_package_names: |
513 | + rdep = suite.sources.get(rdep_name) |
514 | + if not rdep: |
515 | + raise RuntimeError(f'Undefined reverse dependency "{rdep_name}"') |
516 | + |
517 | + triggers.extend([ |
518 | + Trigger(pkg, ver, arch, series_codename, the_ppa, rdep.name) |
519 | + for arch |
520 | + in architectures |
521 | + ]) |
522 | |
523 | if config.get("show_urls"): |
524 | for trigger in triggers: |
525 | - print(f" + {trigger.arch}: {trigger.action_url}♻️ ") |
526 | + title = '' |
527 | + if config.get('show_rdepends'): |
528 | + title = trigger.test_package |
529 | + print(f" + {title}@{trigger.arch}: {trigger.action_url}♻️ ") |
530 | for trigger in triggers: |
531 | - print(f" + {trigger.arch}: {trigger.action_url}💍") |
532 | + title = '' |
533 | + if config.get('show_rdepends'): |
534 | + title = trigger.test_package |
535 | + print(f" + {trigger.package}@{trigger.arch}: {trigger.action_url}💍") |
536 | |
537 | else: |
538 | for trigger in triggers: |
539 | pad = ' ' * (1 + abs(len('ppc64el') - len(trigger.arch))) |
540 | + title = '' |
541 | + if config.get('show_rdepends'): |
542 | + title = trigger.test_package |
543 | + |
544 | basic_trig = ansi_hyperlink( |
545 | - trigger.action_url, f"Trigger basic @{trigger.arch}♻️ " |
546 | + trigger.action_url, f"Trigger basic {title}@{trigger.arch}♻️ " |
547 | ) |
548 | all_proposed_trig = ansi_hyperlink( |
549 | trigger.action_url + "&all-proposed=1", |
550 | - f"Trigger all-proposed @{trigger.arch}💍" |
551 | + f"Trigger all-proposed {title}@{trigger.arch}💍" |
552 | ) |
553 | print(f" + " + pad.join([basic_trig, all_proposed_trig])) |
554 | |
555 | # Results |
556 | - print() |
557 | print("* Results:") |
558 | for release in releases: |
559 | base_results_fmt = f"{URL_AUTOPKGTEST}/results/autopkgtest-%s-%s-%s/" |
560 | @@ -850,18 +897,14 @@ def command_tests(lp, config): |
561 | trigger_sets[trigger] += f" • {subtest}\n" |
562 | |
563 | for trigger, result in trigger_sets.items(): |
564 | - print(f" - {trigger}\n{result}") |
565 | + print(f" - {trigger}\n{result.rstrip()}") |
566 | |
567 | # Running Queue |
568 | - response = open_url(f"{URL_AUTOPKGTEST}/static/running.json", "running autopkgtests") |
569 | - if response: |
570 | - show_running(sorted(get_running(response, releases=releases, ppa=str(the_ppa)), |
571 | - key=lambda k: str(k.submit_time))) |
572 | + show_running(sorted(the_ppa.get_autopkgtest_running(releases), |
573 | + key=lambda k: str(k.submit_time))) |
574 | |
575 | # Waiting Queue |
576 | - response = open_url(f"{URL_AUTOPKGTEST}/queues.json", "waiting autopkgtests") |
577 | - if response: |
578 | - show_waiting(get_waiting(response, releases=releases, ppa=str(the_ppa))) |
579 | + show_waiting(the_ppa.get_autopkgtest_waiting(releases)) |
580 | |
581 | return os.EX_OK |
582 | except KeyboardInterrupt: |
583 | diff --git a/tests/helpers.py b/tests/helpers.py |
584 | index 2329125..3a0d2d4 100644 |
585 | --- a/tests/helpers.py |
586 | +++ b/tests/helpers.py |
587 | @@ -19,6 +19,19 @@ from ppa.constants import ARCHES_PPA_DEFAULT |
588 | from ppa.ppa_group import PpaAlreadyExists |
589 | |
590 | |
591 | +class SeriesMock: |
592 | + def __init__(self, name): |
593 | + self.name = name |
594 | + |
595 | + |
596 | +class PublicationMock: |
597 | + def __init__(self, name, version, status, series): |
598 | + self.source_package_name = name |
599 | + self.source_package_version = version |
600 | + self.status = status |
601 | + self.distro_series = SeriesMock(series) |
602 | + |
603 | + |
604 | class ProcessorMock: |
605 | """A stand-in for a Launchpad Processor object.""" |
606 | def __init__(self, name): |
607 | @@ -32,10 +45,14 @@ class ArchiveMock: |
608 | self.description = description |
609 | self.publish = True |
610 | self.processors = [ProcessorMock(proc_name) for proc_name in ARCHES_PPA_DEFAULT] |
611 | + self.published_sources = [] |
612 | |
613 | def setProcessors(self, processors): |
614 | self.processors = [ProcessorMock(proc.split('/')[-1]) for proc in processors] |
615 | |
616 | + def getPublishedSources(self): |
617 | + return self.published_sources |
618 | + |
619 | def lp_save(self): |
620 | return True |
621 | |
622 | diff --git a/tests/test_scripts_ppa.py b/tests/test_scripts_ppa.py |
623 | index a107f92..c4dca3d 100644 |
624 | --- a/tests/test_scripts_ppa.py |
625 | +++ b/tests/test_scripts_ppa.py |
626 | @@ -18,16 +18,21 @@ import types |
627 | import importlib.machinery |
628 | import argparse |
629 | import pytest |
630 | +from mock import patch |
631 | |
632 | SCRIPT_NAME = "ppa" |
633 | BASE_PATH = os.path.realpath(os.path.join(os.path.dirname(os.path.realpath(__file__)), "..")) |
634 | sys.path.insert(0, BASE_PATH) |
635 | |
636 | +from ppa.ppa import Ppa |
637 | from ppa.constants import ( |
638 | ARCHES_PPA_ALL, |
639 | ARCHES_PPA_DEFAULT |
640 | ) |
641 | -from tests.helpers import LpServiceMock |
642 | +from tests.helpers import ( |
643 | + LpServiceMock, |
644 | + PublicationMock |
645 | +) |
646 | |
647 | if '.pybuild' in BASE_PATH: |
648 | python_version = '.'.join([str(v) for v in sys.version_info[0:2]]) |
649 | @@ -363,6 +368,14 @@ def test_create_arg_parser_tests(): |
650 | assert args.show_urls is True |
651 | args.show_urls = None |
652 | |
653 | + # Check --show-rdepends |
654 | + args = parser.parse_args([command, 'tests-ppa']) |
655 | + assert args.show_rdepends is False |
656 | + args.show_rdepends = None |
657 | + args = parser.parse_args([command, 'tests-ppa', '--show-rdepends']) |
658 | + assert args.show_rdepends is True |
659 | + args.show_rdepends = None |
660 | + |
661 | |
662 | @pytest.mark.parametrize('command_line_options, expected_config', [ |
663 | pytest.param([], {}), |
664 | @@ -578,6 +591,42 @@ def test_command_status(fake_config): |
665 | # TODO: Capture stdout and compare with expected |
666 | |
667 | |
668 | +@pytest.mark.parametrize('params, format, expected_in_stdout', [ |
669 | + ( |
670 | + { |
671 | + 'releases': None, |
672 | + 'architectures': 'amd64', |
673 | + 'show_urls': True |
674 | + }, |
675 | + 'plain', |
676 | + 'arch=amd64&trigger=x%2F1&ppa=me%2Ftesting' |
677 | + ), |
678 | +]) |
679 | +@patch('urllib.request.urlopen') |
680 | +@patch('ppa.io.open_url') |
681 | +def test_command_tests(urlopen_mock, |
682 | + open_url_mock, |
683 | + fake_config, capfd, params, format, expected_in_stdout): |
684 | + '''Checks that the tests command retrieves and displays correct results.''' |
685 | + lp = LpServiceMock() |
686 | + urlopen_mock.return_value = "{}" |
687 | + Ppa.get_autopkgtest_running = lambda x, y: [] |
688 | + Ppa.get_autopkgtest_waiting = lambda x, y: [] |
689 | + |
690 | + # Create a default PPA, for modification later |
691 | + team = lp.people[fake_config['team_name']] |
692 | + team.createPPA(fake_config['ppa_name'], 'x', 'y') |
693 | + the_ppa = lp.me.getPPAByName(fake_config['ppa_name']) |
694 | + |
695 | + # Add some fake publications |
696 | + the_ppa.published_sources.append(PublicationMock('x', '1', 'Published', 'jammy')) |
697 | + |
698 | + config = {**fake_config, **params} |
699 | + assert script.command_tests(lp, config) == 0 |
700 | + out, err = capfd.readouterr() |
701 | + assert expected_in_stdout in out |
702 | + |
703 | + |
704 | @pytest.mark.xfail(reason="Unimplemented") |
705 | def test_command_wait(fake_config): |
706 | # TODO: Set wait period to 1 sec |
707 | diff --git a/tests/test_suite.py b/tests/test_suite.py |
708 | index 0dd430d..c6580b1 100644 |
709 | --- a/tests/test_suite.py |
710 | +++ b/tests/test_suite.py |
711 | @@ -27,8 +27,8 @@ from ppa.binary_package import BinaryPackage |
712 | |
713 | |
714 | @pytest.mark.parametrize('suite_name, cache_dir, expected_repr, expected_str', [ |
715 | - ('x', 'x', "Suite(suite_name='x', cache_dir='x')", "x"), |
716 | - ('a-1', 'x', "Suite(suite_name='a-1', cache_dir='x')", "a-1"), |
717 | + ('x', '/tmp', "Suite(suite_name='x', cache_dir='/tmp')", "x"), |
718 | + ('a-1', '/tmp', "Suite(suite_name='a-1', cache_dir='/tmp')", "a-1"), |
719 | ('b-2', '/tmp', "Suite(suite_name='b-2', cache_dir='/tmp')", "b-2"), |
720 | ]) |
721 | def test_object(suite_name, cache_dir, expected_repr, expected_str): |
722 | @@ -43,6 +43,7 @@ def test_object(suite_name, cache_dir, expected_repr, expected_str): |
723 | @pytest.mark.parametrize('suite_name, cache_dir, expected_exception', [ |
724 | ('x', '', ValueError), |
725 | ('', 'x', ValueError), |
726 | + ('x', 'x', FileNotFoundError), |
727 | ('', '', ValueError), |
728 | ('a-1', None, ValueError), |
729 | (None, 'x', ValueError), |
730 | @@ -127,7 +128,7 @@ def test_series_codename(monkeypatch, info, expected_series_codename): |
731 | |
732 | @pytest.mark.parametrize('info, expected_pocket', [ |
733 | ({'Suite': 'x'}, 'release'), |
734 | - ({'Suite': 'x-y'}, 'y'), |
735 | + ({'Suite': 'x-backports'}, 'backports'), |
736 | ({'Suite': 'lunar'}, 'release'), |
737 | ({'Suite': 'lunar-proposed'}, 'proposed'), |
738 | ({'Suite': 'focal-security'}, 'security'), |
739 | @@ -159,9 +160,9 @@ def test_architectures(monkeypatch, info, expected_architectures): |
740 | |
741 | |
742 | @pytest.mark.parametrize('suite_name, component_paths, expected_components', [ |
743 | - ('a-1', ['a-1/x', 'a-1/y', 'a-1/z'], ['x', 'y', 'z']), |
744 | - ('a-1', ['a-1/x', 'b-1/y', 'c-1/z'], ['x']), |
745 | - ('a-1', ['a-1/x', 'a-1/x/y', 'c-1/z/x'], ['x']), |
746 | + ('a-1', ['a-1/main', 'a-1/universe', 'a-1/multiverse'], ['main', 'universe', 'multiverse']), |
747 | + ('a-1', ['a-1/main', 'b-1/universe', 'c-1/multiverse'], ['main']), |
748 | + ('a-1', ['a-1/main', 'a-1/main/y', 'c-1/multiverse/x'], ['main']), |
749 | ('x', ['x/main', 'x/restricted', 'x/universe', 'x/multiverse'], |
750 | ['main', 'restricted', 'universe', 'multiverse']), |
751 | ]) |
752 | @@ -553,7 +554,7 @@ def test_rebuild_tables_mapping(monkeypatch): |
753 | {'Package': 'd', 'Version': 'x', 'Build-Depends': 'a1, b1, c2', 'Binary': 'd1'}, |
754 | ], |
755 | 'd', |
756 | - ['a', 'b', 'c'], |
757 | + ['a'], |
758 | ), |
759 | ]) |
760 | def test_dependent_packages(monkeypatch, sources_dict, source_package_name, expected_packages): |
761 | diff --git a/tests/test_trigger.py b/tests/test_trigger.py |
762 | index c2b1702..e2a2350 100644 |
763 | --- a/tests/test_trigger.py |
764 | +++ b/tests/test_trigger.py |
765 | @@ -27,10 +27,16 @@ def test_object(): |
766 | assert trigger |
767 | |
768 | |
769 | -def test_repr(): |
770 | +@pytest.mark.parametrize('pkg, ver, arch, series, ppa, testpkg, expected_repr', [ |
771 | + ('a', 'b', 'c', 'd', 'e', 'f', |
772 | + "Trigger(package='a', version='b', arch='c', series='d', ppa='e', test_package='f')"), |
773 | + ('a', 'b', 'c', 'd', 'e', None, |
774 | + "Trigger(package='a', version='b', arch='c', series='d', ppa='e', test_package='a')"), |
775 | +]) |
776 | +def test_repr(pkg, ver, arch, series, ppa, testpkg, expected_repr): |
777 | """Checks Trigger object representation.""" |
778 | - trigger = Trigger('a', 'b', 'c', 'd', 'e') |
779 | - assert repr(trigger) == "Trigger(package='a', version='b', arch='c', series='d', ppa='e')" |
780 | + trigger = Trigger(pkg, ver, arch, series, ppa, testpkg) |
781 | + assert repr(trigger) == expected_repr |
782 | |
783 | |
784 | def test_str(): |
785 | @@ -72,6 +78,12 @@ def test_history_url(trigger, expected): |
786 | ), ( |
787 | Trigger('nut', '2.7.4-1', 'armhf', 'jammy', None), |
788 | "/request.cgi?release=jammy&package=nut&arch=armhf&trigger=nut%2F2.7.4-1" |
789 | + ), ( |
790 | + Trigger('apache2', '2.4', 'amd64', 'kinetic', 'ppa:aaa/bbb'), |
791 | + "/request.cgi?release=kinetic&package=apache2&arch=amd64&trigger=apache2%2F2.4&ppa=ppa%3Aaaa%2Fbbb" |
792 | + ), ( |
793 | + Trigger('apache2', '2.4', 'amd64', 'kinetic', 'ppa:aaa/bbb', 'cinder'), |
794 | + "/request.cgi?release=kinetic&package=cinder&arch=amd64&trigger=apache2%2F2.4&ppa=ppa%3Aaaa%2Fbbb" |
795 | ) |
796 | ]) |
797 | def test_action_url(trigger, expected): |
Example output from running the command:

$ ./scripts/ppa tests --show-rdepends --architectures amd64 https://launchpad.net/~bryce/+archive/ubuntu/apache2-merge-v2.4.54-3

* Triggers:
  - Source apache2/2.4.54-3ubuntu1~lunar1: Published
    + Trigger basic apache2@amd64♻️ Trigger all-proposed apache2@amd64💍
    + Trigger basic libembperl-perl@amd64♻️ Trigger all-proposed libembperl-perl@amd64💍
    + Trigger basic passenger@amd64♻️ Trigger all-proposed passenger@amd64💍
    + Trigger basic libapache2-mod-perl2@amd64♻️ Trigger all-proposed libapache2-mod-perl2@amd64💍
    + Trigger basic libsoup2.4@amd64♻️ Trigger all-proposed libsoup2.4@amd64💍
    + Trigger basic libsoup3@amd64♻️ Trigger all-proposed libsoup3@amd64💍
    + Trigger basic gnome-user-share@amd64♻️ Trigger all-proposed gnome-user-share@amd64💍
    + Trigger basic libapache2-mod-auth-gssapi@amd64♻️ Trigger all-proposed libapache2-mod-auth-gssapi@amd64💍
    + Trigger basic libapache2-mod-authn-sasl@amd64♻️ Trigger all-proposed libapache2-mod-authn-sasl@amd64💍
    + Trigger basic mod-gnutls@amd64♻️ Trigger all-proposed mod-gnutls@amd64💍
    + Trigger basic golang-github-containers-toolbox@amd64♻️ Trigger all-proposed golang-github-containers-toolbox@amd64💍
    + Trigger basic mathjax-siunitx@amd64♻️ Trigger all-proposed mathjax-siunitx@amd64💍
    + Trigger basic pycsw@amd64♻️ Trigger all-proposed pycsw@amd64💍
    + Trigger basic apache-upload-progress-module@amd64♻️ Trigger all-proposed apache-upload-progress-module@amd64💍
    + Trigger basic apache2-mod-xforward@amd64♻️ Trigger all-proposed apache2-mod-xforward@amd64💍
    + Trigger basic dehydrated@amd64♻️ Trigger all-proposed dehydrated@amd64💍
    + Trigger basic dogtag-pki@amd64♻️ Trigger all-proposed dogtag-pki@amd64💍
    + Trigger basic dump1090-mutability@amd64♻️ Trigger all-proposed dump1090-mutability@amd64💍
    + Trigger basic emboss-explorer@amd64♻️ Trigger all-proposed emboss-explorer@amd64💍
    + Trigger basic freeboard@amd64♻️ Trigger all-proposed freeboard@amd64💍
    + Trigger basic gbrowse@amd64♻️ Trigger all-proposed gbrowse@amd64💍
    + Trigger basic gnocchi@amd64♻️ Trigger all-proposed gnocchi@amd64💍
    + Trigger basic gridsite@amd64♻️ Trigger all-proposed gridsite@amd64💍
    + Trigger basic homer-api@amd64♻️ Trigger all-proposed homer-api@amd64💍
    + Trigger basic jsmath@amd64♻️ Trigger all-proposed jsmath@amd64💍
    + Trigger basic libapache-authenhook-perl@amd64♻️ Trigger all-proposed libapache-authenhook-perl@amd64💍
    + Trigger basic libapache-mod-auth-kerb@amd64♻️ Trigger all-proposed libapache-mod-auth-kerb@amd64💍
    + Trigger basic libapache-mod-encoding@amd64♻️ Trigger all-proposed libapache-mod-encoding@amd64💍
    + Trigger basic libapache-mod-evasive@amd64♻️ Trigger all-proposed libapache-mod-evasive@amd64💍
    + Trigger basic libapache-mod-jk@amd64♻️ Trigger all-proposed libapache-mod-jk@amd64💍
    + Trigger basic libapache-mod-log-sql@amd64♻️ Trigger all-proposed libapache-mod-log-sql@amd64💍
    + Trigger basic libapache-mod-musicindex@amd64♻️ Trigger all-proposed libapache-mod-musicindex@amd64💍
    + Trigge...
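The trigger URLs behind these hyperlinks follow the request pattern exercised in tests/test_trigger.py: the `trigger=` parameter always names the PPA package under test, while `package=` names the package whose autopkgtests run (the package itself, or a reverse dependency when --show-rdepends is used). A minimal sketch of that URL assembly — a hypothetical standalone helper, not the project's actual Trigger class:

```python
from urllib.parse import urlencode

def build_trigger_url(package, version, arch, series, ppa=None, test_package=None):
    """Assemble an autopkgtest request URL in the form shown in the tests above.

    `test_package` defaults to the trigger package itself; with
    --show-rdepends it would instead be a reverse dependency's name.
    """
    params = {
        'release': series,
        'package': test_package or package,
        'arch': arch,
        'trigger': f'{package}/{version}',  # urlencode escapes '/' as %2F
    }
    if ppa:
        params['ppa'] = ppa  # e.g. 'ppa:aaa/bbb' -> 'ppa%3Aaaa%2Fbbb'
    return '/request.cgi?' + urlencode(params)

print(build_trigger_url('apache2', '2.4', 'amd64', 'kinetic', 'ppa:aaa/bbb', 'cinder'))
# → /request.cgi?release=kinetic&package=cinder&arch=amd64&trigger=apache2%2F2.4&ppa=ppa%3Aaaa%2Fbbb
```

The output matches the `test_action_url` expectations in the diff, which is the behavior --show-rdepends relies on: one trigger per (rdepend, architecture) pair, all pointing back at the same source package in the PPA.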