Merge ppa-dev-tools:add_rdepends_argument into ppa-dev-tools:main

Proposed by Bryce Harrington
Status: Merged
Merge reported by: Bryce Harrington
Merged at revision: 565c3c6342b9f5c57808b7d527daeb1b555070c2
Proposed branch: ppa-dev-tools:add_rdepends_argument
Merge into: ppa-dev-tools:main
Diff against target: 797 lines (+318/-87)
11 files modified
NEWS.md (+16/-0)
ppa/constants.py (+17/-0)
ppa/ppa.py (+32/-0)
ppa/repository.py (+9/-2)
ppa/suite.py (+78/-49)
ppa/trigger.py (+11/-3)
scripts/ppa (+65/-22)
tests/helpers.py (+17/-0)
tests/test_scripts_ppa.py (+50/-1)
tests/test_suite.py (+8/-7)
tests/test_trigger.py (+15/-3)
Reviewers:
  Sergio Durigan Junior (community): Approve
  Canonical Server: Pending
  Canonical Server Reporter: Pending
Review via email: mp+441377@code.launchpad.net

Description of the change

This adds the --show-rdepends feature to the tests command, making it list trigger URLs for the reverse dependencies of the package(s) in a given PPA. It relies on a local mirror of the apt repository being stored on your system at /tmp/ubuntu, which can be created like this:

  $ mkdir /tmp/ubuntu
  $ rsync -va \
          --exclude={'*/installer-*','*/i18n/*','*/uefi/*','*/Contents*','*/by-hash/*','*tar.gz'} \
          rsync://archive.ubuntu.com/ubuntu/dists /tmp/ubuntu

This takes a few minutes to run and generates a sparse mirror (i.e. you can't use it for other apt needs). A full mirror (via apt-mirror, or rsync without all the excludes) would also work just fine.
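As a quick sanity check that the sparse mirror contains the index files the tool reads, a small helper like the following can walk the tree. This is an illustrative sketch, not part of the branch; it assumes the dists/<suite>/<component>/source/ layout that ppa/suite.py reads, and the `find_sources_indexes` name is hypothetical:

```python
import os

def find_sources_indexes(dists_dir: str) -> list[str]:
    """Walk a dists-only mirror and list the Sources index files found,
    using the dists/<suite>/<component>/source/ layout the tool reads."""
    found = []
    for suite in sorted(os.listdir(dists_dir)):
        suite_dir = os.path.join(dists_dir, suite)
        if not os.path.isdir(suite_dir):
            continue
        for component in sorted(os.listdir(suite_dir)):
            # The tool prefers Sources.xz and falls back to Sources.gz
            for name in ('Sources.xz', 'Sources.gz'):
                index = os.path.join(suite_dir, component, 'source', name)
                if os.path.isfile(index):
                    found.append(index)
    return found

# e.g. find_sources_indexes('/tmp/ubuntu/dists')
```

If this returns an empty list, the rsync above probably didn't complete or the path is wrong.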

(The future plan is to integrate "lazy mirroring" into the tool itself, so the above steps would become unnecessary, but for this branch's initial implementation of the feature the mirror is required.)

With the mirror in place, running the command just requires adding the option, for example:

  $ ./scripts/ppa tests --show-rdepends --architectures amd64 https://launchpad.net/~bryce/+archive/ubuntu/apache2-merge-v2.4.54-3

Testing is done as usual:

  $ make check
  $ pytest-3
  $ python3 -m ppa.repository
  $ python3 -m ppa.suite
  $ python3 -m ppa.source_package
  $ python3 -m ppa.binary_package
  $ python3 -m ppa.trigger

Revision history for this message
Bryce Harrington (bryce) wrote :

Example output from running the command:

  $ ./scripts/ppa tests --show-rdepends --architectures amd64 https://launchpad.net/~bryce/+archive/ubuntu/apache2-merge-v2.4.54-3
* Triggers:
  - Source apache2/2.4.54-3ubuntu1~lunar1: Published
    + Trigger basic apache2@amd64♻️ Trigger all-proposed apache2@amd64💍
    + Trigger basic libembperl-perl@amd64♻️ Trigger all-proposed libembperl-perl@amd64💍
    + Trigger basic passenger@amd64♻️ Trigger all-proposed passenger@amd64💍
    + Trigger basic libapache2-mod-perl2@amd64♻️ Trigger all-proposed libapache2-mod-perl2@amd64💍
    + Trigger basic libsoup2.4@amd64♻️ Trigger all-proposed libsoup2.4@amd64💍
    + Trigger basic libsoup3@amd64♻️ Trigger all-proposed libsoup3@amd64💍
    + Trigger basic gnome-user-share@amd64♻️ Trigger all-proposed gnome-user-share@amd64💍
    + Trigger basic libapache2-mod-auth-gssapi@amd64♻️ Trigger all-proposed libapache2-mod-auth-gssapi@amd64💍
    + Trigger basic libapache2-mod-authn-sasl@amd64♻️ Trigger all-proposed libapache2-mod-authn-sasl@amd64💍
    + Trigger basic mod-gnutls@amd64♻️ Trigger all-proposed mod-gnutls@amd64💍
    + Trigger basic golang-github-containers-toolbox@amd64♻️ Trigger all-proposed golang-github-containers-toolbox@amd64💍
    + Trigger basic mathjax-siunitx@amd64♻️ Trigger all-proposed mathjax-siunitx@amd64💍
    + Trigger basic pycsw@amd64♻️ Trigger all-proposed pycsw@amd64💍
    + Trigger basic apache-upload-progress-module@amd64♻️ Trigger all-proposed apache-upload-progress-module@amd64💍
    + Trigger basic apache2-mod-xforward@amd64♻️ Trigger all-proposed apache2-mod-xforward@amd64💍
    + Trigger basic dehydrated@amd64♻️ Trigger all-proposed dehydrated@amd64💍
    + Trigger basic dogtag-pki@amd64♻️ Trigger all-proposed dogtag-pki@amd64💍
    + Trigger basic dump1090-mutability@amd64♻️ Trigger all-proposed dump1090-mutability@amd64💍
    + Trigger basic emboss-explorer@amd64♻️ Trigger all-proposed emboss-explorer@amd64💍
    + Trigger basic freeboard@amd64♻️ Trigger all-proposed freeboard@amd64💍
    + Trigger basic gbrowse@amd64♻️ Trigger all-proposed gbrowse@amd64💍
    + Trigger basic gnocchi@amd64♻️ Trigger all-proposed gnocchi@amd64💍
    + Trigger basic gridsite@amd64♻️ Trigger all-proposed gridsite@amd64💍
    + Trigger basic homer-api@amd64♻️ Trigger all-proposed homer-api@amd64💍
    + Trigger basic jsmath@amd64♻️ Trigger all-proposed jsmath@amd64💍
    + Trigger basic libapache-authenhook-perl@amd64♻️ Trigger all-proposed libapache-authenhook-perl@amd64💍
    + Trigger basic libapache-mod-auth-kerb@amd64♻️ Trigger all-proposed libapache-mod-auth-kerb@amd64💍
    + Trigger basic libapache-mod-encoding@amd64♻️ Trigger all-proposed libapache-mod-encoding@amd64💍
    + Trigger basic libapache-mod-evasive@amd64♻️ Trigger all-proposed libapache-mod-evasive@amd64💍
    + Trigger basic libapache-mod-jk@amd64♻️ Trigger all-proposed libapache-mod-jk@amd64💍
    + Trigger basic libapache-mod-log-sql@amd64♻️ Trigger all-proposed libapache-mod-log-sql@amd64💍
    + Trigger basic libapache-mod-musicindex@amd64♻️ Trigger all-proposed libapache-mod-musicindex@amd64💍
    + Trigge...

Revision history for this message
Bryce Harrington (bryce) wrote :

Again, after tests have finished, and with --show-urls added:

$ ./scripts/ppa tests --show-urls --show-rdepends --architectures amd64 https://launchpad.net/~bryce/+archive/ubuntu/apache2-merge-v2.4.54-3
* Triggers:
  - Source apache2/2.4.54-3ubuntu1~lunar1: Published
    + apache2@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=apache2&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + libembperl-perl@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=libembperl-perl&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + passenger@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=passenger&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + libapache2-mod-perl2@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=libapache2-mod-perl2&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + libsoup2.4@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=libsoup2.4&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + libsoup3@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=libsoup3&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + gnome-user-share@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=gnome-user-share&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + libapache2-mod-auth-gssapi@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=libapache2-mod-auth-gssapi&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + libapache2-mod-authn-sasl@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=libapache2-mod-authn-sasl&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + mod-gnutls@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=mod-gnutls&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + golang-github-containers-toolbox@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=golang-github-containers-toolbox&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + mathjax-siunitx@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=mathjax-siunitx&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + pycsw@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=pycsw&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + apache-upload-progress-module@amd64: https://autopkgtest.ubuntu.com/request.cgi?release=lunar&package=apache-upload-progress-module&arch=amd64&trigger=apache2%2F2.4.54-3ubuntu1~lunar1&ppa=bryce%2Fapache2-merge-v2.4.54-3♻️
    + apache2-mod-xforward@amd64: https://autopkgtest.ubuntu.com/request.cgi?releas...
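The request URLs above are regular enough to compose mechanically. Here is a hedged sketch of how one could be built; the real construction lives in ppa/trigger.py, and the `trigger_url` helper below is illustrative, not the tool's API:

```python
from urllib.parse import urlencode

URL_AUTOPKGTEST = "https://autopkgtest.ubuntu.com"

def trigger_url(release, package, arch, trigger_source, trigger_version, ppa):
    """Compose an autopkgtest request.cgi URL for one reverse dependency.

    trigger_source/trigger_version identify the package in the PPA whose
    upload should trigger the run; package is the rdepend under test.
    """
    params = {
        'release': release,
        'package': package,
        'arch': arch,
        # urlencode quotes the '/' separator as %2F, as in the output above
        'trigger': f'{trigger_source}/{trigger_version}',
        'ppa': ppa,
    }
    return f"{URL_AUTOPKGTEST}/request.cgi?{urlencode(params)}"
```

For example, `trigger_url('lunar', 'libembperl-perl', 'amd64', 'apache2', '2.4.54-3ubuntu1~lunar1', 'bryce/apache2-merge-v2.4.54-3')` reproduces the second URL shown above.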

Revision history for this message
Sergio Durigan Junior (sergiodj) wrote :

I looked very briefly and have a few comments, but I'll take a better look tomorrow.

Revision history for this message
Sergio Durigan Junior (sergiodj) wrote :

First of all, thank you very much for implementing this feature. It is indeed one of the most useful parts of bileto, and I'd love to be able to use it from inside my Emacs instead of having to open a browser ;-).

Without getting into the merits of the implementation itself, I'd like to discuss the approach you've taken to grab a copy of the archive metadata needed for the rdeps processing. I don't mind having to run rsync manually if needed, but I'm wondering if there are better, even simpler options out there.

For example, I know that Ubuntu Wire maintains http://qa.ubuntuwire.org/rdepends/ , which is itself used by the "reverse-depends" tool. The service is good, but unfortunately it only outputs the *binary* packages, so some massaging is needed in order to obtain the source packages (which is what we ultimately need for the autopkgtest URLs).
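For reference, the binary-to-source massaging mentioned here is mostly mechanical given Packages index stanzas: by Debian convention, a missing Source field means the source name equals the binary name, and a present Source field may carry a version in parentheses that must be stripped. A minimal sketch (not code from this branch; `bin2src` here is a hypothetical helper, distinct from chdist's script of the same name):

```python
def bin2src(stanzas):
    """Map binary package names to source package names, given Packages
    index stanzas as dicts of field name -> value."""
    mapping = {}
    for stanza in stanzas:
        # "Source: apache2 (2.4.54-3)" -> "apache2"; absent field -> Package
        source = stanza.get('Source', stanza['Package']).split(' ')[0]
        mapping[stanza['Package']] = source
    return mapping
```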

Another option I've used in the past (and implemented here: https://git.launchpad.net/~ubuntu-server/+git/ubuntu-helpers/tree/sergiodj/list-revdep-autopkgtest-links.sh) is to rely on chdist to pull the archive data for you. In my case, I also relied on chdist's "bin2src" script to do the conversion for me, but it seems like this is something you already have covered in your script. This chdist invocation could even be done inside a TemporaryDirectory so that you guarantee that no metadata will be left behind after the program's execution (although it might be interesting if the user could choose to keep the data for future invocations; anyway, implementation details).

The main point here is that I believe this metadata fetching could be further automated, which would be even more awesome. WDYT?

review: Needs Information
Revision history for this message
Bryce Harrington (bryce) wrote :

Hi Sergio, thanks for reviewing.

Indeed, this is one of the next items on the todo list after this MP is landed. This branch should be considered the proof-of-concept for the basic functionality, to be followed with some convenience improvements and optimizations.

I don't know if you remember, but over the last few weeks I've researched some caching and mirroring options and tools. In particular I was looking at caching at the urllib/HTTP level in Python modules, though I did also look at apt-mirror and other Debian tools. I found one option for the former that I want to explore more, but the latter options looked like overkill and dependency-heavy. I didn't look at chdist though, so I'll take a peek just in case. The basic idea I have for a solution is to integrate "lazy mirroring": instead of directly loading Sources.xz and Packages.xz from disk, if what's on disk is older than N hours, the tool automatically re-downloads and re-caches the file (and ONLY the requested file) on the fly. From my testing so far, on a decent network connection that should be quite performant.
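The lazy-mirroring idea described here could look roughly like the following. This is a minimal sketch under stated assumptions (plain HTTP fetch, mtime-based staleness); the `fetch_cached` name and signature are hypothetical, not part of this branch:

```python
import os
import time
import urllib.request

def fetch_cached(url: str, cache_path: str, max_age_hours: float = 24) -> str:
    """Return a local copy of url, re-downloading only when the cached
    file is missing or older than max_age_hours.  Only the requested
    file is fetched, so the mirror fills in lazily on demand."""
    if os.path.exists(cache_path):
        age_seconds = time.time() - os.path.getmtime(cache_path)
        if age_seconds < max_age_hours * 3600:
            return cache_path  # fresh enough; no network access
    os.makedirs(os.path.dirname(cache_path), exist_ok=True)
    urllib.request.urlretrieve(url, cache_path)  # stale or missing
    return cache_path
```

The same staleness check would apply per index file, so a first run over one suite pulls only that suite's Sources/Packages indexes.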

I was hoping to slip that feature work in before Prague (and have at least started sketching it out on another branch), but I'm realizing there are some caveats I need to account for, and time is getting short, so I decided to just go with the (reliable) rsync for now and make this a next-release enhancement. There are a couple of other convenience improvements I plan to make too, and maybe other folks will also have ideas to add in.


Anyway, yes, I agree more automation of the metadata fetching would indeed be awesome, and I plan to do it, though it may need to wait for the next release.

Revision history for this message
Sergio Durigan Junior (sergiodj) wrote :

Thanks for the detailed explanation, Bryce.

Yes, absolutely no problem to leave this further automation for the next release. I'm glad that we're on the same page on this topic!

I'm leaving a few cosmetic comments below, but otherwise the code looks good to me, so I'm approving the MP. But I have a few other general comments.

- I set up a container in order to fully test the package (including installing it), and I stumbled upon a few problems with missing requirements from the requirements-dev.txt file.

- When I tried running the "tests --show-rdepends" command against a public PPA, it asked me to log into LP. This shouldn't really be necessary, unless dealing with private PPAs and such. It's not something really important to address right now (I understand that you have other things on your plate), but it's something to keep in mind IMHO.

Thanks again for working on this.

review: Approve
Revision history for this message
Bryce Harrington (bryce) wrote :

> Thanks for the detailed explanation, Bryce.
>
> Yes, absolutely no problem to leave this further automation for the next
> release. I'm glad that we're on the same page on this topic!
>
> I'm leaving a few cosmetic comments below, but otherwise the code looks good
> to me, so I'm approving the MP. But I have a few other general comments.
>
> - I set up a container in order to fully test the package (including
> installing it), and I stumbled upon a few problems with missing requirements
> from the requirements-dev.txt file.

If you can shoot me either the steps you followed or the changed requirements, it'd be appreciated as I prepare the release. I posted an MP the other day with some packaging work that includes adding a missing dependency or two, but there could be more. I also found that running tox on the codebase scared up some requirements issues that I will try to look at, if not for this release then for a future one. Getting tox to pass without issue would be very nice, and would probably take care of the issues you hit here.

> - When I tried running the "tests --show-rdepends" command against a public
> PPA, it asked me to log into LP. This shouldn't really be necessary, unless
> dealing with private PPAs and such. It's not something really important to
> address right now (I understand that you have other things on your plate), but
> it's something to keep in mind IMHO.

Noted, I'll put this on the list to work on later.

I know exactly what the problem is: when constructing the Launchpad service you specify what level of permissions you need, and I just ask for read/write privs across the board, regardless of what the individual command actually needs. Fixing it so that commands only request the priv level they need will take a bit of plumbing; however, as an outgrowth of some of the experimentation I've done with qastaging, I'm realizing I need a more sophisticated system for selecting Launchpad service API versions, permissions, and so on.

> Thanks again for working on this.

Revision history for this message
Bryce Harrington (bryce) wrote :

Enumerating objects: 86, done.
Counting objects: 100% (86/86), done.
Delta compression using up to 12 threads
Compressing objects: 100% (67/67), done.
Writing objects: 100% (71/71), 13.37 KiB | 4.46 MiB/s, done.
Total 71 (delta 54), reused 0 (delta 0), pack-reused 0
To git+ssh://git.launchpad.net/ppa-dev-tools
   d71ef8d..daf0adc main -> main

Preview Diff

diff --git a/NEWS.md b/NEWS.md
index f40bac0..32948bd 100644
--- a/NEWS.md
+++ b/NEWS.md
@@ -1,3 +1,19 @@
+# Unreleased #
+
+Reverse dependencies, build dependencies, and installation dependencies
+can be identified for a given source package using cached APT
+information. This list of packages will be used to generate lists of
+autopkgtest triggers, which when run should help identify issues that
+could get flagged in Britney2 runs. While similar to functionality
+provided by Bileto+Britney2, it is a lighterweight facsimile which
+doesn't handle special cases so should not be considered an equivalent,
+just as a preliminary screen to catch basic issues.
+
+The location of the cache can be set in a config file or via a cli
+argument, but by default will store under the user's ~/.config/
+directory.
+
+
 # 0.3.0 Release #
 
 Autopkgtest trigger action URLs are printed for packages in the PPA when
diff --git a/ppa/constants.py b/ppa/constants.py
index 4270392..2d37539 100644
--- a/ppa/constants.py
+++ b/ppa/constants.py
@@ -19,3 +19,20 @@ ARCHES_AUTOPKGTEST = ["amd64", "arm64", "armhf", "i386", "ppc64el", "s390x"]
 
 URL_LPAPI = "https://api.launchpad.net/devel"
 URL_AUTOPKGTEST = "https://autopkgtest.ubuntu.com"
+
+DISTRO_UBUNTU_COMPONENTS = ['main', 'restricted', 'universe', 'multiverse']
+
+DISTRO_UBUNTU_POCKETS = ['release', 'security', 'proposed', 'updates', 'backports']
+DISTRO_UBUNTU_POCKETS_UPDATES = ['release', 'security', 'updates']
+
+LOCAL_REPOSITORY_PATH = "/tmp/ubuntu"
+LOCAL_REPOSITORY_MIRRORING_DIRECTIONS = f"""
+Tip: You can generate (and refresh) a dists-only mirror thusly:")
+    $ mkdir {LOCAL_REPOSITORY_PATH}
+    $ rsync -va \\
+          --exclude={{'*/installer*','*/i18n/*','*/uefi/*','*/Contents*','*/by-hash/*','*tar.gz'}} \\
+          rsync://archive.ubuntu.com/ubuntu/dists {LOCAL_REPOSITORY_PATH}
+
+It's recommended to run the rsync command as a cronjob to keep your
+repository up to date as often as desired.
+"""
diff --git a/ppa/ppa.py b/ppa/ppa.py
index 1536b7d..aa4fd7c 100755
--- a/ppa/ppa.py
+++ b/ppa/ppa.py
@@ -16,6 +16,10 @@ import sys
 from functools import lru_cache
 from lazr.restfulclient.errors import BadRequest, NotFound
 
+from .constants import URL_AUTOPKGTEST
+from .io import open_url
+from .job import (get_waiting, get_running)
+
 
 class PpaDoesNotExist(BaseException):
     """Exception indicating a requested PPA could not be found."""
@@ -376,6 +380,34 @@ class Ppa:
         print("Successfully published all builds for all architectures")
         return retval
 
+    def get_autopkgtest_waiting(self, releases):
+        """Returns iterator of queued autopkgtests for this PPA.
+
+        See get_waiting() for details
+
+        :param list[str] releases: The Ubuntu series codename(s), or None.
+        :rtype: Iterator[Job]
+        :returns: Currently waiting jobs, if any, or an empty list on error
+        """
+        response = open_url(f"{URL_AUTOPKGTEST}/queues.json", "waiting autopkgtests")
+        if response:
+            return get_waiting(response, releases=releases, ppa=str(self))
+        return []
+
+    def get_autopkgtest_running(self, releases):
+        """Returns iterator of queued autopkgtests for this PPA.
+
+        See get_running() for details
+
+        :param list[str] releases: The Ubuntu series codename(s), or None.
+        :rtype: Iterator[Job]
+        :returns: Currently running jobs, if any, or an empty list on error
+        """
+        response = open_url(f"{URL_AUTOPKGTEST}/static/running.json", "running autopkgtests")
+        if response:
+            return get_running(response, releases=releases, ppa=str(self))
+        return []
+
 
 def ppa_address_split(ppa_address, default_team=None):
     """Parse an address for a PPA into its team and name components.
diff --git a/ppa/repository.py b/ppa/repository.py
index ccd5211..f71b06c 100644
--- a/ppa/repository.py
+++ b/ppa/repository.py
@@ -15,6 +15,10 @@ import os.path
 from functools import lru_cache
 
 from .suite import Suite
+from .constants import (
+    LOCAL_REPOSITORY_PATH,
+    LOCAL_REPOSITORY_MIRRORING_DIRECTIONS
+)
 
 
 class Repository:
@@ -31,6 +35,8 @@ class Repository:
         """
         if not cache_dir:
             raise ValueError("undefined cache_dir.")
+        if not os.path.exists(cache_dir):
+            raise FileNotFoundError(f"could not find cache dir '{cache_dir}'")
 
         self.cache_dir = cache_dir
 
@@ -82,14 +88,15 @@ if __name__ == "__main__":
     from pprint import PrettyPrinter
     pp = PrettyPrinter(indent=4)
 
+    from .debug import error
+
     print('#########################')
     print('## PpaGroup smoke test ##')
     print('#########################')
 
-    LOCAL_REPOSITORY_PATH = "/var/spool/apt-mirror/skel/archive.ubuntu.com/ubuntu"
     local_dists_path = os.path.join(LOCAL_REPOSITORY_PATH, "dists")
     if not os.path.exists(local_dists_path):
-        print("Error: Missing checkout")
+        error(f'Missing checkout for smoketest\n{LOCAL_REPOSITORY_MIRRORING_DIRECTIONS}')
         sys.exit(1)
     repository = Repository(cache_dir=local_dists_path)
     for suite in repository.suites.values():
diff --git a/ppa/suite.py b/ppa/suite.py
index cbded7a..6e00006 100644
--- a/ppa/suite.py
+++ b/ppa/suite.py
@@ -19,6 +19,12 @@ import apt_pkg
 
 from .source_package import SourcePackage
 from .binary_package import BinaryPackage
+from .constants import (
+    DISTRO_UBUNTU_COMPONENTS,
+    DISTRO_UBUNTU_POCKETS,
+    LOCAL_REPOSITORY_PATH,
+    LOCAL_REPOSITORY_MIRRORING_DIRECTIONS
+)
 
 
 class Suite:
@@ -40,6 +46,8 @@ class Suite:
             raise ValueError('undefined suite_name.')
         if not cache_dir:
             raise ValueError('undefined cache_dir.')
+        if not os.path.exists(cache_dir):
+            raise FileNotFoundError(f"could not find cache dir '{cache_dir}'")
 
         self._suite_name = suite_name
         self._cache_dir = cache_dir
@@ -69,15 +77,28 @@ class Suite:
     def _rebuild_lookup_tables(self) -> bool:
         """Regenerates the provides and rdepends lookup tables.
 
+        Some packages have build dependence that can be satisfied by one
+        of several packages. For example, a package may require either
+        awk or mawk to build. In these cases, the package will be
+        registered in the table as an rdepend for BOTH awk and mawk.
+
        :rtype: bool
         :returns: True if tables were rebuilt, False otherwise"""
         self._provides_table = {}
         self._rdepends_table = {}
         for source_name, source in self.sources.items():
-            print(source_name, source)
-            for build_dep_binary_name in source.build_dependencies.keys():
-                self._rdepends_table.setdefault(build_dep_binary_name, [])
-                self._rdepends_table[build_dep_binary_name].append(source)
+            for build_dep_binary_names in source.build_dependencies.keys():
+                # This needs to deal with two different kinds of keys.
+                # Basic dependencies are just simple str's, while alternate
+                # dependencies are modeled as tuples.
+                #
+                # So, convert simple str's into single-element lists, so
+                # both cases can be handled via iteration in a for loop.
+                if isinstance(build_dep_binary_names, str):
+                    build_dep_binary_names = [build_dep_binary_names]
+                for build_dep_binary_name in build_dep_binary_names:
+                    self._rdepends_table.setdefault(build_dep_binary_name, [])
+                    self._rdepends_table[build_dep_binary_name].append(source)
 
             for provided_binary_name in source.provides_binaries.keys():
                 self._provides_table[provided_binary_name] = source
@@ -124,7 +145,10 @@ class Suite:
         """
         if '-' not in self.name:
             return 'release'
-        return self.name.split('-')[1]
+        pocket = self.name.split('-')[1]
+        if pocket not in DISTRO_UBUNTU_POCKETS:
+            raise RuntimeError(f'Unrecognized pocket "{pocket}"')
+        return pocket
 
     @property
     def architectures(self) -> list[str]:
@@ -148,6 +172,7 @@ class Suite:
             component
             for component in os.listdir(self._cache_dir)
             if os.path.isdir(os.path.join(self._cache_dir, component))
+            and component in DISTRO_UBUNTU_COMPONENTS
         ]
         if not components:
             raise RuntimeError(f'Could not load components from {self._cache_dir}')
@@ -163,16 +188,22 @@ class Suite:
 
         :rtype: dict[str, SourcePackage]
         """
-        sources = {}
-        for comp in self.components:
-            source_packages_dir = f'{self._cache_dir}/{comp}/source'
-            with apt_pkg.TagFile(f'{source_packages_dir}/Sources.xz') as pkgs:
-                for pkg in pkgs:
-                    name = pkg['Package']
-                    sources[name] = SourcePackage(dict(pkg))
-        if not sources:
-            raise RuntimeError(f'Could not load {source_packages_dir}/Sources.xz')
-        return sources
+        sources = None
+        for sources_file in ['Sources.xz', 'Sources.gz']:
+            for comp in self.components:
+                source_packages_dir = f'{self._cache_dir}/{comp}/source'
+                try:
+                    with apt_pkg.TagFile(f'{source_packages_dir}/{sources_file}') as pkgs:
+                        if sources is None:
+                            sources = {}
+                        for pkg in pkgs:
+                            name = pkg['Package']
+                            sources[name] = SourcePackage(dict(pkg))
+                except apt_pkg.Error:
+                    pass
+            if sources is not None:
+                return sources
+        raise RuntimeError(f'Could not load {source_packages_dir}/Sources.[xz|gz]')
 
     @property
     @lru_cache
@@ -184,38 +215,45 @@ class Suite:
 
         :rtype: dict[str, BinaryPackage]
         """
-        binaries = {}
-        for comp in self.components:
-            for arch in self.architectures:
-                binary_packages_dir = f'{self._cache_dir}/{comp}/binary-{arch}'
-                try:
-                    with apt_pkg.TagFile(f'{binary_packages_dir}/Packages.xz') as pkgs:
-                        for pkg in pkgs:
-                            name = f'{pkg["Package"]}:{arch}'
-                            binaries[name] = BinaryPackage(pkg)
-                except apt_pkg.Error:
-                    # If an Apt repository is incomplete, such as if
-                    # a given architecture was not mirrored, still
-                    # note the binaries exist but mark their records
-                    # as missing.
-                    binaries[name] = None
-        if not binaries:
-            raise ValueError(f'Could not load {binary_packages_dir}/Packages.xz')
-        return binaries
+        binaries = None
+        for packages_file in ["Packages.xz", "Packages.gz"]:
+            for comp in self.components:
+                for arch in self.architectures:
+                    binary_packages_dir = f'{self._cache_dir}/{comp}/binary-{arch}'
+                    try:
+                        with apt_pkg.TagFile(f'{binary_packages_dir}/{packages_file}') as pkgs:
+                            if binaries is None:
+                                binaries = {}
+                            for pkg in pkgs:
+                                name = f'{pkg["Package"]}:{arch}'
+                                binaries[name] = BinaryPackage(pkg)
+                    except apt_pkg.Error:
+                        pass
+            if binaries is not None:
+                return binaries
+        raise ValueError(f'Could not load {binary_packages_dir}/Packages.[xz|gz]')
 
     def dependent_packages(self, source_package: SourcePackage) -> dict[str, SourcePackage]:
         """Relevant packages to run autotests against for a given source package.
 
-        Calculates the collection of build and reverse dependencies for
-        a given source package, that would be appropriate to re-run autopkgtests
-        on, using the given @param source_package's name as a trigger.
+        Calculates the collection of reverse dependencies for a given
+        source package that would be appropriate to re-run autopkgtests
+        on, using the given @param source_package's name as a trigger.
 
+        For leaf packages (that nothing else depends on as a build
+        requirement), this routine returns an empty dict.
+
+        For packages that can serve as an alternative dependency of some
+        packages, this will include all such packages as if they were
+        hard dependencies. For example, when examining postgresql-12, this
+        would include all packages dependent on any database.
+
         :param str source_package_name: The archive name of the source package.
         :rtype: dict[str, SourcePackage]
         :returns: Collection of source packages, keyed by name.
         """
         # Build the lookup table for provides and rdepends
-        if not self._provides_table or not self._rdepends_table:
+        if not self._rdepends_table:
             if not self._rebuild_lookup_tables():
                 raise RuntimeError("Could not regenerate provides/rdepends lookup tables")
 
@@ -228,35 +266,26 @@ class Suite:
         for rdep_source in rdeps:
             dependencies[rdep_source.name] = rdep_source
 
-        # Get source packages that provide our build dependencies
-        for build_dependency_name in source_package.build_dependencies.keys():
-            bdep_source = self._provides_table.get(build_dependency_name)
-            if not bdep_source:
-                raise RuntimeError(f'Could not get source object for bdep {build_dependency_name}')
-            dependencies[bdep_source.name] = bdep_source
-
-        if not dependencies:
-            raise RuntimeError(f'Could not calculate dependencies for {source_package.name}')
         return dependencies
 
 
 if __name__ == '__main__':
     # pylint: disable=invalid-name
     import sys
-    from .repository import Repository
-
     from pprint import PrettyPrinter
     pp = PrettyPrinter(indent=4)
 
+    from .repository import Repository
+    from .debug import error
+
     print('############################')
     print('## Suite class smoke test ##')
     print('############################')
     print()
 
-    LOCAL_REPOSITORY_PATH = '/var/spool/apt-mirror/skel/archive.ubuntu.com/ubuntu'
     local_dists_path = os.path.join(LOCAL_REPOSITORY_PATH, 'dists')
     if not os.path.exists(local_dists_path):
-        print('Error: Missing checkout')
+        error(f'Missing checkout for suite smoketest\n{LOCAL_REPOSITORY_MIRRORING_DIRECTIONS}')
         sys.exit(1)
 
     repository = Repository(cache_dir=local_dists_path)
diff --git a/ppa/trigger.py b/ppa/trigger.py
index 234e14a..077dcfc 100755
--- a/ppa/trigger.py
+++ b/ppa/trigger.py
@@ -27,7 +27,7 @@ class Trigger:
     package and/or architectures, but all such Triggers must be against
     the same series as the Job itself.
     """
-    def __init__(self, package, version, arch, series, ppa=None):
+    def __init__(self, package, version, arch, series, ppa=None, test_package=None):
         """Initializes a new Trigger for a given package and version.
 
         :param str package: The source package name.
@@ -35,12 +35,17 @@
         :param str arch: The architecture for the trigger.
         :param str series: The distro release series codename.
         :param Ppa ppa: (Optional) PPA wrapper object to run tests against.
+        :param str test_package: The package to run autopkgtests from.
         """
         self.package = package
         self.version = version
         self.arch = arch
         self.series = series
         self.ppa = ppa
+        if test_package:
+            self.test_package = test_package
+        else:
+            self.test_package = package
 
     def __repr__(self) -> str:
         """Machine-parsable unique representation of object.
@@ -50,7 +55,8 @@
         """
         return (f'{self.__class__.__name__}('
                 f'package={self.package!r}, version={self.version!r}, '
-                f'arch={self.arch!r}, series={self.series!r}, ppa={self.ppa!r})')
+                f'arch={self.arch!r}, series={self.series!r}, ppa={self.ppa!r}, '
+                f'test_package={self.test_package!r})')
 
     def __str__(self) -> str:
         """Human-readable summary of the object.
@@ -58,6 +64,8 @@
         :rtype: str
         :returns: Printable summary of the object.
         """
+        if self.test_package != self.package:
+            return f"({self.test_package}) {self.package}/{self.version}"
         return f"{self.package}/{self.version}"
 
     @property
@@ -85,7 +93,7 @@
         """
         params = [
            ("release", self.series),
-            ("package", self.package),
+            ("package", self.test_package),
            ("arch", self.arch),
        ]
 
diff --git a/scripts/ppa b/scripts/ppa
index 9097004..7200c22 100755
--- a/scripts/ppa
+++ b/scripts/ppa
@@ -70,15 +70,12 @@ from ppa.constants import (
     ARCHES_PPA_DEFAULT,
     ARCHES_AUTOPKGTEST,
     URL_AUTOPKGTEST,
+    LOCAL_REPOSITORY_PATH,
+    LOCAL_REPOSITORY_MIRRORING_DIRECTIONS,
 )
 from ppa.dict import unpack_to_dict
 from ppa.io import open_url
-from ppa.job import (
-    get_waiting,
-    show_waiting,
-    get_running,
-    show_running
-)
+from ppa.job import show_waiting, show_running
 from ppa.lp import Lp
 from ppa.ppa import (
     get_ppa,
@@ -87,6 +84,7 @@ from ppa.ppa import (
     PpaDoesNotExist
 )
 from ppa.ppa_group import PpaGroup, PpaAlreadyExists
+from ppa.repository import Repository
 from ppa.result import get_results
 from ppa.text import o2str, ansi_hyperlink
 from ppa.trigger import Trigger
@@ -365,6 +363,10 @@ def create_arg_parser() -> argparse.ArgumentParser:
                               dest='show_urls', action='store_true',
                               default=False,
                               help="Display unformatted trigger action URLs")
+    tests_parser.add_argument('--show-rdepends',
+                              dest='show_rdepends', action='store_true',
+                              default=False,
+                              help="Display test triggers for reverse dependencies")
 
     # Wait Command
     wait_parser = subparser.add_parser(
@@ -772,6 +774,15 @@ def command_tests(lp, config):
     if not lp:
         return 1
 
+    apt_repository = None
+    if config.get("show_rdepends"):
+        local_dists_path = os.path.join(LOCAL_REPOSITORY_PATH, "dists")
+        try:
+            apt_repository = Repository(cache_dir=local_dists_path)
+        except FileNotFoundError as e:
+            error(f'Missing checkout\n{LOCAL_REPOSITORY_MIRRORING_DIRECTIONS}')
+            return 1
+
     releases = config.get('releases', None)
     if releases is None:
         udi = UbuntuDistroInfo()
@@ -792,8 +803,8 @@
     # Triggers
     print("* Triggers:")
     for source_pub in the_ppa.get_source_publications():
-        series = source_pub.distro_series.name
-        if series not in releases:
+        series_codename = source_pub.distro_series.name
+        if series_codename not in releases:
             continue
         pkg = source_pub.source_package_name
         if packages and (pkg not in packages):
@@ -802,28 +813,64 @@
         url = f"https://launchpad.net/ubuntu/+source/{pkg}/{ver}"
         source_hyperlink = ansi_hyperlink(url, f"{pkg}/{ver}")
         print(f" - Source {source_hyperlink}: {source_pub.status}")
-        triggers = [Trigger(pkg, ver, arch, series, the_ppa) for arch in architectures]
+        triggers = [Trigger(pkg, ver, arch, series_codename, the_ppa) for arch in architectures]
+
+        rdepends = None
+        if config.get("show_rdepends"):
+            # Construct suite object from repository.
+            # NOTE: If a package has been freshly added to 'proposed' it
+            #   will be missed since we consider only packages present
+            #   in the release pocket.
+            suite = apt_repository.get_suite(series_codename, 'release')
+            if not suite:
+                raise RuntimeError(f'Could not find suite for "{series_codename}" in the local Apt cache')
+
+            # Lookup rdepends for the package
+            source_package = suite.sources.get(pkg)
+            if not source_package:
+                raise RuntimeError(f'Could not find source package "{pkg}" in the local Apt cache for "{suite}"')
+
+            rdepends_source_package_names = suite.dependent_packages(source_package)
+            for rdep_name in rdepends_source_package_names:
+                rdep = suite.sources.get(rdep_name)
+                if not rdep:
+                    raise RuntimeError(f'Undefined reverse dependency "{rdep_name}"')
+
+                triggers.extend([
+                    Trigger(pkg, ver, arch, series_codename, the_ppa, rdep.name)
+                    for arch
+                    in architectures
+                ])
 
         if config.get("show_urls"):
             for trigger in triggers:
-                print(f" + {trigger.arch}: {trigger.action_url}♻️ ")
+                title = ''
+                if config.get('show_rdepends'):
+                    title = trigger.test_package
+                print(f" + {title}@{trigger.arch}: {trigger.action_url}♻️ ")
             for trigger in triggers:
-                print(f" + {trigger.arch}: {trigger.action_url}💍")
+                title = ''
+                if config.get('show_rdepends'):
+                    title = trigger.test_package
+                print(f" + {trigger.package}@{trigger.arch}: {trigger.action_url}💍")
 
         else:
             for trigger in triggers:
                 pad = ' ' * (1 + abs(len('ppc64el') - len(trigger.arch)))
+                title = ''
+                if config.get('show_rdepends'):
+                    title = trigger.test_package
+
                 basic_trig = ansi_hyperlink(
-                    trigger.action_url, f"Trigger basic @{trigger.arch}♻️ "
+                    trigger.action_url, f"Trigger basic {title}@{trigger.arch}♻️ "
                 )
                 all_proposed_trig = ansi_hyperlink(
                     trigger.action_url + "&all-proposed=1",
-                    f"Trigger all-proposed @{trigger.arch}💍"
+                    f"Trigger all-proposed {title}@{trigger.arch}💍"
                 )
                 print(f" + " + pad.join([basic_trig, all_proposed_trig]))
 
     # Results
-    print()
     print("* Results:")
     for release in releases:
         base_results_fmt = f"{URL_AUTOPKGTEST}/results/autopkgtest-%s-%s-%s/"
@@ -850,18 +897,14 @@
                     trigger_sets[trigger] += f" • {subtest}\n"
 
             for trigger, result in trigger_sets.items():
-                print(f" - {trigger}\n{result}")
+                print(f" - {trigger}\n{result.rstrip()}")
 
         # Running Queue
-        response = open_url(f"{URL_AUTOPKGTEST}/static/running.json", "running autopkgtests")
-        if response:
-            show_running(sorted(get_running(response, releases=releases, ppa=str(the_ppa)),
-                                key=lambda k: str(k.submit_time)))
+        show_running(sorted(the_ppa.get_autopkgtest_running(releases),
+                            key=lambda k: str(k.submit_time)))
 
         # Waiting Queue
-        response = open_url(f"{URL_AUTOPKGTEST}/queues.json", "waiting autopkgtests")
-        if response:
-            show_waiting(get_waiting(response, releases=releases, ppa=str(the_ppa)))
+        show_waiting(the_ppa.get_autopkgtest_waiting(releases))
 
         return os.EX_OK
     except KeyboardInterrupt:
diff --git a/tests/helpers.py b/tests/helpers.py
index 2329125..3a0d2d4 100644
--- a/tests/helpers.py
+++ b/tests/helpers.py
@@ -19,6 +19,19 @@ from ppa.constants import ARCHES_PPA_DEFAULT
 from ppa.ppa_group import PpaAlreadyExists
 
 
+class SeriesMock:
+    def __init__(self, name):
+        self.name = name
+
+
+class PublicationMock:
+    def __init__(self, name, version, status, series):
+        self.source_package_name = name
+        self.source_package_version = version
+        self.status = status
+        self.distro_series = SeriesMock(series)
+
+
 class ProcessorMock:
     """A stand-in for a Launchpad Processor object."""
     def __init__(self, name):
@@ -32,10 +45,14 @@ class ArchiveMock:
         self.description = description
         self.publish = True
         self.processors = [ProcessorMock(proc_name) for proc_name in ARCHES_PPA_DEFAULT]
+        self.published_sources = []
 
     def setProcessors(self, processors):
         self.processors = [ProcessorMock(proc.split('/')[-1]) for proc in processors]
 
+    def getPublishedSources(self):
+        return self.published_sources
+
     def lp_save(self):
         return True
 
diff --git a/tests/test_scripts_ppa.py b/tests/test_scripts_ppa.py
index a107f92..c4dca3d 100644
--- a/tests/test_scripts_ppa.py
+++ b/tests/test_scripts_ppa.py
@@ -18,16 +18,21 @@ import types
 import importlib.machinery
 import argparse
 import pytest
+from mock import patch
 
 SCRIPT_NAME = "ppa"
 BASE_PATH = os.path.realpath(os.path.join(os.path.dirname(os.path.realpath(__file__)), ".."))
 sys.path.insert(0, BASE_PATH)
 
+from ppa.ppa import Ppa
 from ppa.constants import (
     ARCHES_PPA_ALL,
     ARCHES_PPA_DEFAULT
 )
-from tests.helpers import LpServiceMock
+from tests.helpers import (
+    LpServiceMock,
+    PublicationMock
+)
 
 if '.pybuild' in BASE_PATH:
     python_version = '.'.join([str(v) for v in sys.version_info[0:2]])
@@ -363,6 +368,14 @@ def test_create_arg_parser_tests():
     assert args.show_urls is True
     args.show_urls = None
 
+    # Check --show-rdepends
+    args = parser.parse_args([command, 'tests-ppa'])
+    assert args.show_rdepends is False
+    args.show_rdepends = None
+    args = parser.parse_args([command, 'tests-ppa', '--show-rdepends'])
+    assert args.show_rdepends is True
+    args.show_rdepends = None
+
 
 @pytest.mark.parametrize('command_line_options, expected_config', [
     pytest.param([], {}),
@@ -578,6 +591,42 @@ def test_command_status(fake_config):
     # TODO: Capture stdout and compare with expected
 
 
+@pytest.mark.parametrize('params, format, expected_in_stdout', [
+    (
+        {
+            'releases': None,
+            'architectures': 'amd64',
+            'show_urls': True
+        },
+        'plain',
+        'arch=amd64&trigger=x%2F1&ppa=me%2Ftesting'
+    ),
+])
+@patch('urllib.request.urlopen')
+@patch('ppa.io.open_url')
+def test_command_tests(urlopen_mock,
+                       open_url_mock,
+                       fake_config, capfd, params, format, expected_in_stdout):
+    '''Checks that the tests command retrieves and displays correct results.'''
+    lp = LpServiceMock()
+    urlopen_mock.return_value = "{}"
+    Ppa.get_autopkgtest_running = lambda x, y: []
+    Ppa.get_autopkgtest_waiting = lambda x, y: []
+
+    # Create a default PPA, for modification later
+    team = lp.people[fake_config['team_name']]
+    team.createPPA(fake_config['ppa_name'], 'x', 'y')
+    the_ppa = lp.me.getPPAByName(fake_config['ppa_name'])
+
+    # Add some fake publications
+    the_ppa.published_sources.append(PublicationMock('x', '1', 'Published', 'jammy'))
+
+    config = {**fake_config, **params}
+    assert script.command_tests(lp, config) == 0
+    out, err = capfd.readouterr()
+    assert expected_in_stdout in out
+
+
 @pytest.mark.xfail(reason="Unimplemented")
 def test_command_wait(fake_config):
     # TODO: Set wait period to 1 sec
diff --git a/tests/test_suite.py b/tests/test_suite.py
index 0dd430d..c6580b1 100644
--- a/tests/test_suite.py
+++ b/tests/test_suite.py
@@ -27,8 +27,8 @@ from ppa.binary_package import BinaryPackage
 
 
 @pytest.mark.parametrize('suite_name, cache_dir, expected_repr, expected_str', [
-    ('x', 'x', "Suite(suite_name='x', cache_dir='x')", "x"),
-    ('a-1', 'x', "Suite(suite_name='a-1', cache_dir='x')", "a-1"),
+    ('x', '/tmp', "Suite(suite_name='x', cache_dir='/tmp')", "x"),
+    ('a-1', '/tmp', "Suite(suite_name='a-1', cache_dir='/tmp')", "a-1"),
     ('b-2', '/tmp', "Suite(suite_name='b-2', cache_dir='/tmp')", "b-2"),
 ])
 def test_object(suite_name, cache_dir, expected_repr, expected_str):
@@ -43,6 +43,7 @@ def test_object(suite_name, cache_dir, expected_repr, expected_str):
 @pytest.mark.parametrize('suite_name, cache_dir, expected_exception', [
     ('x', '', ValueError),
     ('', 'x', ValueError),
+    ('x', 'x', FileNotFoundError),
     ('', '', ValueError),
     ('a-1', None, ValueError),
     (None, 'x', ValueError),
@@ -127,7 +128,7 @@ def test_series_codename(monkeypatch, info, expected_series_codename):
 
 @pytest.mark.parametrize('info, expected_pocket', [
     ({'Suite': 'x'}, 'release'),
-    ({'Suite': 'x-y'}, 'y'),
+    ({'Suite': 'x-backports'}, 'backports'),
     ({'Suite': 'lunar'}, 'release'),
     ({'Suite': 'lunar-proposed'}, 'proposed'),
     ({'Suite': 'focal-security'}, 'security'),
@@ -159,9 +160,9 @@ def test_architectures(monkeypatch, info, expected_architectures):
 
 
 @pytest.mark.parametrize('suite_name, component_paths, expected_components', [
-    ('a-1', ['a-1/x', 'a-1/y', 'a-1/z'], ['x', 'y', 'z']),
-    ('a-1', ['a-1/x', 'b-1/y', 'c-1/z'], ['x']),
-    ('a-1', ['a-1/x', 'a-1/x/y', 'c-1/z/x'], ['x']),
+    ('a-1', ['a-1/main', 'a-1/universe', 'a-1/multiverse'], ['main', 'universe', 'multiverse']),
+    ('a-1', ['a-1/main', 'b-1/universe', 'c-1/multiverse'], ['main']),
+    ('a-1', ['a-1/main', 'a-1/main/y', 'c-1/multiverse/x'], ['main']),
     ('x', ['x/main', 'x/restricted', 'x/universe', 'x/multiverse'],
      ['main', 'restricted', 'universe', 'multiverse']),
 ])
@@ -553,7 +554,7 @@ def test_rebuild_tables_mapping(monkeypatch):
         {'Package': 'd', 'Version': 'x', 'Build-Depends': 'a1, b1, c2', 'Binary': 'd1'},
     ],
     'd',
-    ['a', 'b', 'c'],
+    ['a'],
 ),
 ])
 def test_dependent_packages(monkeypatch, sources_dict, source_package_name, expected_packages):
diff --git a/tests/test_trigger.py b/tests/test_trigger.py
index c2b1702..e2a2350 100644
--- a/tests/test_trigger.py
+++ b/tests/test_trigger.py
@@ -27,10 +27,16 @@ def test_object():
     assert trigger
 
 
-def test_repr():
+@pytest.mark.parametrize('pkg, ver, arch, series, ppa, testpkg, expected_repr', [
+    ('a', 'b', 'c', 'd', 'e', 'f',
+     "Trigger(package='a', version='b', arch='c', series='d', ppa='e', test_package='f')"),
+    ('a', 'b', 'c', 'd', 'e', None,
+     "Trigger(package='a', version='b', arch='c', series='d', ppa='e', test_package='a')"),
+])
+def test_repr(pkg, ver, arch, series, ppa, testpkg, expected_repr):
     """Checks Trigger object representation."""
-    trigger = Trigger('a', 'b', 'c', 'd', 'e')
-    assert repr(trigger) == "Trigger(package='a', version='b', arch='c', series='d', ppa='e')"
+    trigger = Trigger(pkg, ver, arch, series, ppa, testpkg)
+    assert repr(trigger) == expected_repr
 
 
 def test_str():
@@ -72,6 +78,12 @@ def test_history_url(trigger, expected):
     ), (
         Trigger('nut', '2.7.4-1', 'armhf', 'jammy', None),
         "/request.cgi?release=jammy&package=nut&arch=armhf&trigger=nut%2F2.7.4-1"
+    ), (
+        Trigger('apache2', '2.4', 'amd64', 'kinetic', 'ppa:aaa/bbb'),
+        "/request.cgi?release=kinetic&package=apache2&arch=amd64&trigger=apache2%2F2.4&ppa=ppa%3Aaaa%2Fbbb"
+    ), (
+        Trigger('apache2', '2.4', 'amd64', 'kinetic', 'ppa:aaa/bbb', 'cinder'),
+        "/request.cgi?release=kinetic&package=cinder&arch=amd64&trigger=apache2%2F2.4&ppa=ppa%3Aaaa%2Fbbb"
     )
 ])
 def test_action_url(trigger, expected):
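The heart of this change is visible in the `test_action_url` expectations above: a reverse-dependency trigger runs the autopkgtest *of* the rdepends package (`package=cinder`) while still being triggered *by* the PPA upload (`trigger=apache2/2.4`). The following standalone sketch mirrors that query-string construction; the helper name `action_url` and the parameter ordering are assumptions inferred from the diff's test expectations, not the project's actual `Trigger.action_url` implementation.

```python
from urllib.parse import urlencode

# Assumed base URL, matching URL_AUTOPKGTEST in ppa/constants.py.
URL_AUTOPKGTEST = "https://autopkgtest.ubuntu.com"


def action_url(package, version, arch, series, ppa=None, test_package=None):
    """Hypothetical mirror of the request.cgi URL built by Trigger.

    With --show-rdepends, test_package names the reverse dependency whose
    autopkgtest should run; the trigger stays pinned to the PPA package.
    """
    test_package = test_package or package
    params = [
        ("release", series),
        ("package", test_package),            # which package's tests run
        ("arch", arch),
        ("trigger", f"{package}/{version}"),  # which upload provoked the run
    ]
    if ppa:
        params.append(("ppa", ppa))
    return f"{URL_AUTOPKGTEST}/request.cgi?" + urlencode(params)


# Plain trigger vs. a reverse-dependency trigger for the same upload:
print(action_url('apache2', '2.4', 'amd64', 'kinetic', 'ppa:aaa/bbb'))
print(action_url('apache2', '2.4', 'amd64', 'kinetic', 'ppa:aaa/bbb', 'cinder'))
```

Note that `urlencode` takes care of the percent-escaping seen in the test data (`/` becomes `%2F`, `:` becomes `%3A`).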
