Merge ~nacc/git-ubuntu:bug-fixes-3 into git-ubuntu:master
- Git
- lp:~nacc/git-ubuntu
- bug-fixes-3
- Merge into master
Status: Work in progress
Proposed branch: ~nacc/git-ubuntu:bug-fixes-3
Merge into: git-ubuntu:master
Diff against target: 1849 lines (+783/-376), 15 files modified
  bin/git-ubuntu (+6/-1)
  gitubuntu/build.py (+209/-134)
  gitubuntu/dsc.py (+37/-11)
  gitubuntu/exportorig.py (+2/-4)
  gitubuntu/git_repository.py (+155/-48)
  gitubuntu/importer.py (+56/-29)
  gitubuntu/lint.py (+7/-5)
  gitubuntu/merge.py (+1/-1)
  gitubuntu/run.py (+15/-0)
  gitubuntu/versioning.py (+35/-0)
  scripts/import-source-packages.py (+20/-25)
  scripts/scriptutils.py (+202/-13)
  scripts/source-package-walker.py (+20/-85)
  scripts/update-repository-alias.py (+15/-15)
  snap/snapcraft.yaml (+3/-5)
Related bugs:
Reviewer | Review Type | Date Requested | Status
---|---|---|---
Robie Basak | | | Needs Fixing
Server Team CI bot | continuous-integration | | Approve
Christian Ehrhardt | | | Pending
Andreas Hasenack | | | Pending
git-ubuntu developers | | | Pending

Review via email: mp+336017@code.launchpad.net
This proposal supersedes a proposal from 2018-01-10.
Commit message
Description of the change
Make jenkins happy.
Server Team CI bot (server-team-bot) wrote : Posted in a previous version of this proposal
Nish Aravamudan (nacc) wrote : Posted in a previous version of this proposal
I plan on rebasing this and cleaning up the second commit so it's clearer. I'll also merge in the other approved fixes from a few of the other MPs.
Andreas Hasenack (ahasenack) wrote : Posted in a previous version of this proposal
5d4441a4f987fe7
Andreas Hasenack (ahasenack) wrote : Posted in a previous version of this proposal
Comments inline.
I see some duplication of code, which was already there, but is unfortunate. Something for a follow-up branch?
Andreas Hasenack (ahasenack) wrote : Posted in a previous version of this proposal
Could you also please assign the bugs to yourself and mark them as "in progress"? Just the lxd one is so.
Andreas Hasenack (ahasenack) wrote : Posted in a previous version of this proposal
One more question inline.
Nish Aravamudan (nacc) wrote :
@Robie: the last two commits I need some help on. We talked in HO about HEAD^, but I forget what we decided as to the cleanup. For HEAD, I'm not sure how to make pristine-tar and parent-dir cooperate, when the former needs to know that we are in a Git repository and what content we need from it.
Nish Aravamudan (nacc) wrote : Posted in a previous version of this proposal
On 10.01.2018 [12:26:35 -0000], Andreas Hasenack wrote:
> Review: Needs Information
>
> Comments inline.
>
> I see some duplication of code, which was already there, but is
> unfortunate. Something for a follow-up branch?
Absolutely. Patches welcome, as well :)
> Diff comments:
>
> > diff --git a/gitubuntu/
> > index 072d3e7..06b491f 100644
> > --- a/gitubuntu/
> > +++ b/gitubuntu/
> > @@ -878,16 +873,32 @@ def do_build_
> > old_pardir_contents = set(os.
> > run(['dpkg-
> > new_pardir_contents = set(os.
>
> Do we still need to take this snapshot of the pardir before and after
> the build? See below my other comment when we parse the changes file.
Hrm, you're right, probably not. We can just look that all the contents
of the changes file are present. I believe here I was trying to avoid
copying the orig tarball twice.
> > - temp_pardir_
> > - built_pardir_
> > - for f in temp_pardir_
> > - shutil.copy(
> > - os.path.
> > - os.path.
> > - )
> > - built_pardir_
> >
> > - return built_pardir_
> > + changes_file = None
> > + built_pardir_
> > + changes_files = [f for f in new_pardir_contents if 'changes' in f]
>
> Suggestion to be more specific about the file we want:
> changes_files = [f for f in new_pardir_contents if f.endswith(
Yes, probably worth doing.
> > + if len(changes_files) == 0:
> > + logging.error("No changes file")
> > + elif len(changes_files) > 1:
> > + logging.
> > + else:
> > + changes_file = changes_files.pop()
> > + # Intersect changes file contents with the new files in /tmp
> > + # to determine what was newly built
> > + with open(os.
>
> Do we still need this intersection? Won't the changes file have an
> exact list of what we need?
Good point.
>
> > + changes = Changes(
> > + changes_named_files = [f['name'] for f in changes['files']]
> > + for new_file in new_pardir_
> > + if new_file in changes_
> > + shutil.copy(
> > + os.path.
> > + os.path.
> > + )
> > + built_pardir_
> > + os.path.
> > + )
> > +
> > + return [changes_file,] + built_pardir_
> >
> > def get_cmd_
> > return shutil.which(
> > @@ -1141,7 +1164,32 @@ def do_build_
> > if filename_stripped
> > )
> >
> > - built_temp_contents = new_temp_contents - orig_temp_contents
> > + changes_file = None
> > + built_temp_contents = []
> ...
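The two suggestions discussed above — matching `.changes` files precisely with `endswith()`, and treating the changes file as the authoritative list of built artifacts — can be sketched roughly like this (a minimal illustration; the helper name is hypothetical, not from the branch):

```python
def find_changes_file(filenames):
    """Pick the single .changes file out of a directory listing.

    f.endswith('.changes') is stricter than 'changes' in f, which
    would also match names like 'changes.py' or 'foo.changes.bak'.
    Returns None when there is not exactly one match, mirroring the
    "No changes file" / "Multiple changes files?" error paths.
    """
    changes_files = [f for f in filenames if f.endswith('.changes')]
    if len(changes_files) != 1:
        return None
    return changes_files[0]
```

Once the single changes file is identified, its `Files` stanza can be used directly as the list of built artifacts, making the before/after pardir snapshot unnecessary.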
Nish Aravamudan (nacc) wrote : Posted in a previous version of this proposal
On 10.01.2018 [12:41:41 -0000], Andreas Hasenack wrote:
> Could you also please assign the bugs to yourself and mark them as "in
> progress"? Just the lxd one is so.
I believe they all are now from the new branch.
Nish Aravamudan (nacc) wrote : Posted in a previous version of this proposal
On 10.01.2018 [12:52:57 -0000], Andreas Hasenack wrote:
> Review: Needs Information
>
> One more question inline.
>
> Diff comments:
>
> > diff --git a/gitubuntu/
> > index 072d3e7..06b491f 100644
> > --- a/gitubuntu/
> > +++ b/gitubuntu/
> > @@ -1026,17 +1037,29 @@ def do_build_
> > time.sleep(
> > else:
> > raise RuntimeError(
> > - "Failed to run apt in ephemeral build container"
> > + "Failed to run apt-get in ephemeral build container"
> > )
> >
> > + # lxd assumes that if SNAP is set in the environment, that it is
> > + # running as a snap. That is not necessarily true when git-ubuntu is
> > + # the snapped application. So unset SNAP before we exec.
> > + # LP: #1741949
>
> Does this happen only with lxc push? There are other lxc commands we
> run elsewhere, like lxc pull, and lxc exec.
Only lxc push/pull care about the hostfs. Both have been updated to use
this env in this patch.
>
> > + env_unset_SNAP = os.environ.copy()
> > try:
> > - run([
> > - lxc,
> > - 'file',
> > - 'push',
> > - archive_
> > - '%s/tmp/' % container_name,
> > - ])
> > + del env_unset_
> > + except KeyError:
> > + pass
> > + try:
> > + run(
> > + [
> > + lxc,
> > + 'file',
> > + 'push',
> > + archive_
> > + '%s/tmp/' % container_name,
> > + ],
> > + env=env_unset_SNAP,
> > + )
> > except Exception as e:
> > raise RuntimeError(
> > "Failed to push archive tarball to ephemeral build container"
--
Nishanth Aravamudan
Ubuntu Server
Canonical Ltd
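The SNAP-unsetting dance discussed above amounts to copying the environment and dropping the variable before invoking lxc (a sketch; the helper name is hypothetical, not the branch's actual code):

```python
import os


def env_without(name):
    """Return a copy of os.environ with `name` removed.

    lxd assumes it is running as a snap whenever SNAP is set in the
    environment; when git-ubuntu itself is the snapped application
    that assumption is wrong, so lxc file push/pull are run with
    SNAP stripped (LP: #1741949). pop() with a default is a no-op
    when the variable is unset, replacing the try/del/except
    KeyError pattern.
    """
    env = os.environ.copy()
    env.pop(name, None)
    return env
```

The result would then be passed as `env=` to the `run()` wrapper around `lxc file push` and `lxc file pull`, leaving the parent process's own environment untouched.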
Server Team CI bot (server-team-bot) wrote :
FAILED: Continuous integration, rev:66abeeca9e8
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Style Check
SUCCESS: Unit Tests
FAILED: Integration Tests
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote :
FAILED: Continuous integration, rev:cd23a85ab44
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Style Check
SUCCESS: Unit Tests
FAILED: Integration Tests
Click here to trigger a rebuild:
https:/
Nish Aravamudan (nacc) wrote :
Clearly still working on this. Feel free to not review until it passes CI.
On Jan 11, 2018 18:23, "Server Team CI bot" <email address hidden> wrote:
> Review: Needs Fixing continuous-
>
> FAILED: Continuous integration, rev:cd23a85ab44
> 15eb478f29
> https:/
> Executed test runs:
> SUCCESS: Checkout
> SUCCESS: Style Check
> SUCCESS: Unit Tests
> FAILED: Integration Tests
>
> Click here to trigger a rebuild:
> https:/
>
> --
> https:/
> importer/
> You are the owner of ~nacc/usd-
>
Andreas Hasenack (ahasenack) wrote :
I haven't started a re-review of this yet, but I wanted to use it to avoid that bug where lxc pull fails because it's pulling in an unrelated directory. Instead I got this new backtrace, and I'm not sure yet what happened:
01/12/2018 11:15:17 - DEBUG:Executing: /usr/bin/lxc exec right-lab -- sudo -s -H -u ubuntu ls /tmp
01/12/2018 11:15:17 - DEBUG:Executing: /usr/bin/lxc file pull right-lab/
01/12/2018 11:15:17 - DEBUG:Executing: /usr/bin/lxc file pull right-lab/
01/12/2018 11:15:17 - DEBUG:Executing: git commit-tree --no-gpg-sign 5615196cde70134
01/12/2018 11:15:17 - INFO:We automatically generated fixup changes relative to
HEAD as commit 4aead8c54f2875c
If you would like to create a branch at that commit, run:
git checkout -b <branch name> 4aead8c54f2875c
01/12/2018 11:15:17 - DEBUG:Executing: /usr/bin/lxc stop --force right-lab
01/12/2018 11:15:19 - DEBUG:Executing: git worktree prune
Traceback (most recent call last):
File "/home/
main()
File "/home/
sys.
File "/home/
default_
File "/home/
retry_
File "/home/
retry_backoffs,
File "/home/
shutil.copy(f, os.path.
File "/usr/lib/
copyfile(src, dst, follow_
File "/usr/lib/
with open(src, 'rb') as fsrc:
FileNotFoundError: [Errno 2] No such file or directory: 'collectd_
The files referenced in the lxc pull command are not in ".."
Andreas Hasenack (ahasenack) wrote :
Ah, I know, I committed changes to configure.ac instead of making a quilt patch.
Nish Aravamudan (nacc) wrote :
The backtrack is what is failing CI right now. Will debug today.
On Jan 12, 2018 05:22, "Andreas Hasenack" <email address hidden> wrote:
> Ah, I know, I committed changes to configure.ac instead of making a quilt
> patch.
Nish Aravamudan (nacc) wrote :
On Fri, Jan 12, 2018 at 7:48 AM, Nish Aravamudan
<email address hidden> wrote:
> The backtrack is what is failing CI right now. Will debug today.
*backtrace and I believe the latest branch, force updated is good to
review/working.
-Nish
Server Team CI bot (server-team-bot) wrote :
PASSED: Continuous integration, rev:70a249980cd
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Style Check
SUCCESS: Unit Tests
SUCCESS: Integration Tests
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Server Team CI bot (server-team-bot) wrote :
PASSED: Continuous integration, rev:db0ddf8ed22
https:/
Executed test runs:
SUCCESS: Checkout
SUCCESS: Style Check
SUCCESS: Unit Tests
SUCCESS: Integration Tests
IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https:/
Andreas Hasenack (ahasenack) wrote :
First pass: some comments inline copied over from the previous MP
Nish Aravamudan (nacc) wrote :
Thanks Andreas. I will update the comments for the lxc calls (I responded
on the other MP) and just forgot to take your changes suggestion. Will
update the MP tomorrow.
On Jan 15, 2018 05:42, "Andreas Hasenack" <email address hidden> wrote:
> Missed one
>
> Diff comments:
>
> > diff --git a/gitubuntu/
> > index 072d3e7..adece8e 100644
> > --- a/gitubuntu/
> > +++ b/gitubuntu/
> > @@ -1132,16 +1150,40 @@ def do_build_
> > ['ls', '/tmp',],
> > user='ubuntu',
> > )
> > - new_temp_contents = set(
> > + changes_files = set(
> > filename_stripped
> > for filename_stripped in (
> > filename.strip() for filename in
> > stdout.splitlines()
> > )
> > - if filename_stripped
> > + if filename_stripped and '.changes' in filename_stripped
>
> Suggestion to be more specific:
> if filename_stripped and filename_
>
> > )
> >
> > - built_temp_contents = new_temp_contents - orig_temp_contents
> > + changes_file = None
> > + built_temp_contents = []
> > +
> > + if len(changes_files) == 0:
> > + logging.error("No changes file")
> > + elif len(changes_files) > 1:
> > + logging.
> > + else:
> > + changes_file = changes_files.pop()
> > + cmd = [
> > + lxc,
> > + 'file',
> > + 'pull',
> > + '%s%s' % (container_name, os.path.
> changes_file)),
> > + os.pardir,
> > + ]
> > + run(cmd, env=env_unset_SNAP)
> > + changes_file = os.path.abspath(
> > + os.path.
> > + )
> > + with open(changes_file, 'rb') as changes_fobj:
> > + changes = Changes(
> > + for f in changes['files']:
> > + built_temp_
> > +
> > cmd = [lxc, 'file', 'pull']
> > cmd.extend([
> > '%s%s' % (container_name, os.path.
>
>
Robie Basak (racb) wrote :
I tried to break 337e8bd into its own MP but it conflicts against master. This branch seems to be based on snap/beta instead of master.
Nish Aravamudan (nacc) wrote :
Hrm? Ah yes, I guess some packages were added to master but not to snap/beta.
However, I don't see how that commit could conflict, the only diff
between my local master (which was at upstream/snap/beta) and
upstream/master are in gitubuntu/
On Thu, Jan 18, 2018 at 6:09 AM, Robie Basak <email address hidden> wrote:
> Review: Needs Fixing
>
> I tried to break 337e8bd into its own MP but it conflicts against master. This branch seems to be based on snap/beta instead of master.
Unmerged commits
- db0ddf8... by Nish Aravamudan
git_repository: modify use of TemporaryDirectory to avoid crosstalk

Currently, no matter where the cwd is, all temporary directories are
placed in /tmp. This means when we use the pardir of the temporary
directory, we end up using /tmp, and that leads to crosstalk between
code paths and binaries. We really want each temporary directory to be
an isolated env, so use nested TemporaryDirectories.

LP: #1734137
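The nesting idea described in this commit can be sketched as follows (a minimal illustration of the technique, not the branch's actual code):

```python
import tempfile
from contextlib import contextmanager


@contextmanager
def isolated_tempdir():
    """Yield a temporary directory whose parent is itself temporary.

    Code that writes build artifacts to os.pardir of its cwd then
    lands in the disposable outer directory rather than the shared
    /tmp, so concurrent builds cannot see each other's files.
    """
    with tempfile.TemporaryDirectory() as outer:
        with tempfile.TemporaryDirectory(dir=outer) as inner:
            yield inner
```

Both directories are removed automatically when the context exits, so pardir-relative artifacts are cleaned up along with the work area.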
- 46280ff... by Nish Aravamudan
build.fetch_orig: modify API to take repo and commit_hash

There is a false assumption in the fetch_orig() API that assumed
gbp-buildpackage would somehow leverage the changelog object we have as
a parameter. It does not, and additionally, it cannot -- the
gbp-buildpackage subcommand only uses data from HEAD currently. So we
need to:

a) pass the repo and commit_hash to fetch_orig; and
b) use a temporary worktree in fetch_orig*() to switch to the
commit_hash in repo so that the remaining commands dtrt.

b) does mean we are using a different GIT_DIR and WORK_TREE, so there
will be uncommitted changes relative to the GIT_DIR. Pass
--git-ignore-new to gbp to avoid this issue (we aren't actually using
gbp-buildpackage to build, and should switch to gbp-export-orig ASAP).

However, using b) breaks fetch_orig_from_parent_dir, which does not
know about the Git repository at all. I am not sure how to resolve this
yet, and it's technically a corner case where an experienced developer
wants to provide us orig tarballs.

LP: #1738957
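The temporary-worktree approach in b) can be sketched as a context manager; `git_run` here is a hypothetical stand-in for whatever runs git commands, not the branch's actual API:

```python
import shutil
from contextlib import contextmanager


@contextmanager
def temporary_worktree(git_run, commit_hash, path):
    """Check out commit_hash into a throwaway worktree at `path`.

    The finally block guarantees the worktree directory is removed
    and `git worktree prune` is run even if the body raises (e.g.
    ^C'ing git-ubuntu), so later git operations do not trip over
    stale worktree metadata.
    """
    git_run(['git', 'worktree', 'add', path, commit_hash])
    try:
        yield path
    finally:
        shutil.rmtree(path, ignore_errors=True)
        git_run(['git', 'worktree', 'prune'])
```

Commands run inside the context see the requested commit's tree rather than whatever HEAD happens to be in the main checkout.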
- 9092a15... by Nish Aravamudan
build: do not intersect changes file and built files
Just use the changes file contents to determine what was built.
- d4507f9... by Nish Aravamudan
scripts/
scriptutils: refactor srcpkgs_to_import_list to reduce code duplication

We iterate all of Ubuntu main, then all of Ubuntu universe, then Debian,
treating Debian source packages not present in Ubuntu as if they were in
universe.

I would prefer if the Ubuntu iterations were using one
components=['main', 'universe'], but then the caller would not know in
which component the given source package lives. In theory, we can use
src['directory'], but that seems sort of fragile to parse.

- 07d8b26... by Nish Aravamudan
git_repository.temporary_worktree: always prune worktrees

If an Exception was raised (e.g., ^C'ing git-ubuntu), the worktree would
not be pruned without this change.

- 53e7bd1... by Nish Aravamudan
git_repository.temporary_worktree: adjust GIT_WORK_TREE variable

gbp is throwing rather odd errors claiming the path for the temporary
work tree is not a Git repository. This appears to be due to having a
GIT_WORK_TREE environment variable and using a git-worktree that is
somewhere else. Properly set the environment variable in our context.

- 399ca4b... by Nish Aravamudan
scriptutils.pool_map_import_srcpkg: properly handle num_workers=0

Do not create a pool, just run map().
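The num_workers=0 special case can be sketched like this (an illustrative helper, not the branch's actual function):

```python
from multiprocessing import Pool


def pool_map(func, items, num_workers):
    """Map func over items, optionally in parallel.

    num_workers=0 means "just run map() in-process": no pool is
    created, which keeps tracebacks simple and makes single-worker
    debugging straightforward. Any positive count fans work out to a
    multiprocessing pool.
    """
    if num_workers == 0:
        return list(map(func, items))
    with Pool(num_workers) as pool:
        return pool.map(func, items)
```

Note that the in-process path also accepts unpicklable callables (lambdas, closures), which a real `Pool` would reject.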
- 0a657cd... by Nish Aravamudan
git-ubuntu: properly handle symlinked binaries
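The symlink-resolution fix (visible in the bin/git-ubuntu hunk of the preview diff) follows this pattern; the helper name here is hypothetical:

```python
import os


def resolve_symlinked_script(path):
    """Resolve a possibly-symlinked script to its real location.

    os.readlink() returns the raw link target; if that target is
    relative, it must be joined to the directory containing the
    symlink, not to the cwd. os.path.join discards its first
    argument when the second is absolute, so absolute targets pass
    through unchanged. Non-symlinks raise OSError and are returned
    as-is.
    """
    try:
        target = os.readlink(path)
    except OSError:
        return path
    return os.path.join(os.path.dirname(path), target)
```

This matters when git-ubuntu is invoked via a symlink placed on PATH: the module search path must be computed relative to the real script location.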
- 7581ac8... by Nish Aravamudan
build: orig tarballs do not exist for native source packages

Do not try to push orig tarballs if they do not exist. Found while
investigating LP #1741472.

- e7fa818... by Nish Aravamudan
dsc: return a list from orig_tarball_paths
The current Dsc class incorrectly handles native package orig tarballs
and returns None for the orig_tarball_path. It really needs to return
'nothing', as in an empty list (but the current API is a single return
value, not a list). When there is an orig tarball, there will be one
member in the list.

LP: #1741472
---
Note that I would like to use the dsc_builder code to test this, but
it's not yet merged.
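The empty-list convention can be illustrated with a toy filter over a dsc's file list (a hypothetical helper, not the branch's Dsc API):

```python
def orig_tarball_paths(dsc_files):
    """Return the orig tarballs named in a dsc's file list.

    Native source packages ship one plain tarball and no
    .orig.tar.*; returning [] instead of None lets callers iterate
    (copy, push, cache) uniformly, and a non-native package simply
    yields a one-element list.
    """
    return [f for f in dsc_files if '.orig.tar.' in f]
```

Callers can then write `for path in orig_tarball_paths(...)` without special-casing the native package scenario.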
Preview Diff
1 | diff --git a/bin/git-ubuntu b/bin/git-ubuntu |
2 | index 2b15471..48549fd 100755 |
3 | --- a/bin/git-ubuntu |
4 | +++ b/bin/git-ubuntu |
5 | @@ -6,7 +6,12 @@ import os |
6 | import sys |
7 | # we know that our relative path doesn't include the module |
8 | try: |
9 | - realpath = os.readlink(__file__) |
10 | + # If __file__ is a symlink, resolve to where it points to. If the symlink |
11 | + # target is relative, then we need to join that to the directory __file__ |
12 | + # is inside in order to resolve it correctly. If the symlink target is |
13 | + # absolute, then os.path.join is defined to ignore the first parameter |
14 | + # here. |
15 | + realpath = os.path.join(os.path.dirname(__file__), os.readlink(__file__)) |
16 | except OSError: |
17 | realpath = __file__ |
18 | sys.path.insert( |
19 | diff --git a/gitubuntu/build.py b/gitubuntu/build.py |
20 | index 072d3e7..adece8e 100644 |
21 | --- a/gitubuntu/build.py |
22 | +++ b/gitubuntu/build.py |
23 | @@ -45,6 +45,7 @@ from gitubuntu.source_information import ( |
24 | try: |
25 | pkg = 'python3-debian' |
26 | from debian.debfile import PART_EXTS |
27 | + from debian.deb822 import Changes |
28 | pkg = 'python3-petname' |
29 | import petname |
30 | pkg = 'python3-pygit2' |
31 | @@ -359,25 +360,27 @@ def fetch_orig_from_launchpad(changelog, source, pullfile, retries, |
32 | spi.pull() |
33 | dsc = GitUbuntuDsc(spi.dsc_pathname) |
34 | orig_paths = dsc.all_tarball_paths |
35 | - if dl_cache is None: |
36 | - _symlink_paths_into_parent_dir(orig_paths) |
37 | - else: |
38 | - srcpkg_cache_dir = os.path.join( |
39 | - dl_cache, |
40 | - derive_source_from_changelog(changelog), |
41 | - changelog.srcpkg, |
42 | - changelog.upstream_version, |
43 | - ) |
44 | - if not os.path.isdir(srcpkg_cache_dir): |
45 | - os.makedirs(srcpkg_cache_dir, exist_ok=True) |
46 | - # populate cache |
47 | - logging.debug("Caching dsc file") |
48 | - _populate_cache(srcpkg_cache_dir, dsc.dsc_path, 'DSC') |
49 | |
50 | - cached_paths = [] |
51 | - for path in orig_paths: |
52 | - cached_paths.append(_populate_cache(srcpkg_cache_dir, path)) |
53 | - _symlink_paths_into_parent_dir(cached_paths) |
54 | + if orig_paths: |
55 | + if dl_cache is None: |
56 | + _symlink_paths_into_parent_dir(orig_paths) |
57 | + else: |
58 | + srcpkg_cache_dir = os.path.join( |
59 | + dl_cache, |
60 | + derive_source_from_changelog(changelog), |
61 | + changelog.srcpkg, |
62 | + changelog.upstream_version, |
63 | + ) |
64 | + if not os.path.isdir(srcpkg_cache_dir): |
65 | + os.makedirs(srcpkg_cache_dir, exist_ok=True) |
66 | + # populate cache |
67 | + logging.debug("Caching dsc file") |
68 | + _populate_cache(srcpkg_cache_dir, dsc.dsc_path, 'DSC') |
69 | + |
70 | + cached_paths = [] |
71 | + for path in orig_paths: |
72 | + cached_paths.append(_populate_cache(srcpkg_cache_dir, path)) |
73 | + _symlink_paths_into_parent_dir(cached_paths) |
74 | |
75 | return orig_paths |
76 | |
77 | @@ -474,7 +477,7 @@ def main( |
78 | oldubuntu_version, |
79 | ) |
80 | |
81 | - files = fetch_orig_and_build( |
82 | + built_files = fetch_orig_and_build( |
83 | repo, |
84 | commitish, |
85 | search_list, |
86 | @@ -486,14 +489,8 @@ def main( |
87 | retries, |
88 | retry_backoffs, |
89 | ) |
90 | - if len(files) and sign: |
91 | - changes_files = [f for f in files if 'changes' in f] |
92 | - if len(changes_files) == 0: |
93 | - logging.error("No changes file") |
94 | - if len(changes_files) > 1: |
95 | - logging.error("Multiple changes files?") |
96 | - |
97 | - changes_file = changes_files.pop() |
98 | + if built_files and sign: |
99 | + changes_file = built_files[0] |
100 | try: |
101 | logging.info("Signing changes file %s", changes_file) |
102 | run(['debsign', '-p%s' % _gpg_abspath(), changes_file,]) |
103 | @@ -501,7 +498,7 @@ def main( |
104 | raise RuntimeError( |
105 | "Failed to sign changes file %s" % changes_file |
106 | ) from e |
107 | - return files |
108 | + return built_files |
109 | except RuntimeError as e: |
110 | traceback.print_exc() |
111 | return [] |
112 | @@ -875,19 +872,35 @@ def do_build_host_exitstack( |
113 | stack.callback(os.chdir, oldcwd) |
114 | os.chdir(os.path.join(tempdir, archive_prefix)) |
115 | |
116 | - old_pardir_contents = set(os.listdir(os.pardir)) |
117 | run(['dpkg-buildpackage'] + list(rem_args)) |
118 | - new_pardir_contents = set(os.listdir(os.pardir)) |
119 | - temp_pardir_contents = new_pardir_contents - old_pardir_contents |
120 | + |
121 | + changes_file = None |
122 | built_pardir_contents = [] |
123 | - for f in temp_pardir_contents: |
124 | - shutil.copy( |
125 | - os.path.join(os.pardir, f), |
126 | - os.path.join(oldcwd, os.pardir) |
127 | + changes_files = [f for f in set(os.listdir(os.pardir)) if '.changes' in f] |
128 | + if len(changes_files) == 0: |
129 | + logging.error("No changes file") |
130 | + elif len(changes_files) > 1: |
131 | + logging.error("Multiple changes files?") |
132 | + else: |
133 | + changes_file = os.path.abspath( |
134 | + shutil.copy( |
135 | + os.path.join(os.pardir, changes_files.pop()), |
136 | + os.path.join(oldcwd, os.pardir), |
137 | + ) |
138 | ) |
139 | - built_pardir_contents.append(os.path.join(os.pardir, f)) |
140 | + with open(changes_file, 'rb') as changes_fobj: |
141 | + changes = Changes(changes_fobj) |
142 | + for f in changes['files']: |
143 | + built_pardir_contents.append( |
144 | + os.path.abspath( |
145 | + shutil.copy( |
146 | + os.path.join(os.pardir, f['name']), |
147 | + os.path.join(oldcwd, os.pardir), |
148 | + ) |
149 | + ) |
150 | + ) |
151 | |
152 | - return built_pardir_contents |
153 | + return [changes_file,] + built_pardir_contents |
154 | |
155 | def get_cmd_in_origpath(cmd): |
156 | return shutil.which( |
157 | @@ -1006,9 +1019,9 @@ def do_build_lxd_exitstack( |
158 | |
159 | for i in range(retries+1): |
160 | try: |
161 | - _run_in_lxd(container_name, ['apt', 'update',]) |
162 | + _run_in_lxd(container_name, ['apt-get', 'update',]) |
163 | _run_in_lxd(container_name, [ |
164 | - 'apt', |
165 | + 'apt-get', |
166 | 'install', |
167 | '-y', |
168 | 'devscripts', |
169 | @@ -1018,7 +1031,7 @@ def do_build_lxd_exitstack( |
170 | break |
171 | except Exception as e: |
172 | logging.error( |
173 | - "Failed to run apt in ephemeral build container " |
174 | + "Failed to run apt-get in ephemeral build container " |
175 | "(attempt %d/%d)", |
176 | i+1, retries+1, |
177 | ) |
178 | @@ -1026,29 +1039,42 @@ def do_build_lxd_exitstack( |
179 | time.sleep(retry_backoffs[i]) |
180 | else: |
181 | raise RuntimeError( |
182 | - "Failed to run apt in ephemeral build container" |
183 | + "Failed to run apt-get in ephemeral build container" |
184 | ) |
185 | |
186 | + # lxd assumes that if SNAP is set in the environment, that it is |
187 | + # running as a snap. That is not necessarily true when git-ubuntu is |
188 | + # the snapped application. So unset SNAP before we exec. |
189 | + # LP: #1741949 |
190 | + env_unset_SNAP = os.environ.copy() |
191 | try: |
192 | - run([ |
193 | - lxc, |
194 | - 'file', |
195 | - 'push', |
196 | - archive_tarball_name, |
197 | - '%s/tmp/' % container_name, |
198 | - ]) |
199 | + del env_unset_SNAP['SNAP'] |
200 | + except KeyError: |
201 | + pass |
202 | + try: |
203 | + run( |
204 | + [ |
205 | + lxc, |
206 | + 'file', |
207 | + 'push', |
208 | + archive_tarball_name, |
209 | + '%s/tmp/' % container_name, |
210 | + ], |
211 | + env=env_unset_SNAP, |
212 | + ) |
213 | except Exception as e: |
214 | raise RuntimeError( |
215 | "Failed to push archive tarball to ephemeral build container" |
216 | ) from e |
217 | |
218 | - cmd = [lxc, 'file', 'push'] |
219 | - cmd.extend(tarballs) |
220 | - cmd.extend(['%s/tmp/' % container_name,]) |
221 | - try: |
222 | - run(cmd) |
223 | - except Exception as e: |
224 | - raise RuntimeError("Failed to push orig tarballs to container") from e |
225 | + if tarballs: |
226 | + cmd = [lxc, 'file', 'push'] |
227 | + cmd.extend(tarballs) |
228 | + cmd.extend(['%s/tmp/' % container_name,]) |
229 | + try: |
230 | + run(cmd, env=env_unset_SNAP) |
231 | + except Exception as e: |
232 | + raise RuntimeError("Failed to push orig tarballs to container") from e |
233 | |
234 | logging.info("Copied build files to %s", container_name) |
235 | |
236 | @@ -1073,14 +1099,6 @@ def do_build_lxd_exitstack( |
237 | ['ls', '/tmp',], |
238 | user='ubuntu', |
239 | ) |
240 | - orig_temp_contents = set( |
241 | - filename_stripped |
242 | - for filename_stripped in ( |
243 | - filename.strip() for filename in |
244 | - stdout.splitlines() |
245 | - ) |
246 | - if filename_stripped |
247 | - ) |
248 | |
249 | container_path = '/tmp/%s-%s' % ( |
250 | changelog.srcpkg, |
251 | @@ -1132,16 +1150,40 @@ def do_build_lxd_exitstack( |
252 | ['ls', '/tmp',], |
253 | user='ubuntu', |
254 | ) |
255 | - new_temp_contents = set( |
256 | + changes_files = set( |
257 | filename_stripped |
258 | for filename_stripped in ( |
259 | filename.strip() for filename in |
260 | stdout.splitlines() |
261 | ) |
262 | - if filename_stripped |
263 | + if filename_stripped and '.changes' in filename_stripped |
264 | ) |
265 | |
266 | - built_temp_contents = new_temp_contents - orig_temp_contents |
267 | + changes_file = None |
268 | + built_temp_contents = [] |
269 | + |
270 | + if len(changes_files) == 0: |
271 | + logging.error("No changes file") |
272 | + elif len(changes_files) > 1: |
273 | + logging.error("Multiple changes files?") |
274 | + else: |
275 | + changes_file = changes_files.pop() |
276 | + cmd = [ |
277 | + lxc, |
278 | + 'file', |
279 | + 'pull', |
280 | + '%s%s' % (container_name, os.path.join('/tmp', changes_file)), |
281 | + os.pardir, |
282 | + ] |
283 | + run(cmd, env=env_unset_SNAP) |
284 | + changes_file = os.path.abspath( |
285 | + os.path.join(os.pardir, changes_file) |
286 | + ) |
287 | + with open(changes_file, 'rb') as changes_fobj: |
288 | + changes = Changes(changes_fobj) |
289 | + for f in changes['files']: |
290 | + built_temp_contents.append(f['name']) |
291 | + |
292 | cmd = [lxc, 'file', 'pull'] |
293 | cmd.extend([ |
294 | '%s%s' % (container_name, os.path.join('/tmp', f)) |
295 | @@ -1149,39 +1191,57 @@ def do_build_lxd_exitstack( |
296 | ] |
297 | ) |
298 | cmd.extend([os.pardir,]) |
299 | - run(cmd) |
300 | + run(cmd, env=env_unset_SNAP) |
301 | |
302 | - return [os.path.join(os.pardir, f) for f in built_temp_contents] |
303 | + built_temp_contents = [ |
304 | + os.path.abspath(os.path.join(os.pardir, f)) |
305 | + for f in built_temp_contents |
306 | + ] |
307 | |
308 | + return [changes_file,] + built_temp_contents |
309 | |
310 | def fetch_orig( |
311 | orig_search_list, |
312 | - changelog, |
313 | + repo, |
314 | + commit_hash, |
315 | ): |
316 | + changelog = gitubuntu.git_repository.Changelog.from_treeish( |
317 | + repo.raw_repo, |
318 | + repo.raw_repo.get(commit_hash), |
319 | + ) |
320 | + |
321 | unaliased_orig_search_list = expand_changelog_source_aliases( |
322 | orig_search_list, |
323 | changelog, |
324 | ) |
325 | # Follow searches already attempted as a 'changelog' source alias |
326 | # may expand to a duplicate. |
327 | - for entry in unique_everseen(unaliased_orig_search_list): |
328 | - try: |
329 | - mechanism_name = entry.mechanism.__name__ |
330 | - except AttributeError: |
331 | - mechanism_name = entry.mechanism.func.__name__ |
332 | - tarballs = entry.mechanism(changelog, entry.source) |
333 | - if tarballs is None: |
334 | - logging.debug('%s(source=%s) failed', |
335 | + oldcwd = os.getcwd() |
336 | + with repo.temporary_worktree(commit_hash): |
337 | + for entry in unique_everseen(unaliased_orig_search_list): |
338 | + try: |
339 | + mechanism_name = entry.mechanism.__name__ |
340 | + except AttributeError: |
341 | + mechanism_name = entry.mechanism.func.__name__ |
342 | + tarballs = entry.mechanism(changelog, entry.source) |
343 | + if tarballs is None: |
344 | + logging.debug('%s(source=%s) failed', |
345 | + mechanism_name, |
346 | + entry.source, |
347 | + ) |
348 | + continue # search returned negative; try next search entry |
349 | + copied_tarballs = [] |
350 | + for tarball in tarballs: |
351 | + copied_tarballs.append( |
352 | + shutil.copy(tarball, os.path.join(oldcwd, os.pardir)) |
353 | + ) |
354 | + |
355 | + logging.info('Successfully fetched%susing %s(source=%s)', |
356 | + ':\n' + '\n'.join(copied_tarballs) + '\n' if copied_tarballs else ' ', |
357 | mechanism_name, |
358 | entry.source, |
359 | ) |
360 | - continue # search returned negative; try next search entry |
361 | - logging.info('Successfully fetched%susing %s(source=%s)', |
362 | - ':\n' + '\n'.join(tarballs) + '\n' if tarballs else ' ', |
363 | - mechanism_name, |
364 | - entry.source, |
365 | - ) |
366 | - return tarballs |
367 | + return copied_tarballs |
368 | |
369 | def fetch_orig_and_build( |
370 | repo, |
371 | @@ -1205,64 +1265,79 @@ def fetch_orig_and_build( |
372 | ) |
373 | # Follow searches already attempted as a 'changelog' source alias |
374 | # may expand to a duplicate. |
375 | - for entry in unique_everseen(unaliased_orig_search_list): |
376 | - try: |
377 | - mechanism_name = entry.mechanism.__name__ |
378 | - except AttributeError: |
379 | - mechanism_name = entry.mechanism.func.__name__ |
380 | - tarballs = entry.mechanism(changelog, entry.source) |
381 | - if tarballs is None: |
382 | - logging.debug('%s(source=%s) failed', |
383 | - mechanism_name, |
384 | - entry.source, |
385 | - ) |
386 | - continue # search returned negative; try next search entry |
387 | - try: |
388 | - built_files = do_build( |
389 | - repo, |
390 | - commitish, |
391 | - tarballs, |
392 | - rem_args, |
393 | - use_lxd, |
394 | - keep_build_env, |
395 | - lxd_profile, |
396 | - lxd_image, |
397 | - retries, |
398 | - retry_backoffs, |
399 | - ) |
400 | - except RuntimeError as e: |
401 | - if entry.must_build: |
402 | - # Build failed and the search list declares this fatal |
403 | - logging.error("Failed to build") |
404 | - raise RuntimeError("Unable to build using %s(source=%s)" % ( |
405 | + commit_hash = str( |
406 | + repo.get_commitish(commitish).peel(pygit2.Commit).id |
407 | + ) |
408 | + oldcwd = os.getcwd() |
409 | + with repo.temporary_worktree(commit_hash): |
410 | + for entry in unique_everseen(unaliased_orig_search_list): |
411 | + try: |
412 | + mechanism_name = entry.mechanism.__name__ |
413 | + except AttributeError: |
414 | + mechanism_name = entry.mechanism.func.__name__ |
415 | + tarballs = entry.mechanism(changelog, entry.source) |
416 | + if tarballs is None: |
417 | + logging.debug('%s(source=%s) failed', |
418 | mechanism_name, |
419 | entry.source, |
420 | - )) from e |
421 | + ) |
422 | + continue # search returned negative; try next search entry |
423 | + try: |
424 | + built_files = do_build( |
425 | + repo, |
426 | + commitish, |
427 | + tarballs, |
428 | + rem_args, |
429 | + use_lxd, |
430 | + keep_build_env, |
431 | + lxd_profile, |
432 | + lxd_image, |
433 | + retries, |
434 | + retry_backoffs, |
435 | + ) |
436 | + except RuntimeError as e: |
437 | + if entry.must_build: |
438 | + # Build failed and the search list declares this fatal |
439 | + logging.error("Failed to build") |
440 | + raise RuntimeError("Unable to build using %s(source=%s)" % ( |
441 | + mechanism_name, |
442 | + entry.source, |
443 | + )) from e |
444 | + else: |
445 | + # Build failed, but the search list entry declares that |
446 | + # this should not be fatal and so we try the next search |
447 | + # entry |
448 | + # First, remove any tarballs fetched with this mechanism |
449 | + for tarball in tarballs: |
450 | + os.remove(tarball) |
451 | + logging.debug( |
452 | + "Ignoring failure to build using %s(source=%s)", |
453 | + mechanism_name, |
454 | + entry.source, |
455 | + ) |
456 | + continue |
457 | else: |
458 | - # Build failed, but the search list entry declares that |
459 | - # this should not be fatal and so we try the next search |
460 | - # entry |
461 | - # First, remove any tarballs fetched with this mechanism |
462 | - for tarball in tarballs: |
463 | - os.remove(tarball) |
464 | - logging.debug( |
465 | - "Ignoring failure to build using %s(source=%s)", |
466 | + copied_built_files = [] |
467 | + for f in built_files: |
468 | + copied_built_files.append( |
469 | + os.path.abspath( |
470 | + shutil.copy(f, os.path.join(oldcwd, os.pardir)) |
471 | + ) |
472 | + ) |
473 | + |
474 | + logging.info( |
475 | + "Successfully built%susing %s(source=%s)", |
476 | + ":\n" + "\n".join(copied_built_files) + "\n" |
477 | + if copied_built_files else " ", |
478 | mechanism_name, |
479 | entry.source, |
480 | ) |
481 | - continue |
482 | + return copied_built_files |
483 | else: |
484 | - logging.info('Successfully built%susing %s(source=%s)', |
485 | - ':\n' + '\n'.join(built_files) + '\n' if built_files else ' ', |
486 | - mechanism_name, |
487 | - entry.source, |
488 | + raise RuntimeError( |
489 | + "All potential search mechanisms were tried, " |
490 | + "but none produced a successful build." |
491 | ) |
492 | - return built_files |
493 | - else: |
494 | - raise RuntimeError( |
495 | - "All potential search mechanisms were tried, " |
496 | - "but none produced a successful build." |
497 | - ) |
498 | |
499 | |
500 | class NativenessMismatchError(Exception): pass |
501 | diff --git a/gitubuntu/dsc.py b/gitubuntu/dsc.py |
502 | index 1fc3129..f630b56 100644 |
503 | --- a/gitubuntu/dsc.py |
504 | +++ b/gitubuntu/dsc.py |
505 | @@ -47,24 +47,48 @@ class GitUbuntuDsc(Dsc): |
506 | |
507 | @property |
508 | def orig_tarball_path(self): |
509 | - if self._orig: |
510 | + """Get string filesystem path of this DSC's orig tarball |
511 | + |
512 | + Returns: |
513 | + list of string paths to the orig tarball |
514 | + |
515 | + The list will either have zero or one elements. Zero elements |
516 | + implies a native package. |
517 | + |
518 | + Raises: |
519 | + GitUbuntuDscError if tarball fails to verify |
520 | + """ |
521 | + if self._orig is not None: |
522 | return self._orig |
523 | - orig = None |
524 | + orig = [] |
525 | for entry in self['Files']: |
526 | if orig_re.match(entry['name']): |
527 | - if orig is not None: |
528 | - raise GitUbuntuDscError('Multiple base ' |
529 | - 'orig tarballs in DSC') |
530 | - orig = os.path.join(self.dsc_dir, entry['name']) |
531 | - logging.debug('Verifying orig tarball %s', orig) |
532 | - if not self.verify_file(orig): |
533 | - raise GitUbuntuDscError('Unable to verify orig ' |
534 | - 'tarball %s' % orig) |
535 | + if orig: |
536 | + raise GitUbuntuDscError( |
537 | + "Multiple base orig tarballs in DSC" |
538 | + ) |
539 | + _orig = os.path.join(self.dsc_dir, entry['name']) |
540 | + orig.append(_orig) |
541 | + logging.debug('Verifying orig tarball %s', _orig) |
542 | + if not self.verify_file(_orig): |
543 | + raise GitUbuntuDscError( |
544 | + "Unable to verify orig tarball %s" % _orig |
545 | + ) |
546 | + assert len(orig) < 2 |
547 | self._orig = orig |
548 | return self._orig |
549 | |
550 | @property |
551 | def component_tarball_paths(self): |
552 | + """Get string filesystem paths of this DSC's orig component tarball(s) |
553 | + |
554 | + Returns: |
555 | + dictionary of string path to component tarballs, keyed by |
556 | + component names |
557 | + |
558 | + Raises: |
559 | + GitUbuntuDscError if a tarball fails to verify |
560 | + """ |
561 | if self._components is not None: |
562 | return self._components |
563 | components = {} |
564 | @@ -83,4 +107,6 @@ class GitUbuntuDsc(Dsc): |
565 | |
566 | @property |
567 | def all_tarball_paths(self): |
568 | - return [self.orig_tarball_path] + list(self.component_tarball_paths.values()) |
569 | + return self.orig_tarball_path + list( |
570 | + self.component_tarball_paths.values() |
571 | + ) |
572 | diff --git a/gitubuntu/exportorig.py b/gitubuntu/exportorig.py |
573 | index a04362c..f259ad0 100644 |
574 | --- a/gitubuntu/exportorig.py |
575 | +++ b/gitubuntu/exportorig.py |
576 | @@ -88,10 +88,8 @@ def main( |
577 | try: |
578 | return gitubuntu.build.fetch_orig( |
579 | orig_search_list=search_list, |
580 | - changelog=gitubuntu.git_repository.Changelog.from_treeish( |
581 | - repo.raw_repo, |
582 | - repo.raw_repo.get(commit_hash), |
583 | - ), |
584 | + repo=repo, |
585 | + commit_hash=commit_hash, |
586 | ) |
587 | except gitubuntu.build.NativenessMismatchError as e: |
588 | logging.error("%s" % e) |
589 | diff --git a/gitubuntu/git_repository.py b/gitubuntu/git_repository.py |
590 | index b04cd16..d5842b2 100644 |
591 | --- a/gitubuntu/git_repository.py |
592 | +++ b/gitubuntu/git_repository.py |
593 | @@ -21,7 +21,14 @@ import gitubuntu.lint |
594 | from gitubuntu.__main__ import top_level_defaults |
595 | import gitubuntu.build |
596 | from gitubuntu.dsc import component_tarball_matches |
597 | -from gitubuntu.run import run, runq, decode_binary |
598 | +from gitubuntu.run import ( |
599 | + decode_binary, |
600 | + run, |
601 | + runq, |
602 | + run_gbp, |
603 | + run_quilt, |
604 | +) |
605 | +import gitubuntu.versioning |
606 | try: |
607 | pkg = 'python3-debian' |
608 | import debian.changelog |
609 | @@ -918,8 +925,9 @@ class GitUbuntuRepository: |
610 | tarballs.append( |
611 | os.path.join(os.path.pardir, potential_main_tarballs[0]) |
612 | ) |
613 | - cmd = ['gbp', 'buildpackage', '--git-builder=/bin/true', |
614 | - '--git-pristine-tar', '--git-ignore-branch', |
615 | + args = ['buildpackage', '--git-builder=/bin/true', |
616 | + '--git-ignore-new', '--git-pristine-tar', |
617 | + '--git-ignore-branch', |
618 | '--git-upstream-tag=%s/upstream/%s/%%(version)s%s' % |
619 | (namespace, dist, ext)] |
620 | # This will probably break if the component tarballs get |
621 | @@ -935,11 +943,11 @@ class GitUbuntuRepository: |
622 | tarballs.extend(map(lambda x : os.path.join(os.path.pardir, x), |
623 | list(potential_component_tarballs.values())) |
624 | ) |
625 | - cmd.extend(map(lambda x : '--git-component=%s' % x, |
626 | + args.extend(map(lambda x : '--git-component=%s' % x, |
627 | list(potential_component_tarballs.keys())) |
628 | ) |
629 | with self.pristine_tar_branches(dist, namespace): |
630 | - run(cmd) |
631 | + run_gbp(args, env=self.env) |
632 | return tarballs |
633 | |
634 | raise PristineTarNotFoundError( |
635 | @@ -1447,6 +1455,9 @@ class GitUbuntuRepository: |
636 | def get_upstream_tag(self, version, namespace): |
637 | return self.get_tag_reference(upstream_tag(version, namespace)) |
638 | |
639 | + def get_orphan_tag(self, version, namespace): |
640 | + return self.get_tag_reference(orphan_tag(version, namespace)) |
641 | + |
642 | def nearest_remote_branches(self, commit_hash, prefix=None, |
643 | max_commits=100 |
644 | ): |
645 | @@ -1853,29 +1864,39 @@ class GitUbuntuRepository: |
646 | |
647 | @contextmanager |
648 | def temporary_worktree(self, commitish, prefix=None): |
649 | - with tempfile.TemporaryDirectory(prefix=prefix) as tempdir: |
650 | - self.git_run( |
651 | - [ |
652 | - 'worktree', |
653 | - 'add', |
654 | - '--detach', |
655 | - '--force', |
656 | - tempdir, |
657 | - commitish, |
658 | - ] |
659 | - ) |
660 | - |
661 | - oldcwd = os.getcwd() |
662 | - os.chdir(tempdir) |
663 | + try: |
664 | + with tempfile.TemporaryDirectory() as tempdir: |
665 | + with tempfile.TemporaryDirectory( |
666 | + prefix=prefix, |
667 | + dir=tempdir, |
668 | + ) as ttempdir: |
669 | + self.git_run( |
670 | + [ |
671 | + 'worktree', |
672 | + 'add', |
673 | + '--detach', |
674 | + '--force', |
675 | + ttempdir, |
676 | + commitish, |
677 | + ] |
678 | + ) |
679 | |
680 | - try: |
681 | - yield |
682 | - except: |
683 | - raise |
684 | - finally: |
685 | - os.chdir(oldcwd) |
686 | + oldcwd = os.getcwd() |
687 | + os.chdir(ttempdir) |
688 | |
689 | - self.git_run(['worktree', 'prune']) |
690 | + orig_work_tree = self._env['GIT_WORK_TREE'] |
691 | + try: |
692 | + self._env['GIT_WORK_TREE'] = ttempdir |
693 | + yield |
694 | + except: |
695 | + raise |
696 | + finally: |
697 | + self._env['GIT_WORK_TREE'] = orig_work_tree |
698 | + os.chdir(oldcwd) |
699 | + except: |
700 | + raise |
701 | + finally: |
702 | + self.git_run(['worktree', 'prune']) |
703 | |
704 | def tree_hash_after_command(self, commitish, cmd): |
705 | with self.temporary_worktree(commitish): |
706 | @@ -1972,8 +1993,8 @@ class GitUbuntuRepository: |
707 | # differentiate between applied an unapplied |
708 | with self.temporary_worktree(commit_hash): |
709 | try: |
710 | - run( |
711 | - ['quilt', 'push', '-a'], |
712 | + run_quilt( |
713 | + ['push', '-a'], |
714 | env=self.quilt_env_from_treeish_str(commit_hash), |
715 | ) |
716 | # False if in an unapplied state, which is signified by |
717 | @@ -1998,8 +2019,8 @@ class GitUbuntuRepository: |
718 | raise e |
719 | |
720 | try: |
721 | - run( |
722 | - ['quilt', 'push', '-a'], |
723 | + run_quilt( |
724 | + ['push', '-a'], |
725 | env=self.quilt_env_from_treeish_str(commit_hash), |
726 | ) |
727 | # False if in an unapplied state |
728 | @@ -2064,10 +2085,8 @@ class GitUbuntuRepository: |
729 | for_merge=False, |
730 | no_pristine_tar=False, |
731 | ), |
732 | - changelog=Changelog.from_treeish( |
733 | - self.raw_repo, |
734 | - self.raw_repo.get(commit_hash) |
735 | - ), |
736 | + repo=self, |
737 | + commit_hash=commit_hash, |
738 | ) |
739 | logger.setLevel(oldLevel) |
740 | # run dpkg-source |
741 | @@ -2080,22 +2099,32 @@ class GitUbuntuRepository: |
742 | |
743 | # create a nested temporary directory where we will recreate |
744 | # the .pc directory |
745 | - with tempfile.TemporaryDirectory(prefix=tempdir+'/') as ttempdir: |
746 | + with tempfile.TemporaryDirectory(dir=tempdir) as ttempdir: |
747 | oldcwd = os.getcwd() |
748 | os.chdir(ttempdir) |
749 | |
750 | for tarball in tarballs: |
751 | run(['tar', '-x', '--strip-components=1', '-f', tarball,]) |
752 | |
753 | - # need the debia/patches |
754 | - shutil.copytree( |
755 | - os.path.join(self.local_dir, 'debian',), |
756 | - 'debian', |
757 | - ) |
758 | + # in case the orig tarballs contain a debian/ directory |
759 | + try: |
760 | + os.unlink('debian') |
761 | + except IsADirectoryError: |
762 | + shutil.rmtree('debian') |
763 | + except FileNotFoundError: |
764 | + # it is possible no debian/ dir exists yet |
765 | + pass |
766 | + |
767 | + # need the debian dir from @commit_hash to recreate .pc |
768 | + with self.temporary_worktree(commit_hash): |
769 | + shutil.copytree( |
770 | + 'debian', |
771 | + os.path.join(ttempdir, 'debian'), |
772 | + ) |
773 | |
774 | # generate the equivalent .pc directory |
775 | - run( |
776 | - ['quilt', 'push', '-a'], |
777 | + run_quilt( |
778 | + ['push', '-a'], |
779 | env=self.quilt_env_from_treeish_str(commit_hash), |
780 | rcs=[2], |
781 | ) |
782 | @@ -2184,9 +2213,10 @@ with other relevant fields (see below for details). |
783 | # if any patches add files that are untracked, |
784 | # remove them |
785 | run(['git', 'clean', '-f', '-d',]) |
786 | - # reset all the other files to their status in |
787 | - # HEAD |
788 | - run(['git', 'checkout', commit_hash, '--', '*',]) |
789 | + # reset all the other files to their status |
790 | + # before our modifications, preserving our d/p |
791 | + # changes |
792 | + run(['git', 'checkout', '--', '*',]) |
793 | with open(fixup_patch_path, 'rb') as f: |
794 | run(['patch', '-Rp1',], input=f.read()) |
795 | |
796 | @@ -2278,14 +2308,14 @@ with other relevant fields (see below for details). |
797 | 'debian/changelog', |
798 | ): |
799 | with self.temporary_worktree(commit_hash): |
800 | - run( |
801 | + run_gbp( |
802 | [ |
803 | - 'gbp', |
804 | 'dch', |
805 | '--snapshot', |
806 | '--ignore-branch', |
807 | '--since=%s' % str(parent_ref), |
808 | - ] |
809 | + ], |
810 | + env=self.env, |
811 | ) |
812 | return self.dir_to_tree('.') |
813 | |
814 | @@ -2339,3 +2369,80 @@ with other relevant fields (see below for details). |
815 | pygit2.GIT_FILEMODE_TREE, # attr |
816 | ) |
817 | return str(tb.write()) |
818 | + |
819 | + def find_ubuntu_merge_base( |
820 | + self, |
821 | + new_debian_id, |
822 | + old_ubuntu_id, |
823 | + ): |
824 | + # For now, we have to account for both old and new versions of the |
825 | + # importer. This can be removed/simplified once we reimport the |
826 | + # world. |
827 | + # Under the old algorithm, use git-merge-base, but also check that |
828 | + # the merge base's version is what we expect |
829 | + # If it is not, assume we are in the new algorithm and look for an |
830 | + # import tag for the expected Debian version. |
831 | + # If not found, return None |
832 | + # LP: #1734364 |
833 | + |
834 | + # obtain version from appropriate changelog |
835 | + expected_version, _ = self.get_changelog_versions_from_treeish( |
836 | + str(old_ubuntu_id) |
837 | + ) |
838 | + |
839 | + # extract corresponding Debian version |
840 | + expected_version, _ = gitubuntu.versioning.split_version_string( |
841 | + expected_version |
842 | + ) |
843 | + |
844 | + merge_base_id = self.raw_repo.merge_base( |
845 | + str(new_debian_id), |
846 | + str(old_ubuntu_id), |
847 | + ) |
848 | + |
849 | + version, _ = self.get_changelog_versions_from_treeish( |
850 | + str(merge_base_id) |
851 | + ) |
852 | + |
853 | + if version != expected_version: |
854 | + logging.debug( |
855 | + "Found a git-merge-base %s, but its version (%s) " |
856 | + "is not as expected (%s). Assuming the new importer " |
857 | + "algorithm and looking for an import tag instead.", |
858 | + str(merge_base_id), |
859 | + version, |
860 | + expected_version, |
861 | + ) |
862 | + merge_base_id = self.get_import_tag( |
863 | + expected_version, |
864 | + 'pkg', |
865 | + ).peel(pygit2.Commit).id |
866 | + |
867 | + if not merge_base_id: |
868 | + logging.error( |
869 | + "Unable to find an import tag for %s.", |
870 | + expected_version, |
871 | + ) |
872 | + return None |
873 | + |
874 | + try: |
875 | + self.git_run( |
876 | + [ |
877 | + 'merge-base', |
878 | + '--is-ancestor', |
879 | + str(merge_base_id), |
880 | + str(old_ubuntu_id), |
881 | + ], |
882 | + verbose_on_failure=False, |
883 | + ) |
884 | + except CalledProcessError as e: |
885 | + logging.error( |
886 | + "Found an import tag for %s (commit: %s), but it is " |
887 | + "not an ancestor of %s.", |
888 | + expected_version, |
889 | + str(merge_base_id), |
890 | + str(old_ubuntu_id), |
891 | + ) |
892 | + return None |
893 | + |
894 | + return merge_base_id |
895 | diff --git a/gitubuntu/importer.py b/gitubuntu/importer.py |
896 | index e8fb4f8..ec157e7 100644 |
897 | --- a/gitubuntu/importer.py |
898 | +++ b/gitubuntu/importer.py |
899 | @@ -53,7 +53,13 @@ from gitubuntu.git_repository import ( |
900 | PristineTarError, |
901 | is_dir_3_0_quilt, |
902 | ) |
903 | -from gitubuntu.run import decode_binary, run, runq |
904 | +from gitubuntu.run import ( |
905 | + decode_binary, |
906 | + run, |
907 | + runq, |
908 | + run_gbp, |
909 | + run_quilt, |
910 | +) |
911 | from gitubuntu.source_information import ( |
912 | GitUbuntuSourceInformation, |
913 | NoPublicationHistoryException, |
914 | @@ -645,31 +651,29 @@ def import_orig(repo, dsc, namespace, version, dist): |
915 | orig_tarball_path = dsc.orig_tarball_path |
916 | except GitUbuntuDscError as e: |
917 | raise GitUbuntuImportOrigError( |
918 | - 'Unable to import orig tarball in DSC for version %s' % version |
919 | + "Unable to import orig tarball in DSC for version %s" % version |
920 | ) from e |
921 | |
922 | - if orig_tarball_path is None: |
923 | - # native packages only have a .tar.* |
924 | - native_re = re.compile(r'.*\.tar\.[^.]+$') |
925 | - for entry in dsc['Files']: |
926 | - if native_re.match(entry['name']): |
927 | - return |
928 | - raise GitUbuntuImportOrigError('No orig tarball in ' |
929 | - 'DSC for version %s.' % version |
930 | + if orig_tarball_path: |
931 | + orig_tarball_path = orig_tarball_path[0] |
932 | + else: |
933 | + # native packages do not have an orig tarball |
934 | + if '-' not in version: |
935 | + return |
936 | + raise GitUbuntuImportOrigError( |
937 | + "No orig tarball in DSC for version %s." % version |
938 | ) |
939 | |
940 | try: |
941 | orig_component_paths = dsc.component_tarball_paths |
942 | except GitUbuntuDscError as e: |
943 | raise GitUbuntuImportOrigError( |
944 | - 'Unable to import component tarballs in DSC for version ' |
945 | - '%s' % version |
946 | + "Unable to import component tarballs in DSC for version %s" % |
947 | + version |
948 | ) from e |
949 | |
950 | - orig_paths = [orig_tarball_path] + list(orig_component_paths.values()) |
951 | - if (not orig_tarball_path or |
952 | - orig_imported(repo, orig_paths, namespace, dist) |
953 | - ): |
954 | + orig_paths = dsc.all_tarball_paths |
955 | + if orig_imported(repo, orig_paths, namespace, dist): |
956 | return |
957 | if not all(map(os.path.exists, orig_paths)): |
958 | raise GitUbuntuImportOrigError('Unable to find tarball: ' |
959 | @@ -683,20 +687,20 @@ def import_orig(repo, dsc, namespace, version, dist): |
960 | # https://github.com/agx/git-buildpackage/pull/16 |
961 | os.chdir(repo.local_dir) |
962 | |
963 | - ext = os.path.splitext(dsc.orig_tarball_path)[1] |
964 | + ext = os.path.splitext(orig_tarball_path)[1] |
965 | |
966 | # make this non-fatal if upstream already exist as tagged? |
967 | - cmd = ['gbp', 'import-orig', '--no-merge', |
968 | + args = ['import-orig', '--no-merge', |
969 | '--upstream-branch', 'do-not-push', |
970 | '--pristine-tar', '--no-interactive', |
971 | '--no-symlink-orig', |
972 | '--upstream-tag=%s/upstream/%s/%%(version)s%s' % |
973 | (namespace, dist, ext)] |
974 | - cmd.extend(map(lambda x: '--component=%s' % x, |
975 | + args.extend(map(lambda x: '--component=%s' % x, |
976 | list(orig_component_paths.keys())) |
977 | ) |
978 | - cmd.extend([orig_tarball_path,]) |
979 | - run(cmd, env=repo.env) |
980 | + args.extend([orig_tarball_path,]) |
981 | + run_gbp(args, env=repo.env) |
982 | except CalledProcessError as e: |
983 | raise GitUbuntuImportOrigError( |
984 | 'Unable to import tarballs: %s' % orig_paths |
985 | @@ -783,18 +787,18 @@ def import_patches_applied_tree(repo, dsc_pathname): |
986 | while True: |
987 | try: |
988 | os.chdir(extracted_dir) |
989 | - run( |
990 | - ['quilt', 'push'], |
991 | + run_quilt( |
992 | + ['push'], |
993 | env=repo.quilt_env_from_treeish_str(import_tree_hash), |
994 | verbose_on_failure=False, |
995 | ) |
996 | - patch_name, _ = run( |
997 | - ['quilt', 'top'], |
998 | + patch_name, _ = run_quilt( |
999 | + ['top'], |
1000 | env=repo.quilt_env_from_treeish_str(import_tree_hash), |
1001 | ) |
1002 | patch_name = patch_name.strip() |
1003 | - header, _ = run( |
1004 | - ['quilt', 'header'], |
1005 | + header, _ = run_quilt( |
1006 | + ['header'], |
1007 | env=repo.quilt_env_from_treeish_str(import_tree_hash), |
1008 | ) |
1009 | header = header.strip() |
1010 | @@ -1167,9 +1171,16 @@ def import_unapplied_spi(repo, spi, namespace, skip_orig, ubuntu_sinfo): |
1011 | unapplied_tip_version, |
1012 | ) |
1013 | |
1014 | - # Walk changelog backwards until we find an imported version |
1015 | + # Walk changelog backwards until we find an imported or orphaned |
1016 | + # version, preferring imported versions |
1017 | for version in import_tree_versions: |
1018 | - unapplied_changelog_parent_tag = repo.get_import_tag(version, namespace) |
1019 | + unapplied_changelog_parent_tag = repo.get_import_tag( |
1020 | + version, |
1021 | + namespace, |
1022 | + ) or repo.get_orphan_tag( |
1023 | + version, |
1024 | + namespace, |
1025 | + ) |
1026 | if unapplied_changelog_parent_tag is not None: |
1027 | # sanity check that version from d/changelog of the |
1028 | # tagged parent matches ours |
1029 | @@ -1202,6 +1213,7 @@ def import_unapplied_spi(repo, spi, namespace, skip_orig, ubuntu_sinfo): |
1030 | # of the import tree, to obtain the changelog parent |
1031 | upload_tag = repo.get_upload_tag(spi.version, namespace) |
1032 | import_tag = repo.get_import_tag(spi.version, namespace) |
1033 | + orphan_tag = repo.get_orphan_tag(spi.version, namespace) |
1034 | if import_tag: |
1035 | if str(import_tag.peel().tree.id) != unapplied_import_tree_hash: |
1036 | logging.error( |
1037 | @@ -1216,6 +1228,21 @@ def import_unapplied_spi(repo, spi, namespace, skip_orig, ubuntu_sinfo): |
1038 | str(import_tag.peel().id), |
1039 | ) |
1040 | return |
1041 | + elif orphan_tag: |
1042 | + if str(orphan_tag.peel().tree.id) != unapplied_import_tree_hash: |
1043 | + raise Exception( |
1044 | + "Found orphan tag %s, but the corresponding tree " |
1045 | + "does not match the published version. We cannot " |
1046 | + "currently support multiple orphan tags for the same " |
1047 | + "published version and will now exit.", |
1048 | + repo.tag_to_pretty_name(orphan_tag), |
1049 | + ) |
1050 | + else: |
1051 | + repo.update_head_to_commit( |
1052 | + spi.head_name(namespace), |
1053 | + str(orphan_tag.peel().id), |
1054 | + ) |
1055 | + return |
1056 | elif upload_tag: |
1057 | if str(upload_tag.peel().tree.id) != unapplied_import_tree_hash: |
1058 | logging.warn( |
1059 | diff --git a/gitubuntu/lint.py b/gitubuntu/lint.py |
1060 | index a7428e5..de130d0 100644 |
1061 | --- a/gitubuntu/lint.py |
1062 | +++ b/gitubuntu/lint.py |
1063 | @@ -443,8 +443,8 @@ def tree_after_merge_changelogs(repo, old, new): |
1064 | changelog_files = [] |
1065 | try: |
1066 | merge_base_id = str( |
1067 | - repo.raw_repo.merge_base( |
1068 | - repo.get_commitish(old).id, |
1069 | + repo.find_ubuntu_merge_base( |
1070 | + repo.get_commitish(old).id, |
1071 | repo.get_commitish(new).id |
1072 | ) |
1073 | ) |
1074 | @@ -513,9 +513,11 @@ def do_merge_lint(repo, namespace, commitish_string, old, new_base): |
1075 | # for now, assume pkg/ubuntu/devel and pkg/debian/sid are the merge |
1076 | # points and that we can do this command. Ideally, we abstract the merge |
1077 | # API into a better place |
1078 | - merge_base_id = str(repo.raw_repo.merge_base( |
1079 | - repo.get_commitish(old).id, |
1080 | - repo.get_commitish(new_base).id) |
1081 | + merge_base_id = str( |
1082 | + repo.find_ubuntu_merge_base( |
1083 | + repo.get_commitish(old).id, |
1084 | + repo.get_commitish(new_base).id |
1085 | + ) |
1086 | ) |
1087 | # 1) check if old/debian, old/ubuntu and new/debian point to what we |
1088 | # expect - compare treeishs and commits |
1089 | diff --git a/gitubuntu/merge.py b/gitubuntu/merge.py |
1090 | index c16fe4b..c49d9ed 100644 |
1091 | --- a/gitubuntu/merge.py |
1092 | +++ b/gitubuntu/merge.py |
1093 | @@ -437,7 +437,7 @@ def main( |
1094 | logging.error(stdout) |
1095 | return 1 |
1096 | |
1097 | - merge_base_id = repo.raw_repo.merge_base( |
1098 | + merge_base_id = repo.find_ubuntu_merge_base( |
1099 | onto_obj.id, |
1100 | commitish_obj.id, |
1101 | ) |
1102 | diff --git a/gitubuntu/run.py b/gitubuntu/run.py |
1103 | index 4b1a86c..1fdcd6d 100644 |
1104 | --- a/gitubuntu/run.py |
1105 | +++ b/gitubuntu/run.py |
1106 | @@ -1,4 +1,5 @@ |
1107 | import logging |
1108 | +import os |
1109 | import subprocess |
1110 | |
1111 | |
1112 | @@ -102,3 +103,17 @@ def decode_binary(binary, verbose=True): |
1113 | logging.warning("Failed to decode blob: %s", e) |
1114 | logging.warning("blob=%s", binary.decode(errors='replace')) |
1115 | return binary.decode('utf-8', errors='replace') |
1116 | + |
1117 | +def run_quilt(args, env=None, **kwargs): |
1118 | + cmd = ['quilt', '--quiltrc', '-'] + args |
1119 | + return run(cmd, env=env, **kwargs) |
1120 | + |
1121 | +def run_gbp(args, env=None, **kwargs): |
1122 | + cmd = ['gbp'] + args |
1123 | + if env: |
1124 | + env_no_gbp_conf = env.copy() |
1125 | + else: |
1126 | + env_no_gbp_conf = os.environ.copy() |
1127 | + env_no_gbp_conf['GBP_CONF_FILES'] = '/dev/null' |
1128 | + |
1129 | + return run(cmd, env=env_no_gbp_conf, **kwargs) |
1130 | diff --git a/gitubuntu/versioning.py b/gitubuntu/versioning.py |
1131 | index 7b1eb7b..62e28ae 100644 |
1132 | --- a/gitubuntu/versioning.py |
1133 | +++ b/gitubuntu/versioning.py |
1134 | @@ -25,6 +25,7 @@ __all__ = [ |
1135 | "next_development_version_string", |
1136 | "next_sru_version_string", |
1137 | "version_compare", |
1138 | + "split_version_string", |
1139 | ] |
1140 | |
1141 | # see _decompose_version_string for explanation of members |
1142 | @@ -117,6 +118,27 @@ def _decompose_version_string(version_string): |
1143 | return _VersionComps._make([parts, None, None, None, None]) |
1144 | |
1145 | |
1146 | +def split_version_string(version_string): |
1147 | + """Return the Debian version and Ubuntu suffix, if present |
1148 | + |
1149 | + Returns a 2-tuple of strings |
1150 | + """ |
1151 | + |
1152 | + if 'build' in version_string and 'ubuntu' in version_string: |
1153 | + raise ValueError( |
1154 | + "Not sure how to split %s: both build and " "ubuntu found" % |
1155 | + version_string |
1156 | + ) |
1157 | + |
1158 | + if 'build' in version_string: |
1159 | + deb_version, ubuntu_suffix = version_string.split('build') |
1160 | + return deb_version, 'build' + ubuntu_suffix |
1161 | + if 'ubuntu' in version_string: |
1162 | + deb_version, ubuntu_suffix = version_string.split('ubuntu') |
1163 | + return deb_version, 'ubuntu' + ubuntu_suffix |
1164 | + return version_string, "" |
1165 | + |
1166 | + |
1167 | def _bump_version_object(old_version_object, bump_function): |
1168 | """Return a new Version object with the correct attribute bumped |
1169 | |
1170 | @@ -280,6 +302,19 @@ def test_decompose_version_string(test_input, expected): |
1171 | |
1172 | |
1173 | @pytest.mark.parametrize('test_input, expected', [ |
1174 | + ('1.0-2', ('1.0-2', '')), |
1175 | + ('1.0-2ubuntu1', ('1.0-2', 'ubuntu1')), |
1176 | + ('1.0-2ubuntu1.3', ('1.0-2', 'ubuntu1.3')), |
1177 | + ('1.0-2build1', ('1.0-2', 'build1')), |
1178 | + ('1', ('1', '')), |
1179 | + ('1ubuntu2', ('1', 'ubuntu2')), |
1180 | + ('1.0-3ubuntu', ('1.0-3', 'ubuntu')), |
1181 | + ('1:1.0-1.1ubuntu1', ('1:1.0-1.1', 'ubuntu1')), |
1182 | +]) |
1183 | +def test_split_version_string(test_input, expected): |
1184 | + assert split_version_string(test_input) == expected |
1185 | + |
1186 | +@pytest.mark.parametrize('test_input, expected', [ |
1187 | ('1.0-2', '1.0-2ubuntu1'), |
1188 | ('1.0-2ubuntu1', '1.0-2ubuntu2'), |
1189 | ('1.0-2ubuntu1.3', '1.0-2ubuntu2'), |
1190 | diff --git a/scripts/import-source-packages.py b/scripts/import-source-packages.py |
1191 | index 33ccbf2..0bc5dca 100755 |
1192 | --- a/scripts/import-source-packages.py |
1193 | +++ b/scripts/import-source-packages.py |
1194 | @@ -45,20 +45,15 @@ LOG_PATH = os.path.join(tempfile.gettempdir(), 'import-source-packages-log') |
1195 | |
1196 | def import_new_published_sources( |
1197 | num_workers, |
1198 | - whitelist, |
1199 | - blacklist, |
1200 | - phasing_main, |
1201 | - phasing_universe, |
1202 | + pkgnames_to_import, |
1203 | dry_run, |
1204 | ): |
1205 | - """import_new_published_source - Import all new publishes since a prior execution |
1206 | + """Import all new publishes since a prior execution |
1207 | |
1208 | Arguments: |
1209 | num_workers - integer number of worker processes to use |
1210 | - whitelist - a list of of packages to always import |
1211 | - blacklist - a list of packages to never import |
1212 | - phasing_main - a integer percentage of all packages in main to import |
1213 | - phasing_universe - a integer percentage of all packages in universe to import |
1214 | + pkgnames_to_import - list of source package names that should be |
1215 | + considered for import, if publish records are found |
1216 | dry_run - a boolean to indicate a dry-run operation |
1217 | |
1218 | Returns: |
1219 | @@ -118,14 +113,7 @@ def import_new_published_sources( |
1220 | # a publish timestamp before the last run |
1221 | if spphr.date_published.timestamp() < timestamp.time: |
1222 | break |
1223 | - if scriptutils.should_import_srcpkg( |
1224 | - spphr.source_package_name, |
1225 | - spphr.component_name, |
1226 | - whitelist, |
1227 | - blacklist, |
1228 | - phasing_main, |
1229 | - phasing_universe, |
1230 | - ): |
1231 | + if spphr.source_package_name in pkgnames_to_import: |
1232 | dist_filtered_pkgnames.add(spphr.source_package_name) |
1233 | |
1234 | if not dist_filtered_pkgnames: |
1235 | @@ -146,7 +134,6 @@ def import_new_published_sources( |
1236 | link=str(dist_newest_spphr), |
1237 | ) |
1238 | |
1239 | - |
1240 | ret = scriptutils.pool_map_import_srcpkg( |
1241 | num_workers=num_workers, |
1242 | dry_run=dry_run, |
1243 | @@ -158,7 +145,7 @@ def import_new_published_sources( |
1244 | return ret |
1245 | |
1246 | def read_timestamps(): |
1247 | - """read_timestamps - Read saved timestamp values from LOG_PATH |
1248 | + """Read saved timestamp values from LOG_PATH |
1249 | |
1250 | If the log file is not readable (e.g., does not exist), the |
1251 | timestamps will be set to 24 hours before now. |
1252 | @@ -194,7 +181,7 @@ def read_timestamps(): |
1253 | ) |
1254 | |
1255 | def write_timestamps(timestamps): |
1256 | - """write_timestamps - Write timestamp values to LOG_PATH |
1257 | + """Write timestamp values to LOG_PATH |
1258 | |
1259 | Arguments: |
1260 | timestamps - a dictionary with 'debian' and 'ubuntu' keys, each of which is |
1261 | @@ -230,7 +217,7 @@ def main( |
1262 | phasing_universe=scriptutils.DEFAULTS.phasing_universe, |
1263 | dry_run=scriptutils.DEFAULTS.dry_run, |
1264 | ): |
1265 | - """main - Main entry point to the script |
1266 | + """Main entry point to the script |
1267 | |
1268 | Arguments: |
1269 | num_workers - integer number of worker threads to use |
1270 | @@ -272,13 +259,21 @@ def main( |
1271 | mail_imported_srcpkgs = set() |
1272 | mail_failed_srcpkgs = set() |
1273 | |
1274 | - while True: |
1275 | - imported_srcpkgs, failed_srcpkgs = import_new_published_sources( |
1276 | - num_workers, |
1277 | + try: |
1278 | + pkgnames_to_import = scriptutils.srcpkgs_to_import_list( |
1279 | whitelist, |
1280 | blacklist, |
1281 | phasing_main, |
1282 | phasing_universe, |
1283 | + ) |
1284 | + except RuntimeError as e: |
1285 | + print("%s" % e) |
1286 | + sys.exit(1) |
1287 | + |
1288 | + while True: |
1289 | + imported_srcpkgs, failed_srcpkgs = import_new_published_sources( |
1290 | + num_workers, |
1291 | + pkgnames_to_import, |
1292 | dry_run, |
1293 | ) |
1294 | print("Imported %d source packages" % len(imported_srcpkgs)) |
1295 | @@ -320,7 +315,7 @@ def main( |
1296 | time.sleep(sleep_interval_minutes * 60) |
1297 | |
1298 | def cli_main(): |
1299 | - """cli_main - CLI entry point to script |
1300 | + """CLI entry point to script |
1301 | """ |
1302 | parser = argparse.ArgumentParser( |
1303 | description='Script to import all source packages with phasing', |
1304 | diff --git a/scripts/scriptutils.py b/scripts/scriptutils.py |
1305 | index 912efd1..3608d58 100644 |
1306 | --- a/scripts/scriptutils.py |
1307 | +++ b/scripts/scriptutils.py |
1308 | @@ -1,13 +1,21 @@ |
1309 | +import bz2 |
1310 | from collections import namedtuple |
1311 | import functools |
1312 | +import gzip |
1313 | import hashlib |
1314 | +import itertools |
1315 | +import lzma |
1316 | import multiprocessing |
1317 | import os |
1318 | import sys |
1319 | import subprocess |
1320 | import time |
1321 | +import urllib.request |
1322 | |
1323 | import pkg_resources |
1324 | +import pytest |
1325 | + |
1326 | +from debian.deb822 import Sources |
1327 | |
1328 | # We expect to be running from a git repository in master for this |
1329 | # script, because the snap's python code is not exposed except within |
1330 | @@ -24,6 +32,7 @@ sys.path.insert( |
1331 | ) |
1332 | |
1333 | from gitubuntu.run import run |
1334 | +from gitubuntu.source_information import GitUbuntuSourceInformation |
1335 | |
1336 | Defaults = namedtuple( |
1337 | 'Defaults', |
1338 | @@ -54,6 +63,8 @@ DEFAULTS = Defaults( |
1339 | use_whitelist=True, |
1340 | ) |
1341 | |
1342 | +UBUNTU_SOURCES_ROOT = 'http://archive.ubuntu.com/ubuntu/dists/' |
1343 | +DEBIAN_SOURCES_ROOT = 'http://cdn-fastly.deb.debian.org/debian/dists/' |
1344 | |
1345 | def should_import_srcpkg( |
1346 | pkgname, |
1347 | @@ -63,7 +74,7 @@ def should_import_srcpkg( |
1348 | phasing_main, |
1349 | phasing_universe, |
1350 | ): |
1351 | - """should_import_srcpkg - indicate if a given source package should be imported |
1352 | + """indicate if a given source package should be imported |
1353 | |
1354 | The phasing is implemented similarly to update-manager. If the |
1355 | md5sum of the source package name is less than the (appropriate |
1356 | @@ -100,9 +111,183 @@ def should_import_srcpkg( |
1357 | # skip partner and multiverse for now |
1358 | return False |
1359 | |
1360 | +@pytest.mark.parametrize('pkgname, component, whitelist, blacklist, phasing_main, phasing_universe, expected', [ |
1361 | + ('jimtcl', 'universe', [], [], 1, 0, False), |
1362 | + ('jimtcl', 'universe', [], [], 1, 100, True), |
1363 | + ('jimtcl', 'universe', ['jimtcl',], [], 1, 0, True), |
1364 | + ('linux', 'main', [], ['linux',], 100, 0, False), |
1365 | +]) |
1366 | +def test_should_import_srcpkg( |
1367 | + pkgname, |
1368 | + component, |
1369 | + whitelist, |
1370 | + blacklist, |
1371 | + phasing_main, |
1372 | + phasing_universe, |
1373 | + expected, |
1374 | +): |
1375 | + assert should_import_srcpkg( |
1376 | + pkgname=pkgname, |
1377 | + component=component, |
1378 | + whitelist=whitelist, |
1379 | + blacklist=blacklist, |
1380 | + phasing_main=phasing_main, |
1381 | + phasing_universe=phasing_universe, |
1382 | + ) == expected |
1383 | + |
1384 | +def _get_sources( |
1385 | + sources_url_root, |
1386 | + serieses, |
1387 | + components, |
1388 | + pocket_suffixes, |
1389 | +): |
1390 | + """Helper generator of Sources file contents |
1391 | + |
1392 | + Arguments: |
1393 | + sources_url_root: a string prefix up to "dists" for Sources URLs |
1394 | + serieses: a list of series strings |
1395 | + components: a list of component strings |
1396 | + pocket_suffixes: a list of pocket suffix strings |
1397 | + |
1398 | + Yields: |
1399 | + The contents of all Sources files satisfying the requested |
1400 | + serieses X components X pocket_suffixes product. |
1401 | + """ |
1402 | + compressions = { |
1403 | + '.xz': lzma.open, |
1404 | + '.bz2': bz2.open, |
1405 | + '.gz': gzip.open, |
1406 | + } |
1407 | + |
1408 | + for component in components: |
1409 | + for series, pocket_suffix in itertools.product( |
1410 | + serieses, |
1411 | + pocket_suffixes, |
1412 | + ): |
1413 | + url = sources_url_root + '%s%s/%s/source/Sources' % ( |
1414 | + series, |
1415 | + pocket_suffix, |
1416 | + component, |
1417 | + ) |
1418 | + for compression, opener in compressions.items(): |
1419 | + try: |
1420 | + with urllib.request.urlopen( |
1421 | + url + compression |
1422 | + ) as source_url_file: |
1423 | + yield opener(source_url_file, mode='r') |
1424 | + break |
1425 | + except urllib.error.HTTPError: |
1426 | + pass |
1427 | + else: |
1428 | + raise RuntimeError( |
1429 | + "Unable to find any Sources file for component=%s, " |
1430 | + "series=%s, pocket=%s" % ( |
1431 | + component, |
1432 | + series, |
1433 | + pocket_suffix, |
1434 | + ) |
1435 | + ) |
1436 | + return  # generator is exhausted; raising StopIteration here is a PEP 479 error |
1437 | + |
1438 | +def srcpkgs_to_import_list( |
1439 | + whitelist, |
1440 | + blacklist, |
1441 | + phasing_main, |
1442 | + phasing_universe, |
1443 | +): |
1444 | + """Obtain a list of source package names satisfying some criteria |
1445 | + |
1446 | + Arguments: |
1447 | + whitelist - a list of packages to always import |
1448 | + blacklist - a list of packages to never import |
1449 | + phasing_main - an integer percentage of all packages in main to import |
1450 | + phasing_universe - an integer percentage of all packages in universe to import |
1451 | + |
1452 | + Packages only published in Debian are treated like universe |
1453 | + packages. |
1454 | + |
1455 | + Returns: |
1456 | + A list of source package names that should be imported, which can be |
1457 | + empty. |
1458 | + |
1459 | + Raises: |
1460 | + RuntimeError if any active component/series/pocket URL is not found |
1461 | + """ |
1462 | + |
1463 | + all_ubuntu_pkgnames = set() |
1464 | + pkgnames_to_import = set() |
1465 | + for sources in _get_sources( |
1466 | + sources_url_root=UBUNTU_SOURCES_ROOT, |
1467 | + serieses=GitUbuntuSourceInformation('ubuntu').active_series_name_list, |
1468 | + components=['main',], |
1469 | + pocket_suffixes=['-proposed', '-updates', '-security', ''], |
1470 | + ): |
1471 | + for src in Sources.iter_paragraphs( |
1472 | + sources, |
1473 | + use_apt_pkg=False, |
1474 | + ): |
1475 | + pkgname = src['Package'] |
1476 | + all_ubuntu_pkgnames.add(pkgname) |
1477 | + if should_import_srcpkg( |
1478 | + pkgname=pkgname, |
1479 | + component='main', |
1480 | + whitelist=whitelist, |
1481 | + blacklist=blacklist, |
1482 | + phasing_main=phasing_main, |
1483 | + phasing_universe=phasing_universe, |
1484 | + ): |
1485 | + pkgnames_to_import.add(pkgname) |
1486 | + |
1487 | + for sources in _get_sources( |
1488 | + sources_url_root=UBUNTU_SOURCES_ROOT, |
1489 | + serieses=GitUbuntuSourceInformation('ubuntu').active_series_name_list, |
1490 | + components=['universe',], |
1491 | + pocket_suffixes=['-proposed', '-updates', '-security', ''], |
1492 | + ): |
1493 | + for src in Sources.iter_paragraphs( |
1494 | + sources, |
1495 | + use_apt_pkg=False, |
1496 | + ): |
1497 | + pkgname = src['Package'] |
1498 | + all_ubuntu_pkgnames.add(pkgname) |
1499 | + if should_import_srcpkg( |
1500 | + pkgname=pkgname, |
1501 | + component='universe', |
1502 | + whitelist=whitelist, |
1503 | + blacklist=blacklist, |
1504 | + phasing_main=phasing_main, |
1505 | + phasing_universe=phasing_universe, |
1506 | + ): |
1507 | + pkgnames_to_import.add(pkgname) |
1508 | + |
1509 | + # find source packages only published in Debian and treat them like |
1510 | + # they are in Universe |
1511 | + for sources in _get_sources( |
1512 | + sources_url_root=DEBIAN_SOURCES_ROOT, |
1513 | + serieses=GitUbuntuSourceInformation('debian').all_series_name_list, |
1514 | + components=['main',], |
1515 | + pocket_suffixes = ['',], |
1516 | + ): |
1517 | + for src in Sources.iter_paragraphs( |
1518 | + sources, |
1519 | + use_apt_pkg=False, |
1520 | + ): |
1521 | + pkgname = src['Package'] |
1522 | + if pkgname not in all_ubuntu_pkgnames: |
1523 | + if should_import_srcpkg( |
1524 | + pkgname=pkgname, |
1525 | + component='universe', |
1526 | + whitelist=whitelist, |
1527 | + blacklist=blacklist, |
1528 | + phasing_main=phasing_main, |
1529 | + phasing_universe=phasing_universe, |
1530 | + ): |
1531 | + pkgnames_to_import.add(pkgname) |
1532 | + |
1533 | + return pkgnames_to_import |
1534 | |
1535 | def import_srcpkg(pkgname, dry_run): |
1536 | - """import_srcpkg - Invoke git ubuntu import on @pkgname |
1537 | + """Invoke git ubuntu import on @pkgname |
1538 | |
1539 | Arguments: |
1540 | pkgname - string name of a source package |
1541 | @@ -145,7 +330,7 @@ def setup_git_config( |
1542 | name='Ubuntu Git Importer', |
1543 | email='usd-importer-announce@lists.canonical.com', |
1544 | ): |
1545 | - """setup_git_config - Ensure global required Git configuration values are set |
1546 | + """Ensure global required Git configuration values are set |
1547 | |
1548 | Arguments: |
1549 | name - string name to set as user.name in Git config |
1550 | @@ -165,22 +350,26 @@ def pool_map_import_srcpkg( |
1551 | dry_run, |
1552 | pkgnames, |
1553 | ): |
1554 | - """pool_map_import_srcpkg - Use a multiprocessing.Pool to parallel |
1555 | - import source packages |
1556 | + """Use a multiprocessing.Pool to import source packages in parallel |
1557 | |
1558 | Arguments: |
1559 | num_workers - integer number of worker processes to use |
1560 | dry_run - a boolean to indicate a dry-run operation |
1561 | pkgnames - a list of string names of source packages |
1562 | """ |
1563 | - with multiprocessing.Pool(processes=num_workers) as pool: |
1564 | - results = pool.map( |
1565 | - functools.partial( |
1566 | - import_srcpkg, |
1567 | - dry_run=dry_run, |
1568 | - ), |
1569 | - pkgnames, |
1570 | - ) |
1571 | + map_args = [ |
1572 | + functools.partial( |
1573 | + import_srcpkg, |
1574 | + dry_run=dry_run, |
1575 | + ), |
1576 | + pkgnames, |
1577 | + ] |
1578 | + |
1579 | + if num_workers: |
1580 | + with multiprocessing.Pool(processes=num_workers) as pool: |
1581 | + results = pool.map(*map_args) |
1582 | + else: |
1583 | + results = map(*map_args) |
1584 | |
1585 | return ( |
1586 | [pkg for pkg, success in results if success], |
1587 | diff --git a/scripts/source-package-walker.py b/scripts/source-package-walker.py |
1588 | index 58c3d5a..0dcc2a8 100755 |
1589 | --- a/scripts/source-package-walker.py |
1590 | +++ b/scripts/source-package-walker.py |
1591 | @@ -9,18 +9,11 @@ |
1592 | # Report on successful and failed imports |
1593 | |
1594 | import argparse |
1595 | -import bz2 |
1596 | -import itertools |
1597 | -import gzip |
1598 | -import lzma |
1599 | import os |
1600 | import sys |
1601 | -import urllib.request |
1602 | |
1603 | import scriptutils |
1604 | |
1605 | -from debian.deb822 import Sources |
1606 | - |
1607 | # We expect to be running from a git repository in master for this |
1608 | # script, because the snap's python code is not exposed except within |
1609 | # the snap |
1610 | @@ -35,25 +28,17 @@ sys.path.insert( |
1611 | ) |
1612 | ) |
1613 | |
1614 | -from gitubuntu.source_information import GitUbuntuSourceInformation |
1615 | - |
1616 | def import_all_published_sources( |
1617 | num_workers, |
1618 | - whitelist, |
1619 | - blacklist, |
1620 | - phasing_main, |
1621 | - phasing_universe, |
1622 | + pkgnames_to_import, |
1623 | dry_run, |
1624 | ): |
1625 | - """import_all_published_sources - Import all publishes satisfying a |
1626 | - {white,black}list and phasing |
1627 | + """Import all publishes for a list of source package names |
1628 | |
1629 | Arguments: |
1630 | num_workers - integer number of worker processes to use |
1631 | - whitelist - a list of of packages to always import |
1632 | - blacklist - a list of packages to never import |
1633 | - phasing_main - a integer percentage of all packages in main to import |
1634 | - phasing_universe - a integer percentage of all packages in universe to import |
1635 | + pkgnames_to_import - list of source package names that should be |
1636 | + considered for import |
1637 | dry_run - a boolean to indicate a dry-run operation |
1638 | |
1639 | Returns: |
1640 | @@ -61,72 +46,14 @@ def import_all_published_sources( |
1641 | successfully imported source packages, the second containing the |
1642 | names of all source packages that failed to import. |
1643 | """ |
1644 | - serieses = GitUbuntuSourceInformation('ubuntu').active_series_name_list |
1645 | - components = ['main', 'universe'] |
1646 | - pocket_suffixes = ['-proposed', '-updates', '-security', ''] |
1647 | - compressions = { |
1648 | - '.xz': lzma.open, |
1649 | - '.bz2': bz2.open, |
1650 | - '.gz': gzip.open, |
1651 | - } |
1652 | - |
1653 | - base_sources_url = ( |
1654 | - 'http://archive.ubuntu.com/ubuntu/dists/%s%s/%s/source/Sources' |
1655 | - ) |
1656 | - pkgnames = set() |
1657 | - for component in components: |
1658 | - for series, pocket_suffix in itertools.product( |
1659 | - serieses, |
1660 | - pocket_suffixes, |
1661 | - ): |
1662 | - url = base_sources_url % ( |
1663 | - series, |
1664 | - pocket_suffix, |
1665 | - component, |
1666 | - ) |
1667 | - for compression, opener in compressions.items(): |
1668 | - try: |
1669 | - with urllib.request.urlopen( |
1670 | - url + compression |
1671 | - ) as source_url_file: |
1672 | - with opener(source_url_file, mode='r') as sources: |
1673 | - print(url + compression) |
1674 | - for src in Sources.iter_paragraphs( |
1675 | - sources, |
1676 | - use_apt_pkg=False, |
1677 | - ): |
1678 | - pkgname = src['Package'] |
1679 | - if scriptutils.should_import_srcpkg( |
1680 | - pkgname, |
1681 | - component, |
1682 | - whitelist, |
1683 | - blacklist, |
1684 | - phasing_main, |
1685 | - phasing_universe, |
1686 | - ): |
1687 | - pkgnames.add(pkgname) |
1688 | - break |
1689 | - except urllib.error.HTTPError: |
1690 | - pass |
1691 | - else: |
1692 | - print( |
1693 | - "Unable to find any Sources file for component=%s, " |
1694 | - "series=%s, pocket=%s" % ( |
1695 | - component, |
1696 | - series, |
1697 | - pocket_suffix, |
1698 | - ) |
1699 | - ) |
1700 | - sys.exit(1) |
1701 | - |
1702 | - if not pkgnames: |
1703 | + if not pkgnames_to_import: |
1704 | print("No relevant publishes found") |
1705 | return [], [] |
1706 | |
1707 | return scriptutils.pool_map_import_srcpkg( |
1708 | num_workers=num_workers, |
1709 | dry_run=dry_run, |
1710 | - pkgnames=pkgnames, |
1711 | + pkgnames=pkgnames_to_import, |
1712 | ) |
1713 | |
1714 | def main( |
1715 | @@ -138,7 +65,7 @@ def main( |
1716 | dry_run=scriptutils.DEFAULTS.dry_run, |
1717 | use_whitelist=scriptutils.DEFAULTS.use_whitelist, |
1718 | ): |
1719 | - """main - Main entry point to the script |
1720 | + """Main entry point to the script |
1721 | |
1722 | Arguments: |
1723 | num_workers - integer number of worker threads to use |
1724 | @@ -184,12 +111,20 @@ def main( |
1725 | except (FileNotFoundError, IOError): |
1726 | blacklist = list() |
1727 | |
1728 | + try: |
1729 | + pkgnames_to_import = scriptutils.srcpkgs_to_import_list( |
1730 | + whitelist, |
1731 | + blacklist, |
1732 | + phasing_main, |
1733 | + phasing_universe, |
1734 | + ) |
1735 | + except RuntimeError as e: |
1736 | + print("%s" % e) |
1737 | + sys.exit(1) |
1738 | + |
1739 | imported_srcpkgs, failed_srcpkgs = import_all_published_sources( |
1740 | num_workers, |
1741 | - whitelist, |
1742 | - blacklist, |
1743 | - phasing_main, |
1744 | - phasing_universe, |
1745 | + pkgnames_to_import, |
1746 | dry_run, |
1747 | ) |
1748 | print( |
1749 | @@ -206,7 +141,7 @@ def main( |
1750 | ) |
1751 | |
1752 | def cli_main(): |
1753 | - """cli_main - CLI entry point to script |
1754 | + """CLI entry point to script |
1755 | """ |
1756 | parser = argparse.ArgumentParser( |
1757 | description='Script to import all source packages with phasing', |
1758 | diff --git a/scripts/update-repository-alias.py b/scripts/update-repository-alias.py |
1759 | index 5f2c8ae..a2d9fd6 100755 |
1760 | --- a/scripts/update-repository-alias.py |
1761 | +++ b/scripts/update-repository-alias.py |
1762 | @@ -27,7 +27,7 @@ sys.path.insert( |
1763 | from gitubuntu.source_information import launchpad_login_auth |
1764 | |
1765 | def update_git_repository(package, dry_run, unset): |
1766 | - """update_git_repository - set the default Git Repository on Launchpad for a source package |
1767 | + """Set the default Git Repository on Launchpad for a source package |
1768 | |
1769 | Arguments: |
1770 | package - string name of a source package |
1771 | @@ -93,7 +93,7 @@ def main( |
1772 | unset=False, |
1773 | use_whitelist=scriptutils.DEFAULTS.use_whitelist, |
1774 | ): |
1775 | - """main - Main entry point to the script |
1776 | + """Main entry point to the script |
1777 | |
1778 | Set the default Git Repository target on Launchpad for all source |
1779 | packages imported to usd-import-team to the usd-import-team |
1780 | @@ -162,22 +162,22 @@ def main( |
1781 | filtered_pkgnames = set() |
1782 | for pkgname in main_packages: |
1783 | if scriptutils.should_import_srcpkg( |
1784 | - pkgname, |
1785 | - 'main', |
1786 | - whitelist, |
1787 | - blacklist, |
1788 | - phasing_main, |
1789 | - phasing_universe, |
1790 | + pkgname=pkgname, |
1791 | + component='main', |
1792 | + whitelist=whitelist, |
1793 | + blacklist=blacklist, |
1794 | + phasing_main=phasing_main, |
1795 | + phasing_universe=phasing_universe, |
1796 | ): |
1797 | filtered_pkgnames.add(pkgname) |
1798 | for pkgname in universe_packages: |
1799 | if scriptutils.should_import_srcpkg( |
1800 | - pkgname, |
1801 | - 'universe', |
1802 | - whitelist, |
1803 | - blacklist, |
1804 | - phasing_main, |
1805 | - phasing_universe, |
1806 | + pkgname=pkgname, |
1807 | + component='universe', |
1808 | + whitelist=whitelist, |
1809 | + blacklist=blacklist, |
1810 | + phasing_main=phasing_main, |
1811 | + phasing_universe=phasing_universe, |
1812 | ): |
1813 | filtered_pkgnames.add(pkgname) |
1814 | |
1815 | @@ -196,7 +196,7 @@ def main( |
1816 | ) |
1817 | |
1818 | def cli_main(): |
1819 | - """cli_main - main entry point for CLI |
1820 | + """main entry point for CLI |
1821 | """ |
1822 | parser = argparse.ArgumentParser( |
1823 | description='Update the default Git repository for imported source packages', |
1824 | diff --git a/snap/snapcraft.yaml b/snap/snapcraft.yaml |
1825 | index 0fbad2b..f8e6e5d 100644 |
1826 | --- a/snap/snapcraft.yaml |
1827 | +++ b/snap/snapcraft.yaml |
1828 | @@ -38,9 +38,8 @@ parts: |
1829 | gbp: |
1830 | plugin: python |
1831 | python-version: python3 |
1832 | - source: https://github.com/nacc/git-buildpackage.git |
1833 | - source-type: git |
1834 | - source-branch: master-fix-tilde # want to use a tagged release once available |
1835 | + source: https://github.com/agx/git-buildpackage/archive/debian/0.9.6.tar.gz |
1836 | + source-type: tar |
1837 | requirements: requirements.txt |
1838 | stage: |
1839 | - -usr/bin/py* |
1840 | @@ -51,8 +50,7 @@ parts: |
1841 | after: [git, pristine-tar, python3] |
1842 | git: |
1843 | plugin: autotools |
1844 | - # want to use an upstream release of 2.14.2 or later, once available |
1845 | - source: https://github.com/nacc/git/archive/v2.14.1_nacc_fixes.tar.gz |
1846 | + source: https://github.com/git/git/archive/v2.14.3.tar.gz |
1847 | source-type: tar |
1848 | build-packages: |
1849 | - libz-dev |
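The compression-fallback pattern that the new `_get_sources` helper centralizes (try `.xz`, then `.bz2`, then `.gz`, and raise `RuntimeError` only when none of them exists) can be exercised offline with stdlib decompressors. This is a minimal sketch: `first_available` and the `payloads` dict are hypothetical stand-ins for `urllib.request.urlopen` against the archive, not code from the branch:

```python
import bz2
import gzip
import lzma

# Map of suffixes to stdlib modules, mirroring the diff's `compressions` table
COMPRESSIONS = {'.xz': lzma, '.bz2': bz2, '.gz': gzip}

def first_available(payloads, base='Sources'):
    """Return decompressed bytes from the first suffix that exists.

    `payloads` is a dict of filename -> compressed bytes, standing in
    for the remote archive directory listing.
    """
    for suffix, mod in COMPRESSIONS.items():
        data = payloads.get(base + suffix)
        if data is not None:
            return mod.decompress(data)
    # Mirrors the for/else in _get_sources: no compression matched
    raise RuntimeError('Unable to find any Sources file')

payloads = {'Sources.gz': gzip.compress(b'Package: jimtcl\n')}
print(first_available(payloads))  # b'Package: jimtcl\n'
```

In the diff itself the same fallthrough is expressed with a `for`/`else`: the `else` clause runs only when no `break` occurred, i.e. when every compressed variant returned an HTTP error.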
PASSED: Continuous integration, rev:5d4441a4f987fe7242513d1430601224717d2b46
https://jenkins.ubuntu.com/server/job/git-ubuntu-ci/239/
Executed test runs:
    SUCCESS: Checkout
    SUCCESS: Style Check
    SUCCESS: Unit Tests
    SUCCESS: Integration Tests
    IN_PROGRESS: Declarative: Post Actions
Click here to trigger a rebuild:
https://jenkins.ubuntu.com/server/job/git-ubuntu-ci/239/rebuild
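For reviewers unfamiliar with the phasing scheme that `should_import_srcpkg`'s docstring references (import a package only when a hash of its name falls under the phasing percentage, similar in spirit to update-manager's phased updates), here is a minimal sketch. `in_phasing` is a hypothetical illustration of the idea, not the actual function from `scriptutils`:

```python
import hashlib

def in_phasing(pkgname, phasing_percentage):
    # Derive a stable bucket in [0, 100) from the md5 of the package
    # name, then admit the package only if the bucket falls under the
    # phasing percentage. The same name always lands in the same bucket,
    # so raising the percentage only ever adds packages.
    digest = hashlib.md5(pkgname.encode('utf-8')).hexdigest()
    return int(digest, 16) % 100 < phasing_percentage

# 100% phasing admits every package; 0% admits none, matching the
# expectations encoded in test_should_import_srcpkg above.
print(in_phasing('jimtcl', 100), in_phasing('jimtcl', 0))  # True False
```

The determinism is the point: a fleet-wide importer can ramp from 1% to 100% across runs without ever dropping a package it previously imported.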