Merge ~alexmurray/ubuntu-cve-tracker:noboilerplates-3 into ubuntu-cve-tracker:master
Status: Merged
Merged at revision: 7120fd2d1eeb59d24bedb5dae0e1ab13d85b1b45
Proposed branch: ~alexmurray/ubuntu-cve-tracker:noboilerplates-3
Merge into: ubuntu-cve-tracker:master
Diff against target: 1534 lines (+505/-582), 19 files modified:
  README (+4/-5)
  README.mozilla (+3/-2)
  README.usn (+1/-1)
  README.webkit (+3/-3)
  scripts/active_edit (+123/-269)
  scripts/add-derived-kernel (+33/-11)
  scripts/add_meta_info.py (+4/-2)
  scripts/boilerplate-to-json.py (+151/-0)
  scripts/check-cves (+110/-220)
  scripts/check-syntax (+10/-17)
  scripts/check-syntax-fixup (+0/-5)
  scripts/cve-mode.el (+1/-3)
  scripts/cve.vim (+0/-1)
  scripts/cve_lib.py (+54/-24)
  scripts/dup-status-for-pkg (+1/-1)
  scripts/kernel-triage-missing-break-fix (+1/-1)
  scripts/release-cycle-devel-opens (+1/-1)
  scripts/release-cycle-released (+1/-1)
  scripts/sync-from-eol.py (+4/-15)
Related bugs:

Reviews:
  Alex Murray: Needs Fixing
  Seth Arnold: Approve
  Eduardo Barretto: Approve

Review via email:
Commit message
Description of the change
This series of commits is designed to allow us to replace the boilerplate files and the package_
To test this branch, you will need to generate the package-db.json file:
./scripts/
Then you should be able to do CVE triage, run active_edit, etc. as normal.
The main benefit this change gives is a better way to represent this data, *plus* it ensures that when, say, updating a CVE with a boilerplate target (e.g. active_edit -C CVE-NNNN-XXXX -p openjdk), the packages captured by that boilerplate (i.e. openjdk-10 etc.) all get added to the CVE file as expected *along with all the various subprojects*. This should mean we don't have to use check-syntax-fixup so much.
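Judging purely from how the new create_or_update_cve code in scripts/active_edit consumes pkg_db (notes as author/text pairs, plus a pkgs map of package name → release → status tuple), an entry in the generated meta_lists/package-db.json presumably has roughly this shape. The package names, releases, and statuses below are illustrative guesses, not content from the real file:

```json
{
  "openjdk": {
    "notes": [
      ["someuser", "example note text that gets copied into each new CVE"]
    ],
    "pkgs": {
      "openjdk-10": {
        "trusty": ["DNE", ""],
        "bionic": ["needs-triage", ""]
      }
    }
  }
}
```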
Steve Beattie (sbeattie) wrote:
Again, have not read in detail...
On Wed, Aug 03, 2022 at 07:05:15AM -0000, Alex Murray wrote:
> diff --git a/scripts/
> index d219586..f4c96c9 100755
> --- a/scripts/
> +++ b/scripts/
> @@ -105,6 +105,6 @@ echo "Updating status of released CVEs (ok if empty and prints help)..."
> xargs --no-run-if-empty ./scripts/
>
> printf "\n\nPlease update cve_lib.py, kernel_lib.py, sis-generate-usn, cve-alert.sh, and prepare-
> -echo "Please add package description information if necessary to meta_lists/
> +echo "Please add package description information if necessary to meta_lists/
> echo "If this kernel has not been published yet, please add it to the unpublished_kernels list in check-syntax."
> echo "Also, before publishing a USN against the new kernel, ensure packages_mirror has been run"
Please note that one of the things the `add-derived-
for the new kernel being added for tracking in UCT is add it to the
boilerplate (I subsequently adjust the addition's location in the
boilerplate to try to keep them relatively organized). Is there a way
add-derived-kernel could be updated to do this under the new structure?
(Yes, there is post-hoc manual editing to be done after
add-derived-kernel does its thing, but my intent is to reduce that as
much as possible, to avoid errors.)
--
Steve Beattie
<email address hidden>
Alex Murray (alexmurray) wrote:
Thanks for the comments @sbeattie - see https:/
Alex Murray (alexmurray) wrote:
So this merge request has had no real review for almost a whole month... I would really like to get it merged, so I will likely just merge it next week unless anyone wants to give it a look over... TIA 🙏
Eduardo Barretto (ebarretto) wrote:
Small fix in indentation, but the code looks good to me.
I believe you will send another PR later, when things are working in this new structure, to do the cleanup in cve_lib (load_boilerpla
Alex Murray (alexmurray) wrote:
Thanks @ebarretto - I have fixed the indentation in https:/
Yes, once this is merged in, I plan to actually remove the boilerplate files *and* generate the new package-db.json and commit this to UCT (plus remove the now obsolete package_
Eduardo Barretto (ebarretto) wrote:
lgtm, thanks!
Seth Arnold (seth-arnold) wrote:
Sorry for not getting to this sooner :( I like quite a lot of it. I'd love to be rid of the boilerplates.
I've got a few comments inline.
Thanks
Alex Murray (alexmurray) wrote:
Thanks Seth - fixed the typo and moved the parse_embedded_
Regarding wordwrap() etc - that is simply stolen from check-cves and moved into cve_lib now so that we can reuse it - so it will have the same functionality as it did before - I have not modified it.
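(For reference, the reused helper amounts to plain word wrapping. A minimal stand-in, matching only how the preview diff calls it — cve_lib.wrap_text(text) and wrap_text(text, width) returning a newline-joined string — might look like the following; the real implementation in cve_lib/check-cves may differ:)

```python
import textwrap

def wrap_text(text, width=75):
    # Wrap to the given width and return a single newline-joined string,
    # since the callers in active_edit split the result on '\n' and
    # indent each line themselves before writing it to the CVE file.
    return textwrap.fill(text, width=width)
```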
Alex Murray (alexmurray) wrote:
I just noticed that I missed a few other scripts that still refer to the boilerplate files so they will need to be fixed up before this gets merged:
scripts/
scripts/
scripts/
scripts/
Basically `git grep boilerplate` should return nothing when this MR is actually ready.
Seth Arnold (seth-arnold) wrote:
scripts/
active/
scripts/
scripts/
active/
to:
active/CVE-*
scripts/
Getting rid of the boilerplates looks like it'll simplify a lot :)
Preview Diff
diff --git a/README b/README
index 433dde7..8817769 100644
--- a/README
+++ b/README
@@ -597,7 +597,7 @@ Stable Release Actions
 ----------------------
 When a stable release is published, the active CVEs need to be adjusted to
 reflect the new stable release. e.g. when trusty was published:
-perl -pi -e 's/^((#?)devel_(.*))/$2trusty_$3\n$1/g' active/{CVE-,00boilerplate}*
+perl -pi -e 's/^((#?)devel_(.*))/$2trusty_$3\n$1/g' active/CVE-*
 The script tools will need to be adjusted as well. There is usually some
 lag time between the new devel archive opening and the stable release
 getting published. This means that "devel" will disappear from ubuntu-table
@@ -611,7 +611,7 @@ briefly:
 
 scripts/cve_lib.py should have an empty 'devel_release'.
 
-Move all active CVEs and boilerplates from "devel" to release state:
+Move all active CVEs from "devel" to release state:
 ./scripts/release-cycle-released $RELEASE
 
 
@@ -619,7 +619,7 @@ Development Release Actions
 ---------------------------
 Fill in releases and devel_release in ubuntu-cve-tools/scripts/cve_lib.py
 
-Move all active CVEs and boilerplates from latest release to devel state:
+Move all active CVEs from latest release to devel state:
 ./scripts/release-cycle-devel-opens $LATEST_STABLE_RELEASE
 
 Add release to non-ports and ports section of
@@ -635,7 +635,6 @@ Here is how:
 
 2. update the CVEs:
 $ sed -i 's/^<release>_\(.*\): \(needed\|needs\-triage\)/<release>_\1: \2 (reached end-of-life)/g' ./active/CVE-*
-$ sed -i '/^<release>_\(.*\): /d' ./active/00boilerplate*
 
 3. retire the CVEs (see 'Retiring items', above)
 
@@ -678,7 +677,7 @@ New Backport Kernel
 When a new backport kernel is added, update scripts/cve_lib.py's
 kernel_srcs and description_overrides.
 
-Then update the 00boilerplate.linux with its entry, add that entry to each
+Then update the meta_lists/package-db.json with its entry, add that entry to each
 CVE with an entry for the LTS the kernel was added to, and update all the
 statuses for the newly added kernel, based off the version it will be
 branched from. For example, to add a new kernel backported from Vivid to
diff --git a/README.mozilla b/README.mozilla
index 5e06594..7c0f1f2 100644
--- a/README.mozilla
+++ b/README.mozilla
@@ -45,7 +45,7 @@ CVE Triage
 ----------
 CVEs in Firefox are tracked in the xulrunner source packages for builds that
 use the system xulrunner, and firefox source packages for those that use a
-static build. active/00boilerplate.firefox is used to capture the source
+static build. meta_lists/package-db.json is used to capture the source
 package relationships when triaging CVEs for firefox (ie, you only need to
 specify 'firefox' as the source package).
 
@@ -65,6 +65,7 @@ xulrunner-1.9.2: system xul for reverese dependencies that process web content.
 firefox: Ubuntu 8.04 LTS and higher (static build of 3.6.x or higher)
 
 Additionally, the following share a common codebase and are affected by the
-same CVE often enough that they warrant being part of the firefox boilerplate:
+same CVE often enough that they warrant being part of the firefox pkgs set in
+meta_lists/package-db.json:
 seamonkey
 thunkerbird
diff --git a/README.usn b/README.usn
index ac8156f..e5c3f58 100644
--- a/README.usn
+++ b/README.usn
@@ -30,7 +30,7 @@ versions.
 Make sure the source package description was properly set for every affected
 release (i.e. the value for the usn.py --source-description argument in the
 generated usn shell script is set). If it was not automatically populated, make
-sure the source package is present in $UCT/meta_list/package_info_overrides.json
+sure the source package is present in $UCT/meta_list/package-db.json
 and add it if it is not.
 
 ** CVEs references
diff --git a/README.webkit b/README.webkit
index 187d619..a52bf88 100644
--- a/README.webkit
+++ b/README.webkit
@@ -75,7 +75,7 @@ chromium-browser
 
 This package contains a fork of the WebKit code base (aka 'Blink'). It is
 maintained separately by Google and is tracked in the 'chromium-browser'
-boilerplate.
+entry in meta_lists/package-db.json.
 
 oxide
 -----
@@ -83,11 +83,11 @@ oxide
 Oxide is bindings for the chromium content api and therefore contains a fork
 of the webkit code base (aka, 'Blink'). The chromium content api is maintained
 separately by Google and Oxide is maintained by Canonical. Oxide is tracked in
-the 'chromium-browser' boilerplate.
+the 'chromium-browser' entry in meta_lists/package-db.json.
 
 CVE Triage
 ----------
-active/00boilerplate.webkit is used to capture the source package relationships
+meta_lists/package-db.json is used to capture the source package relationships
 when triaging CVEs for webkit (ie, you only need to specify 'webkit' as the
 source package).
 
111 | diff --git a/scripts/active_edit b/scripts/active_edit | |||
112 | index a85f279..482446c 100755 | |||
113 | --- a/scripts/active_edit | |||
114 | +++ b/scripts/active_edit | |||
115 | @@ -10,9 +10,9 @@ | |||
116 | 10 | 10 | ||
117 | 11 | import optparse | 11 | import optparse |
118 | 12 | import os | 12 | import os |
119 | 13 | import pathlib | ||
120 | 13 | import re | 14 | import re |
121 | 14 | import sys | 15 | import sys |
122 | 15 | import time | ||
123 | 16 | 16 | ||
124 | 17 | import cve_lib | 17 | import cve_lib |
125 | 18 | import source_map | 18 | import source_map |
126 | @@ -21,7 +21,6 @@ releases = ['upstream'] + cve_lib.all_releases | |||
127 | 21 | 21 | ||
128 | 22 | max_file_size = 10 * 1024 * 1024 # 10MB | 22 | max_file_size = 10 * 1024 * 1024 # 10MB |
129 | 23 | cvedir = cve_lib.active_dir | 23 | cvedir = cve_lib.active_dir |
130 | 24 | boilerplates = cvedir | ||
131 | 25 | 24 | ||
132 | 26 | parser = optparse.OptionParser() | 25 | parser = optparse.OptionParser() |
133 | 27 | parser.add_option("-p", "--package", dest="pkgs", help="Package name and optional version where package is fixed (with optional Ubuntu release and version in that release)", metavar="NAME[,VERSION[,RELEASE,RELEASE_VERSION]]", action="append") | 26 | parser.add_option("-p", "--package", dest="pkgs", help="Package name and optional version where package is fixed (with optional Ubuntu release and version in that release)", metavar="NAME[,VERSION[,RELEASE,RELEASE_VERSION]]", action="append") |
134 | @@ -30,7 +29,8 @@ parser.add_option("-r", "--reference-url", dest="ref_urls", help="URL references | |||
135 | 30 | parser.add_option("-c", "--cve", dest="cve", help="CVE entry", metavar="CVE-YYYY-NNNN") | 29 | parser.add_option("-c", "--cve", dest="cve", help="CVE entry", metavar="CVE-YYYY-NNNN") |
136 | 31 | parser.add_option("-e", "--embargoed", dest="embargoed", help="This is an embargoed entry", action="store_true") | 30 | parser.add_option("-e", "--embargoed", dest="embargoed", help="This is an embargoed entry", action="store_true") |
137 | 32 | parser.add_option("-y", "--yes", dest="autoconfirm", help="Do not ask for confirmation", action="store_true") | 31 | parser.add_option("-y", "--yes", dest="autoconfirm", help="Do not ask for confirmation", action="store_true") |
139 | 33 | parser.add_option("-P", "--public", help="Record date the CVE went public", metavar="YYYY-MM-DD") | 32 | parser.add_option("-P", "--public", dest="public_date", help="Record date the CVE went public", metavar="YYYY-MM-DD") |
140 | 33 | parser.add_option("--priority", help="Record a priority for the CVE", default=None) | ||
141 | 34 | parser.add_option("-C", "--cvss", help="CVSS3.1 rating", metavar="CVSS:3.1/AV:_/AC:_/PR:_/UI:_/S:_/C:_/I:_/A:_") | 34 | parser.add_option("-C", "--cvss", help="CVSS3.1 rating", metavar="CVSS:3.1/AV:_/AC:_/PR:_/UI:_/S:_/C:_/I:_/A:_") |
142 | 35 | (options, args) = parser.parse_args() | 35 | (options, args) = parser.parse_args() |
143 | 36 | 36 | ||
144 | @@ -70,6 +70,7 @@ def create_or_update_external_subproject_cves(cve, pkgname): | |||
145 | 70 | continue | 70 | continue |
146 | 71 | affected_releases.append(release) | 71 | affected_releases.append(release) |
147 | 72 | 72 | ||
148 | 73 | ans = "y" | ||
149 | 73 | if not options.autoconfirm: | 74 | if not options.autoconfirm: |
150 | 74 | print("\n") | 75 | print("\n") |
151 | 75 | for release in affected_releases: | 76 | for release in affected_releases: |
152 | @@ -101,245 +102,130 @@ def release_wants_dne(release): | |||
153 | 101 | _, product, _, _ = cve_lib.get_subproject_details(release) | 102 | _, product, _, _ = cve_lib.get_subproject_details(release) |
154 | 102 | return product != None and product == cve_lib.PRODUCT_UBUNTU | 103 | return product != None and product == cve_lib.PRODUCT_UBUNTU |
155 | 103 | 104 | ||
156 | 104 | def update_cve(cve, pkgname, fixed_in=None, fixed_in_release=None, fixed_in_release_version=None): | ||
157 | 105 | '''Update an existing CVE file''' | ||
158 | 106 | with open(os.path.join(cvedir, cve), "r") as f: | ||
159 | 107 | lines = f.read(max_file_size).split('\n') | ||
160 | 108 | |||
161 | 109 | skipped = [] | ||
162 | 110 | added_lines = "" | ||
163 | 111 | |||
164 | 112 | tmp_releases = get_releases(pkgname) | ||
165 | 113 | |||
166 | 114 | # If we are using 00boilerplate.<pkgname> and the package is DNE on all | ||
167 | 115 | # current releases, then don't add the the stanza for this release. This | ||
168 | 116 | # allows us to use generic boilerplate names like 00boilerplate.gnutls or | ||
169 | 117 | # 00boilerplate.openjdk without adding useless extra stanzas. | ||
170 | 118 | # TODO: this still doesn't handle adding the contents of the boilerplate | ||
171 | 119 | # to an existing CVE (ie, ./scripts/sctive_edit -p openjdk -c CVE-YYYY-NNNN | ||
172 | 120 | # where CVE-YYYY-NNNN already exists) | ||
173 | 121 | if os.path.exists(os.path.join(boilerplates, '00boilerplate.%s' % pkgname)): | ||
174 | 122 | pkg_exists_somewhere = False | ||
175 | 123 | for r in tmp_releases: | ||
176 | 124 | if r == 'upstream' or (r in cve_lib.eol_releases \ | ||
177 | 125 | and not cve_lib.is_active_esm_release(r)): | ||
178 | 126 | continue | ||
179 | 127 | if pkg_in_rel(pkgname,r): | ||
180 | 128 | pkg_exists_somewhere = True | ||
181 | 129 | break | ||
182 | 130 | if not pkg_exists_somewhere: | ||
183 | 131 | print("skipping '" + pkgname + "' (DNE on all current releases)\n", file=sys.stderr) | ||
184 | 132 | return | ||
185 | 133 | |||
186 | 134 | for line in lines: | ||
187 | 135 | for r in tmp_releases: | ||
188 | 136 | if r == cve_lib.devel_release or r == '': | ||
189 | 137 | r = 'devel' | ||
190 | 138 | if not re.match(r'^' + r + ".*:", line): | ||
191 | 139 | continue | ||
192 | 140 | tmp = line.split(':') | ||
193 | 141 | match = "%s_%s" % (r, pkgname) | ||
194 | 142 | if match == tmp[0]: | ||
195 | 143 | skipped.append(r) | ||
196 | 144 | |||
197 | 145 | if len(skipped) == 0: | ||
198 | 146 | added_lines += '\nPatches_' + pkgname + ':\n' | ||
199 | 147 | |||
200 | 148 | higher_not_affected = False | ||
201 | 149 | for release in tmp_releases: | ||
202 | 150 | # don't add any external releases | ||
203 | 151 | if release in cve_lib.external_releases: | ||
204 | 152 | continue | ||
205 | 153 | |||
206 | 154 | r = release | ||
207 | 155 | if r == cve_lib.devel_release or r == '': | ||
208 | 156 | r = 'devel' | ||
209 | 157 | |||
210 | 158 | if r in skipped: | ||
211 | 159 | print("skipping '" + pkgname + "' for " + r + " (already included)\n", file=sys.stderr) | ||
212 | 160 | else: | ||
213 | 161 | # skip eol_releases releases without esm support | ||
214 | 162 | if r in cve_lib.eol_releases \ | ||
215 | 163 | and (not cve_lib.is_active_esm_release(r) or r == 'precise'): | ||
216 | 164 | continue | ||
217 | 165 | state = "needs-triage" | ||
218 | 166 | if not pkg_in_rel(pkgname, release): | ||
219 | 167 | # package doesn't exist in this release - see if it wants a | ||
220 | 168 | # DNE entry | ||
221 | 169 | if release_wants_dne(release): | ||
222 | 170 | state = "DNE" | ||
223 | 171 | else: | ||
224 | 172 | continue | ||
225 | 173 | elif cve_lib.is_active_esm_release(r): | ||
226 | 174 | state = "ignored (out of standard support)" | ||
227 | 175 | elif r == 'upstream' and fixed_in is not None: | ||
228 | 176 | state = "released (%s)" % fixed_in | ||
229 | 177 | elif fixed_in_release_version and r == fixed_in_release: | ||
230 | 178 | state = "not-affected (%s)" % fixed_in_release_version | ||
231 | 179 | higher_not_affected = True | ||
232 | 180 | elif higher_not_affected: | ||
233 | 181 | state = "not-affected" | ||
234 | 182 | added_lines += '%s_%s: %s\n' % (r, pkgname, state) | ||
235 | 183 | |||
236 | 184 | if len(releases) == len(skipped): | ||
237 | 185 | print("\nNothing to add!\n") | ||
238 | 186 | return | ||
239 | 187 | |||
240 | 188 | if not options.autoconfirm: | ||
241 | 189 | print("\n" + added_lines) | ||
242 | 190 | print("\nAppend the above to " + os.path.join(cvedir, cve) + " (y|N)? ") | ||
243 | 191 | ans = sys.stdin.readline().lower() | ||
244 | 192 | print("\n") | ||
245 | 193 | else: | ||
246 | 194 | ans = "y" | ||
247 | 195 | 105 | ||
252 | 196 | if ans.startswith("y"): | 106 | def create_or_update_cve(cve, packages, priority=None, bug_urls=None, ref_urls=None, public_date=None, desc=None, cvss=None): |
253 | 197 | file = open(os.path.join(cvedir, cve), "a") | 107 | |
254 | 198 | file.write(added_lines) | 108 | pkgs = [] |
255 | 199 | file.close() | 109 | fixed = {} |
256 | 110 | # parse optional fixed_in release and version from package name | ||
257 | 111 | for p in packages: | ||
258 | 112 | tmp_p = p.split(',') | ||
259 | 113 | pkg = tmp_p[0] | ||
260 | 114 | pkgs.append(pkg) | ||
261 | 115 | fixed[pkg] = tmp_p[1:] | ||
262 | 116 | |||
263 | 117 | update = False | ||
264 | 118 | try: | ||
265 | 119 | dst = cve_lib.find_cve(cve) | ||
266 | 120 | update = True | ||
267 | 121 | except ValueError: | ||
268 | 122 | dst = os.path.join(pathlib.Path().parent.resolve(), "active", cve) | ||
269 | 123 | |||
270 | 124 | # collect notes from pkg_db and add any extra pkgs from pkg_db as well | ||
271 | 125 | notes = [] | ||
272 | 126 | for p in pkgs: | ||
273 | 127 | if p in pkg_db: | ||
274 | 128 | notes = notes + pkg_db[p]["notes"] | ||
275 | 129 | pkgs = pkgs + list(pkg_db[p]["pkgs"].keys()) | ||
276 | 130 | |||
277 | 131 | # normalise the list of packages | ||
278 | 132 | pkgs = sorted(list(set(pkgs))) | ||
279 | 133 | |||
280 | 134 | # remove any packages which don't actually exist in any release | ||
281 | 135 | for p in pkgs: | ||
282 | 136 | keep = False | ||
283 | 137 | for r in source.keys(): | ||
284 | 138 | keep |= p in source[r] | ||
285 | 139 | if not keep: | ||
286 | 140 | pkgs.remove(p) | ||
287 | 141 | |||
288 | 142 | if update: | ||
289 | 143 | mode = "a" | ||
290 | 200 | else: | 144 | else: |
344 | 201 | print("Aborted\n") | 145 | mode = "w" |
345 | 202 | 146 | with open(dst, mode, encoding="utf-8") as fp: | |
346 | 203 | def create_cve(cve, pkgname, fixed_in=None, fixed_in_release=None, fixed_in_release_version=None): | 147 | if not update: |
347 | 204 | '''Create a new CVE file''' | 148 | print('Candidate: %s' % (cve), file=fp) |
348 | 205 | src = os.path.join(boilerplates, '00boilerplate') | 149 | print('PublicDate: %s' % (public_date if public_date else "unknown"), file=fp) |
349 | 206 | if os.path.exists(src + "." + pkgname): | 150 | print('References:\n https://cve.mitre.org/cgi-bin/cvename.cgi?name=%s' % (cve), file=fp) |
350 | 207 | src = src + "." + pkgname | 151 | for url in (ref_urls if ref_urls else []): |
351 | 208 | boiler = open(src, "r") | 152 | print(" %s" % url, file=fp) |
352 | 209 | lines = boiler.read(max_file_size).splitlines() | 153 | print('Description:', file=fp) |
353 | 210 | boiler.close() | 154 | for desc_line in (cve_lib.wrap_text(desc).split('\n') if desc else []): |
354 | 211 | 155 | print(" %s" % (desc_line), file=fp) | |
355 | 212 | cand_pat = re.compile(r'^Candidate:') | 156 | print('Ubuntu-Description:', file=fp) |
356 | 213 | ref_pat = re.compile(r'^References:') | 157 | print('Notes:', file=fp) |
357 | 214 | bugs_pat = re.compile(r'^Bugs:') | 158 | for note in notes: |
358 | 215 | cvss_pat = re.compile(r'^CVSS:') | 159 | for note_line in cve_lib.wrap_text(note[1], 75 - len(note[0]) - 2).split('\n'): |
359 | 216 | pkg_pat = re.compile(r'^#?[a-z/\-]+_(PKG|%s):' % pkgname) | 160 | print(" %s> %s" % (note[0], note_line), file=fp) |
360 | 217 | patch_pat = re.compile(r'^#?Patches_(PKG|%s):' % pkgname) | 161 | print('Mitigation:', file=fp) |
361 | 218 | 162 | print('Bugs:', file=fp) | |
362 | 219 | tmp_releases = get_releases(pkgname) | 163 | for url in (bug_urls if bug_urls else []): |
363 | 220 | added = set() | 164 | print(" %s" % url, file=fp) |
364 | 221 | 165 | print('Priority: %s' % (priority if priority else "untriaged"), file=fp) | |
365 | 222 | contents = "" | 166 | print('Discovered-by:', file=fp) |
366 | 223 | higher_not_affected = False | 167 | print('Assigned-to:', file=fp) |
367 | 224 | for line in lines: | 168 | print('CVSS:', file=fp) |
368 | 225 | if (cand_pat.search(line)): | 169 | for entry in (cvss if cvss else []): |
369 | 226 | contents += line + os.path.basename(cve) + '\n' | 170 | src, cvss = entry |
370 | 227 | elif line.startswith('PublicDate:'): | 171 | print(' %s: %s' % (src, cvss), file=fp) |
371 | 228 | if options.embargoed: | 172 | |
372 | 229 | if options.public: | 173 | # add package info from pkg_db |
373 | 230 | # use public date as CRD | 174 | for p in pkgs: |
374 | 231 | contents += "CRD: %s\n" % options.public | 175 | print('', file=fp) |
375 | 232 | else: | 176 | print('Patches_%s:' % p, file=fp) |
376 | 233 | contents += "CRD: <TBD>\n" | 177 | # find which releases p exists in |
377 | 234 | if options.public: | 178 | higher_not_affected = False |
378 | 235 | contents += "PublicDate: %s\n" % options.public | 179 | fixed_in = None |
379 | 236 | else: | 180 | fixed_in_release = None |
380 | 237 | # default to today-- this will be refreshed by check-cves | 181 | fixed_in_release_version = None |
381 | 238 | contents += "PublicDate: %s\n" % time.strftime("%Y-%m-%d", time.gmtime()) | 182 | if p in fixed and len(fixed[p]) > 0: |
382 | 239 | elif (cvss_pat.search(line)): | 183 | fixed_in = fixed[p][0] |
383 | 240 | if options.cvss: | 184 | if len(fixed[p]) > 1: |
384 | 241 | contents += 'CVSS: ' + options.cvss + '\n' | 185 | fixed_in_release = fixed[p][1] |
385 | 242 | else: | 186 | if len(fixed[p]) > 2: |
386 | 243 | contents += 'CVSS:\n' | 187 | fixed_in_release_version = fixed[p][2] |
387 | 244 | elif (patch_pat.search(line)): | 188 | for rel in ['upstream'] + list(source.keys()): |
388 | 245 | contents += '\nPatches_' + pkgname + ':\n' | 189 | # determine default state but override this if pkg_db has a |
389 | 246 | elif (pkg_pat.search(line)): | 190 | # better one |
390 | 247 | for release in tmp_releases: | 191 | state = "needs-triage" |
391 | 248 | if release == cve_lib.devel_release or release == '': | 192 | if not pkg_in_rel(p, rel): |
392 | 249 | release = 'devel' | 193 | # package doesn't exist in this release - see if it wants a |
393 | 250 | 194 | # DNE entry | |
394 | 251 | # skip eol_releases releases without esm support | 195 | if release_wants_dne(rel): |
395 | 252 | if release in cve_lib.eol_releases \ | 196 | state = "DNE" |
396 | 253 | and not cve_lib.is_active_esm_release(release): | 197 | else: |
397 | 198 | continue | ||
398 | 199 | if rel == cve_lib.devel_release: | ||
399 | 200 | # devel is present in source.keys() so use that instead | ||
400 | 201 | # of the codename | ||
401 | 254 | continue | 202 | continue |
402 | 203 | if cve_lib.is_active_esm_release(rel): | ||
403 | 204 | state = "ignored (out of standard support)" | ||
404 | 205 | elif rel == 'upstream' and fixed_in is not None: | ||
405 | 206 | state = "released (%s)" % fixed_in | ||
406 | 207 | elif fixed_in_release_version and rel == fixed_in_release: | ||
407 | 208 | state = "not-affected (%s)" % fixed_in_release_version | ||
408 | 209 | higher_not_affected = True | ||
409 | 210 | elif higher_not_affected: | ||
410 | 211 | state = "not-affected" | ||
411 | 212 | |||
412 | 213 | # use pkg_db state if one exists | ||
+        if p in pkg_db and p in pkg_db[p]["pkgs"] and rel in pkg_db[p]["pkgs"][p]:
+            state_tuple = pkg_db[p]["pkgs"][p][rel]
+            state = state_tuple[0]
+            if len(state_tuple[1]) > 0:
+                state = state + " (%s)" % state_tuple[1]
+            if rel not in cve_lib.external_releases:
+                print('%s_%s: %s' % (rel, p, state), file=fp)
+            else:
+                # add this to subprojects for rel
+                with open(os.path.join(cve_lib.get_external_subproject_cve_dir(rel), cve), "a") as f:
+                    print('%s_%s: %s' % (rel, p, state), file=f)
 
-        rel_pat = re.compile('#?' + release + '_')
-
-        if (rel_pat.search(line)):
-            state = "needs-triage"
-            if not pkg_in_rel(pkgname, release):
-                # package doesn't exist in this release - see if it wants a
-                # DNE entry
-                if release_wants_dne(release):
-                    state = "DNE"
-                else:
-                    continue
-            elif cve_lib.is_active_esm_release(release):
-                state = "ignored (out of standard support)"
-            elif release == 'upstream' and fixed_in is not None:
-                state = "released (%s)" % fixed_in
-            elif fixed_in_release_version and release == fixed_in_release:
-                state = "not-affected (%s)" % fixed_in_release_version
-                higher_not_affected = True
-            elif higher_not_affected:
-                state = "not-affected"
-            if release not in added:
-                contents += "%s_%s: %s\n" % (release, pkgname, state)
-                added.add(release)
-        elif ref_pat.search(line):
-            if not re.search(r'N', cve):
-                contents += line + "\n https://cve.mitre.org/cgi-bin/cvename.cgi?name=" + cve + "\n"
-            else:
-                contents += line + "\n"
-            if options.ref_urls:
-                for i in options.ref_urls:
-                    contents += " %s\n" % i
-        elif options.bug_urls and bugs_pat.search(line):
-            contents += line
-            for i in options.bug_urls:
-                contents += "\n %s\n" % i
-        else:
-            contents += line + '\n'
-
-    # check for missing entries which aren't in the boilerplate
-    for release in tmp_releases:
-        if release == cve_lib.devel_release or release == '':
-            release = 'devel'
-        if release in added:
-            continue
-        if release in cve_lib.external_releases:
-            continue
-        # skip eol_releases releases without esm support
-        if release in cve_lib.eol_releases \
-           and not cve_lib.is_active_esm_release(release):
-            continue
-        if pkg_in_rel(pkgname, release):
-            state = "needs-triage"
-            contents += "%s_%s: %s\n" % (release, pkgname, state)
-
-    # for each line in contents, see if we need to supercede DNE status
-    # with needs-triage as boilerplate entries may not be up-to-date with
-    # xxx-supported.txt lists
-    pkgstatus_pat = re.compile(r'^([/a-z/\-]+)_(.*): (.*)')
-    old_contents = contents
-    contents = ""
-    for line in old_contents.splitlines():
-        match = pkgstatus_pat.search(line)
-        if match is not None:
-            release = match.group(1)
-            pkgname = match.group(2)
-            status = match.group(3)
-            if status == 'DNE' and pkg_in_rel(pkgname, release):
-                status = 'needs-triage'
-                line = '%s_%s: %s' % (release, pkgname, status)
-        contents += line + '\n'
-
-    if not options.autoconfirm:
-        print(contents)
-        print("\nWrite the above to " + os.path.join(cvedir, cve) + " (y|N)? ")
-        ans = sys.stdin.readline().lower()
-        print("\n")
-    else:
-        ans = "y"
-
-    if ans.startswith("y"):
-        newfile = open(os.path.join(cvedir, cve), 'w')
-        newfile.write(contents)
-        newfile.close()
-    else:
-        print("Aborted\n")
 
 
+pkg_db = cve_lib.load_package_db()
 
 if not options.pkgs:
     parser.print_help()
@@ -369,37 +255,5 @@ if not pat.search(cve):
 print("Bad CVE entry. Should be CVE-XXXX-XXXX\n", file=sys.stderr)
 sys.exit(1)
 
-# more here
-pat = re.compile(r'\s')
-for p in pkgs:
-    tmp_p = p.split(',')
-    pkgname = tmp_p[0]
-    fixed_in = None
-    if len(tmp_p) > 1:
-        fixed_in = tmp_p[1]
-    fixed_in_release = None
-    fixed_in_release_version = None
-    if len(tmp_p) > 3:
-        fixed_in_release = tmp_p[2]
-        fixed_in_release_version = tmp_p[3]
-
-    if pat.search(pkgname):
-        print("Bad package name\n", file=sys.stderr)
-        sys.exit(1)
-
-    if not os.path.isfile(os.path.join(boilerplates, "00boilerplate")):
-        print("Could not find 00boilerplate in " + cvedir + "\n", file=sys.stderr)
-        sys.exit(1)
-
-    if (os.path.isfile(os.path.join(cvedir, cve))):
-        if not options.autoconfirm:
-            print("Found existing " + cve + "...\n\n")
-        update_cve(cve, pkgname, fixed_in, fixed_in_release, fixed_in_release_version)
-    else:
-        if not options.autoconfirm:
-            print("Creating new " + cve + "...\n\n")
-        create_cve(cve, pkgname, fixed_in, fixed_in_release, fixed_in_release_version)
-
-    create_or_update_external_subproject_cves(cve, pkgname)
-
+create_or_update_cve(cve, pkgs, priority=options.priority, bug_urls=options.bug_urls, ref_urls=options.ref_urls, public_date=options.public_date, cvss=options.cvss)
 sys.exit(0)
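For context, the state tuples that the reworked active_edit reads from package-db.json are `[state, detail]` pairs keyed by release. A minimal sketch of how such an entry renders into a CVE file line (the package names, releases, and versions below are hypothetical; the real schema is produced by boilerplate-to-json.py):

```python
# Hypothetical package-db.json fragment: a per-release entry is a
# [state, detail] pair, and a non-empty detail is appended in parentheses,
# mirroring the state_tuple handling in the new active_edit code.
pkg_db = {
    "openjdk": {
        "aliases": [],
        "tags": [],
        "notes": [],
        "pkgs": {
            "openjdk-8": {
                "bionic": ["released", "8u292-b10-0ubuntu1"],
                "focal": ["needs-triage", ""],
            },
        },
    },
}

def render(pkg, subpkg, rel):
    state, detail = pkg_db[pkg]["pkgs"][subpkg][rel]
    if len(detail) > 0:
        state = state + " (%s)" % detail
    return "%s_%s: %s" % (rel, subpkg, state)

print(render("openjdk", "openjdk-8", "bionic"))
print(render("openjdk", "openjdk-8", "focal"))
```

This is also why updating a CVE for a boilerplate target pulls in every sub-package captured under `pkgs`, rather than only the name given on the command line.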
diff --git a/scripts/add-derived-kernel b/scripts/add-derived-kernel
index d219586..3c146ab 100755
--- a/scripts/add-derived-kernel
+++ b/scripts/add-derived-kernel
@@ -67,19 +67,37 @@ if [ -z "${DERIVED_KERNEL}" ] ; then
     DERIVED_KERNEL="linux"
 fi
 
-chunk=$(mktemp -t add-derived-kernel-XXXXXX)
+releases=$(PYTHONPATH=./scripts python3 -c "import cve_lib; print(' '.join([a for a in cve_lib.all_releases if a not in cve_lib.external_releases and a is not cve_lib.devel_release and (a not in cve_lib.eol_releases or cve_lib.is_active_esm_release(a))]))")
+
+releases="upstream ${releases} devel"
 
+chunk=$(mktemp -t add-derived-kernel-XXXXXX)
 echo "" > "$chunk"
-grep "^.*_linux:" active/00boilerplate.linux | sed -e "s#_linux:.*#_linux-${KERNEL}: DNE#" >> "$chunk"
-sed -i -e "s#^${RELEASE}_linux-${KERNEL}:.*#${RELEASE}_linux-${KERNEL}: needs-triage#" "$chunk"
-sed -i -e "s#^Patches_linux-${KERNEL}:.*#Patches_linux-${KERNEL}:#" "$chunk"
-sed -i -e "s#^upstream_linux-${KERNEL}:.*#upstream_linux-${KERNEL}: needs-triage#" "$chunk"
-
-if ! grep -q "^${RELEASE}_linux-${KERNEL}:" active/00boilerplate.linux ; then
-    echo "Updating 00boilerplate.linux..."
-    cat "$chunk" >> active/00boilerplate.linux
+echo "Patches_linux-${KERNEL}:" >> "$chunk"
+
+json="{"
+
+# add other releases
+for rel in $releases; do
+    status="DNE"
+    if [ "${rel}" = "${RELEASE}" ] || [ "${rel}" = "upstream" ] ; then
+        status="needs-triage"
+    fi
+    json="$json\"${rel}_${DERIVED_KERNEL}\": [\"$status\", \"\"],"
+    echo "${rel}_linux-${KERNEL}: ${status}" >> "$chunk"
+done
+# remove trailing comma
+json=${json%?}
+# close json objects
+json="$json}"
+
+# append the derived kernel to meta_lists/package-db.json if it doesn't exist already
+if jq -e ".linux.pkgs.\"linux-$KERNEL\"" meta_lists/package-db.json; then
+    echo "kernel linux-${KERNEL} already exists in meta_lists/package-db.json, skipping update."
 else
-    echo "00boilerplate.linux already contains linux-${KERNEL}, skipping update."
+    echo "adding kernel linux-${KERNEL} to meta_lists/package-db.json"
+    jq -e ".linux.pkgs.\"linux-$KERNEL\" += $json" meta_lists/package-db.json > meta_lists/package-db.json.new
+    mv meta_lists/package-db.json.new meta_lists/package-db.json
 fi
 
 echo "Adding new backport to existing CVEs..."
@@ -105,6 +123,10 @@ echo "Updating status of released CVEs (ok if empty and prints help)..."
 xargs --no-run-if-empty ./scripts/mass-cve-edit -p "linux-${KERNEL}" -r "${RELEASE}" -s not-affected
 
 printf "\n\nPlease update cve_lib.py, kernel_lib.py, sis-generate-usn, cve-alert.sh, and prepare-kernel-usn.py in scripts/\n"
-echo "Please add package description information if necessary to meta_lists/package_info_overrides.json"
+echo "Please add package description information if necessary to meta_lists/package-db.json"
 echo "If this kernel has not been published yet, please add it to the unpublished_kernels list in check-syntax."
 echo "Also, before publishing a USN against the new kernel, ensure packages_mirror has been run"
+
+# Local Variables:
+# sh-indentation: 4
+# End:
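The `jq ".linux.pkgs.\"linux-$KERNEL\" += $json"` step merges the freshly built per-release status object into the existing entry. A minimal sketch of that merge in plain Python terms (the `linux-raspi` flavour and release names here are made up for illustration):

```python
import json

# What the jq append in add-derived-kernel amounts to: merge a new
# per-release [state, detail] map under .linux.pkgs."linux-<KERNEL>",
# leaving any keys already present for other kernels untouched.
db = {"linux": {"pkgs": {}}}
entry = {
    "upstream_linux-raspi": ["needs-triage", ""],
    "focal_linux-raspi": ["needs-triage", ""],
    "bionic_linux-raspi": ["DNE", ""],
}
db["linux"]["pkgs"].setdefault("linux-raspi", {}).update(entry)
print(json.dumps(db, indent=2, sort_keys=True))
```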
diff --git a/scripts/add_meta_info.py b/scripts/add_meta_info.py
index 994c8ec..d5b78c3 100755
--- a/scripts/add_meta_info.py
+++ b/scripts/add_meta_info.py
@@ -10,11 +10,13 @@
 package = sys.argv[1]
 title = sys.argv[2]
 description = sys.argv[3]
-json_file = os.environ['UCT'] + "/meta_lists/package_info_overrides.json"
+json_file = os.environ['UCT'] + "/meta_lists/package-db.json"
 
 with open(json_file, 'r') as handle:
     parsed = json.load(handle)
-parsed[package] = { "description": description, 'title': title }
+parsed.setdefault(package, {})
+parsed[package]['title'] = title
+parsed[package]['description'] = description
 new_content = json.dumps(parsed, indent=4, sort_keys=True)
 
 with open(json_file, 'w') as handle:
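The switch from wholesale assignment to `setdefault` matters because package-db.json entries now carry more than just a title and description: the old code would have replaced the whole entry and silently dropped the other keys. A small sketch with hypothetical data:

```python
# Old behaviour: parsed[package] = {"description": ..., "title": ...}
# replaced the entire entry, discarding keys such as "aliases" or "pkgs".
# New behaviour: only title and description are touched.
parsed = {"openssl": {"title": "old title", "aliases": ["ssl"]}}

package, title, description = "openssl", "new title", "new description"
parsed.setdefault(package, {})
parsed[package]['title'] = title
parsed[package]['description'] = description
# "aliases" survives the update
```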
diff --git a/scripts/boilerplate-to-json.py b/scripts/boilerplate-to-json.py
new file mode 100755
index 0000000..db2c735
--- /dev/null
+++ b/scripts/boilerplate-to-json.py
@@ -0,0 +1,151 @@
+#!/usr/bin/python3
+import glob
+import json
+import os
+import re
+import sys
+
+import cve_lib
+
+def parse_boilerplate(filepath):
+    cve_data = cve_lib.load_cve(filepath)
+    # capture tags, Notes, and package relationships
+    data = dict()
+    data.setdefault("aliases", list())
+    # tags are a set but json can't serialise a set so convert to a list first
+    data.setdefault("tags", list(cve_data.get("tags", list())))
+    data.setdefault("notes", cve_data.get("Notes", list()))
+    data.setdefault("pkgs", cve_data.get("pkgs", dict()))
+    return data
+
+
+def load_boilerplates():
+    data = dict()
+    aliases = dict()
+    for filepath in glob.glob("active/00boilerplate.*"):
+        name = ".".join(filepath.split(".")[1:])
+        # check if is a symlink and if so don't bother loading the file
+        # directly but add an entry as this is an alias
+        if os.path.islink(filepath):
+            orig = os.readlink(filepath)
+            orig_name = ".".join(orig.split(".")[1:])
+            aliases.setdefault(orig_name, set())
+            aliases[orig_name].add(name)
+            continue
+        bpdata = parse_boilerplate(filepath)
+        # having a package reference itself as we have in the boilerplates
+        # is redundant - although this is not always the case as we may
+        # have a boilerplate filename like openjdk yet there is no openjdk
+        # package (just openjdk-8 etc) - so ignore any failures here
+        try:
+            del bpdata["pkgs"][name]
+        except KeyError:
+            pass
+        data.setdefault(name, bpdata)
+    for alias in aliases:
+        data[alias]["aliases"] = list(aliases[alias])
+    return data
+
+
+def load_package_info_overrides():
+    with open("meta_lists/package_info_overrides.json", "r") as fp:
+        data = json.load(fp)
+    return data
+
+
+overrides_data = load_package_info_overrides()
+# turn this into empty data
+for pkg in overrides_data:
+    for key in ["aliases", "tags", "notes"]:
+        overrides_data[pkg][key] = []
+    overrides_data[pkg]["pkgs"] = {}
+bp_data = load_boilerplates()
+# merge the two data sources
+data = {**overrides_data, **bp_data}
+
+print(json.dumps(data, indent=2))
+sys.exit(0)
+
+# TODO - decide if we want to keep this - for now leave it out
+
+def parse_embedded_code_copies(filepath):
+    begin_re = re.compile(r"^---BEGIN$")
+    pkg_re = re.compile(r"^([a-zA-Z0-9.+_-]+).*$")
+    # [release] - srcpkg version|<status> (sort; bug #)
+    embedding_re = re.compile(r"^\t(\[([a-z]+)\] )?- ([a-zA-Z0-9.+-]+) ([a-zA-Z0-9:~.+-]+|<(unfixed|removed|itp|not-affected|unknown|unfixable)>)( \((static|embed|modified-embed|fork|old-version)(; (.*))?\))?.*$")
+    note_re = re.compile(r"^\tNOTE: (.*)$")
+    data = dict()
+    pkgs = dict()
+    notes = dict()
+    with open(filepath, "r") as fp:
+        linenum = 0
+        begin = False
+        pkg = None
+        embedding_pkg = None
+        for line in fp.readlines():
+            linenum += 1
+            # strip trailing space
+            line = line.rstrip()
+            if not begin:
+                if begin_re.match(line):
+                    begin = True
+                continue
+            if len(line) == 0:
+                continue
+            # is this a new package
+            m = pkg_re.match(line)
+            if m is not None:
+                pkg = m[1]
+                pkgs[pkg] = dict()
+                continue
+            # is this an entry for a package
+            m = embedding_re.match(line)
+            if m is not None:
+                assert pkg is not None
+                if m[2] is not None:
+                    # release
+                    pass
+                embedding_pkg = m[3]
+                status = m[4]
+                sort = m[7]
+                pkgs[pkg][embedding_pkg] = (status, sort)
+                continue
+            m = note_re.match(line)
+            if m is not None:
+                assert pkg is not None
+                assert embedding_pkg is not None
+                notes.setdefault(pkg, dict())
+                notes[pkg][embedding_pkg] = m[1]
+                continue
+            print("%s: %d: Failed to parse: '%s'" % (filepath, linenum, line), file=sys.stderr)
+    data["pkgs"] = pkgs
+    data["notes"] = notes
+    return data
+
+# also parse debian's embedded-code-copies and amalgate that into data
+config = cve_lib.read_config()
+debian_embedded_copies = os.path.join(config['secure_testing_path'], "data", "embedded-code-copies")
+code_copies = parse_embedded_code_copies(debian_embedded_copies)
+for pkg in code_copies["pkgs"]:
+    data.setdefault(pkg, {"aliases": [],
+                          "tags": [],
+                          "notes": [],
+                          "pkgs": {}})
+    for embedding_pkg in code_copies["pkgs"][pkg]:
+        data[pkg]["pkgs"].setdefault(embedding_pkg, ("needs-triage", ""))
+        # use the sort to create a more informative description of the
+        # relationship between pkg and embedding_pkg
+        status, sort = code_copies["pkgs"][pkg][embedding_pkg]
+        template = "%s embeds a copy of %s"
+        if sort == "static":
+            template = "%s statically links against %s so needs to be rebuilt"
+        elif sort == "modified-embed":
+            template = "%s embeds a modified copy of %s so should be checked if is affected"
+        elif sort == "fork":
+            template = "%s contains a fork of %s so should be checked if is affected"
+        elif sort == "old-version":
+            template = "%s contains an older version of %s so should be checked if is affected"
+        note = template % (embedding_pkg, pkg)
+        if pkg in code_copies["notes"] and embedding_pkg in code_copies["notes"][pkg]:
+            note = note + " - " + code_copies["notes"][pkg][embedding_pkg]
+        data[pkg]["notes"].append(("dst", note))
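Two details of boilerplate-to-json.py are worth calling out: boilerplate names are everything after the first `.` in the filename (so dotted names survive intact), and in `data = {**overrides_data, **bp_data}` the boilerplate-derived entry wins when both sources carry the same key. A small sketch (the filename and dict contents below are illustrative only):

```python
# Name extraction as in load_boilerplates(): split on "." and rejoin
# everything after the first component, so dotted names are preserved.
filepath = "active/00boilerplate.mysql-5.7"
name = ".".join(filepath.split(".")[1:])

# Merge precedence of {**a, **b}: later unpacking wins on key collisions,
# so boilerplate data overrides package_info_overrides data.
overrides_data = {"foo": {"title": "from overrides"}}
bp_data = {"foo": {"title": "from boilerplate"}}
data = {**overrides_data, **bp_data}
```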
diff --git a/scripts/check-cves b/scripts/check-cves
index fa6093d..963f7d3 100755
--- a/scripts/check-cves
+++ b/scripts/check-cves
@@ -34,7 +34,6 @@ import xml.sax
 import xml.sax.handler
 import xml.sax.xmlreader
 from html import escape
-from functools import reduce
 import progressbar
 
 import cve_lib
@@ -75,9 +74,10 @@ for release in list(source.keys()):
 # likely to sometimes contain these
 common_words = ['an', 'and', 'context', 'file', 'modules', 'the', 'when']
 allsrcs.difference_update(set(common_words))
-# add boilerplate names too so we can get mysql or postgresql
-# etc even if they don't exist as source package names
-allsrcs.update(set(cve_lib.load_boilerplates().keys()))
+# add names and aliases etc from package-db
+allsrcs.update(set(cve_lib.package_db.keys()))
+for pkg in cve_lib.package_db:
+    allsrcs.update(set(cve_lib.package_db[pkg]["aliases"]))
 
 built_using_map = None
 
@@ -124,24 +124,8 @@ def subtract_list(list1, list2):
             list1.remove(item)
 
 
-def wordwrap(text, width):
-    """
-    A word-wrap function that preserves existing line breaks
-    and most spaces in the text. Expects that existing line
-    breaks are posix newlines (\n).
-    """
-    return reduce(lambda line, word, width=width: '%s%s%s' %
-                  (line,
-                   ' \n'[(len(line) - line.rfind('\n') - 1 +
-                          len(word.split('\n', 1)[0]
-                              ) >= width)],
-                   word),
-                  text.split(' ')
-                  )
-
-
 def _wrap_desc(desc):
-    return wordwrap(desc, 75).replace(' \n', '\n')
+    return cve_lib.wrap_text(desc)
 
 def _spawn_editor(path):
     editor = os.getenv('EDITOR', 'vi')
@@ -164,104 +148,6 @@ def prompt_user(msg):
     print(msg, flush=True, end='')
 
 
-def add_CVE_to_tracker(cve, info, packages, priority=None, bug_urls=[], ref_urls=[]):
-    src = '%s/active/00boilerplate' % (destdir)
-
-    # Use the first boilerplate for template
-    first_boiler = ""
-    for b in packages:
-        if os.path.exists(src + "." + b):
-            src = src + "." + b
-            first_boiler = src
-            break
-    dst = '%s/active/%s' % (destdir, cve)
-    with open(src) as f:
-        template = f.readlines()
-    cve_file = open(dst, 'w')
-    orig_priority = ""
-    for line in template:
-        line = line.rstrip()
-        if line.startswith('Candidate:'):
-            print('Candidate: %s' % (cve), file=cve_file)
-        elif info['public'] and line.startswith('PublicDate:'):
-            print('PublicDate: %s' % (info['public']), file=cve_file)
-        elif info['cvss'] and line.startswith('CVSS:'):
-            print('CVSS:', file=cve_file)
-            for entry in info['cvss']:
-                print(' %s: %s [%s %s]' % (entry['source'], entry['vector'], entry['baseScore'], entry['baseSeverity']), file=cve_file)
-        elif line.startswith('References:'):
-            print('References:\n https://cve.mitre.org/cgi-bin/cvename.cgi?name=%s' % (cve), file=cve_file)
-            for i in ref_urls:
-                print(" %s" % i, file=cve_file)
-        elif line.startswith('Bugs:'):
-            print(line, file=cve_file)
-            for i in bug_urls:
-                print(" %s" % i, file=cve_file)
-        elif line.startswith('Priority:'):
-            orig_priority = line.split()[1]
-            if priority:
-                print('Priority: %s' % priority, file=cve_file)
-            else:
-                print(line, file=cve_file)
-        elif not line.startswith('#'):
-            print(line, file=cve_file)
-
-        if line.startswith('Description:'):
-            for desc_line in _wrap_desc(info['desc']).split('\n'):
-                print(" %s" % (desc_line), file=cve_file)
-
-    # Now add package information (with Priority_<pkg>) from other boilers
-    if len(packages) > 1:
-        for p in packages:
-            skip_emptyline = False
-            boiler = '%s/active/00boilerplate.%s' % (destdir, p)
-            if os.path.exists(boiler) and boiler != first_boiler:
-                with open(boiler) as f:
-                    template = f.readlines()
-                in_note_section = False
-                for line in template:
-                    line = line.rstrip()
-                    # handle notes in templates
-                    if in_note_section:
-                        if line.startswith(' '):
-                            continue
-                        else:
-                            in_note_section = False
-                    if line.startswith('Notes:'):
-                        in_note_section = True
-                        continue
-                    if line.startswith('Candidate:') or \
-                       line.startswith('PublicDate:') or \
-                       line.startswith('References:') or \
-                       line.startswith('Description:') or \
-                       line.startswith('Ubuntu-Description:') or \
-                       line.startswith('Bugs:') or \
-                       line.startswith('Discovered-by:') or \
-                       line.startswith('Assigned-to:') or \
-                       line.startswith('#'):
-                        continue
-                    if line.startswith('Priority:'):
-                        new_priority = line.split()[1]
-                        if priority:
-                            continue
-                        elif new_priority == orig_priority:
-                            continue
-                        elif orig_priority == "":
-                            print(line, file=cve_file)
-                        else:
-                            print('Priority_%s: %s' % (p, new_priority), file=cve_file)
-                            skip_emptyline = True
-                    elif skip_emptyline and line == "":
-                        skip_emptyline = False
-                        continue
-                    else:
-                        print(line, file=cve_file)
-            print("")
-
-    cve_file.close()
-
-    return dst
-
 
 class PercentageFile(object):
     def __init__(self, filename):
@@ -1371,110 +1257,114 @@ class CVEHandler(xml.sax.handler.ContentHandler):
         print('%s %s %s\n%s' % (action, cve, data, desc))
 
     def add_cve(self, cve, packages, priority=None):
         # remove from not-for-us.txt if adding and ensure we remove any
         # mistriaged_hint from the description
         if cve in CVEIgnoreNotForUsList:
             cmd = ['sed', '-i', '/^%s #.*$/d' % cve, './ignored/not-for-us.txt']
             subprocess.call(cmd)
             self.cve_data[cve]['desc'] = self.cve_data[cve]['desc'].replace(mistriaged_hint, '')
 
         # Build up list of reference urls
         ref_urls = []
-        if self.debian and \
-           cve in self.debian and \
-           'note' in self.debian[cve]:
-            for line in self.debian[cve]['note']:
-                tmp = line.lstrip("NOTE: ")
-                if tmp.startswith("http"):
-                    ref_urls.append(tmp)
-        if 'refs' in self.cve_data[cve]:
-            for ref in self.cve_data[cve]['refs']:
-                url = ""
-                if ref[1].strip().startswith("http"):
-                    url = ref[1].strip()
-                elif ref[2] is not None and ref[2].strip().startswith("http"):
-                    url = ref[2].strip()
-                else: # no urls
-                    continue
-
-                if '//' not in url: # invalid url
-                    continue
-
-                # ignore certain reference URLs which we don't use
-                ignored_urls = ['www.securityfocus.com', 'www.osvdb.org']
-                if url.split('//')[1].split('/')[0] in ignored_urls:
-                    continue
-
-                if url not in ref_urls:
-                    ref_urls.append(url)
-
-        # Build up list of bug urls
-        bug_urls = []
-        for pkg in packages:
-            if self.debian and \
-               cve in self.debian and \
-               self.debian[cve]['pkgs'] and \
-               pkg in self.debian[cve]['pkgs']:
-                bug = None
-                if self.debian[cve]['pkgs'][pkg]['priority'] and \
-                   re.search(r'^bug #[0-9]+$', self.debian[cve]['pkgs'][pkg]['priority']):
-                    bug = self.debian[cve]['pkgs'][pkg]['priority'].split('#')[1]
-                elif self.debian[cve]['pkgs'][pkg]['bug']:
-                    bug = self.debian[cve]['pkgs'][pkg]['bug']
-                if bug:
-                    url = "http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=%s" % bug
-                    if url not in bug_urls:
-                        bug_urls.append(url)
-
-        # Add to tracker from 00boilerplate
-        dst = add_CVE_to_tracker(cve, self.cve_data[cve], packages, priority, bug_urls, ref_urls)
-
-        # Build up command line
-        cmd = ['./scripts/active_edit', '-c', cve, '--yes']
-
-        # capture debian not-affected states
-        not_affected = []
-
-        for pkg in packages:
-            # The Debian convention is to specify the fixed version as the state
-            # with the bug number as the priority for fixed bugs. Leverage this
-            # with active_edit
-            fixed_in = ""
-            if self.debian and \
-               cve in self.debian and \
-               self.debian[cve]['pkgs'] and \
-               pkg in self.debian[cve]['pkgs'] and \
-               self.debian[cve]['pkgs'][pkg]['state']:
-                if re.search(r'^[0-9]', self.debian[cve]['pkgs'][pkg]['state']):
-                    fixed_version = self.debian[cve]['pkgs'][pkg]['state']
+        if self.debian and \
+           cve in self.debian and \
+           'note' in self.debian[cve]:
+            for line in self.debian[cve]['note']:
+                tmp = line.lstrip("NOTE: ")
+                if tmp.startswith("http"):
+                    ref_urls.append(tmp)
+        if 'refs' in self.cve_data[cve]:
+            for ref in self.cve_data[cve]['refs']:
+                url = ""
+                if ref[1].strip().startswith("http"):
+                    url = ref[1].strip()
+                elif ref[2] is not None and ref[2].strip().startswith("http"):
+                    url = ref[2].strip()
+                else: # no urls
+                    continue
+
+                if '//' not in url: # invalid url
+                    continue
+
+                # ignore certain reference URLs which we don't use
+                ignored_urls = ['www.securityfocus.com', 'www.osvdb.org']
+                if url.split('//')[1].split('/')[0] in ignored_urls:
+                    continue
+
+                if url not in ref_urls:
+                    ref_urls.append(url)
+
+        # Build up list of bug urls
+        bug_urls = []
+        for pkg in packages:
+            if self.debian and \
+               cve in self.debian and \
+               self.debian[cve]['pkgs'] and \
+               pkg in self.debian[cve]['pkgs']:
+                bug = None
+                if self.debian[cve]['pkgs'][pkg]['priority'] and \
+                   re.search(r'^bug #[0-9]+$', self.debian[cve]['pkgs'][pkg]['priority']):
+                    bug = self.debian[cve]['pkgs'][pkg]['priority'].split('#')[1]
+                elif self.debian[cve]['pkgs'][pkg]['bug']:
+                    bug = self.debian[cve]['pkgs'][pkg]['bug']
+                if bug:
+                    url = "http://bugs.debian.org/cgi-bin/bugreport.cgi?bug=%s" % bug
+                    if url not in bug_urls:
+                        bug_urls.append(url)
+
+
+        # Build up command line
+        cmd = ['./scripts/active_edit', '-c', cve, '--yes']
+        for url in ref_urls:
+            cmd.extend(['--reference-url', url])
+        for url in bug_urls:
+            cmd.extend(['--bug-url', url])
+        if priority:
+            cmd.extend(['--priority', priority])
+
+        # capture debian not-affected states
+        not_affected = []
+
+        for pkg in packages:
+            # The Debian convention is to specify the fixed version as the state
+            # with the bug number as the priority for fixed bugs. Leverage this
+            # with active_edit
+            fixed_in = ""
+            if self.debian and \
+               cve in self.debian and \
+               self.debian[cve]['pkgs'] and \
+               pkg in self.debian[cve]['pkgs'] and \
+               self.debian[cve]['pkgs'][pkg]['state']:
+                if re.search(r'^[0-9]', self.debian[cve]['pkgs'][pkg]['state']):
+                    fixed_version = self.debian[cve]['pkgs'][pkg]['state']
+                    fixed_in = ",%s" % fixed_version
+
+                    # Now see if we can correlate this to an Ubuntu version
+                    answer = source_map.madison(source, pkg)
+                    for name in sorted(answer.keys()):
+                        rel = name.split('/')[0].split('-')[0] # don't care about the pocket
+                        version = answer[name][pkg]
+                        # Try to compare apples to apples. Ie, if one of us has
+                        # an epoch and the other doesn't, don't try to be smart
+                        if (':' not in version and ':' not in fixed_version) or \
+                           (':' in version and ':' in fixed_version):
+                            if dpkg_compare_versions(version, 'ge', fixed_version):
+                                if rel == cve_lib.devel_release:
+                                    rel = 'devel'
+                                fixed_in += ",%s,%s" % (rel, version)
+                                break
+                elif self.debian[cve]['pkgs'][pkg]['state'].startswith('<not-affected>') and \
+                     len(self.debian[cve]['pkgs'][pkg]['priority']) > 0:
+                    # capture that debian believes their version is unaffected
+                    not_affected.append((pkg, "debian: %s" % self.debian[cve]['pkgs'][pkg]['priority']))
+            cmd += ['-p', "%s%s" % (pkg, fixed_in)]
+
+        subprocess.call(cmd)
+        for (pkg, reason) in not_affected:
+            cmd = ['./scripts/mass-cve-edit', '-p', pkg, '-r', 'upstream', '-s', 'not-affected', '-v', reason, cve]
1073 | 1450 | fixed_in = ",%s" % fixed_version | ||
1074 | 1451 | |||
1075 | 1452 | # Now see if we can correlate this to an Ubuntu version | ||
1076 | 1453 | answer = source_map.madison(source, pkg) | ||
1077 | 1454 | for name in sorted(answer.keys()): | ||
1078 | 1455 | rel = name.split('/')[0].split('-')[0] # don't care about the pocket | ||
1079 | 1456 | version = answer[name][pkg] | ||
1080 | 1457 | # Try to compare apples to apples. Ie, if one of us has | ||
1081 | 1458 | # an epoch and the other doesn't, don't try to be smart | ||
1082 | 1459 | if (':' not in version and ':' not in fixed_version) or \ | ||
1083 | 1460 | (':' in version and ':' in fixed_version): | ||
1084 | 1461 | if dpkg_compare_versions(version, 'ge', fixed_version): | ||
1085 | 1462 | if rel == cve_lib.devel_release: | ||
1086 | 1463 | rel = 'devel' | ||
1087 | 1464 | fixed_in += ",%s,%s" % (rel, version) | ||
1088 | 1465 | break | ||
1089 | 1466 | elif self.debian[cve]['pkgs'][pkg]['state'].startswith('<not-affected>') and \ | ||
1090 | 1467 | len(self.debian[cve]['pkgs'][pkg]['priority']) > 0: | ||
1091 | 1468 | # capture that debian believes their version is unaffected | ||
1092 | 1469 | not_affected.append((pkg, "debian: %s" % self.debian[cve]['pkgs'][pkg]['priority'])) | ||
1093 | 1470 | cmd += ['-p', "%s%s" % (pkg, fixed_in)] | ||
1094 | 1471 | |||
1158 | 1472 | subprocess.call(cmd) | 1365 | subprocess.call(cmd) |
1164 | 1473 | for (pkg, reason) in not_affected: | 1366 | self.num_added += 1 |
1165 | 1474 | cmd = ['./scripts/mass-cve-edit', '-p', pkg, '-r', 'upstream', '-s', 'not-affected', '-v', reason, cve] | 1367 | return './active/%s' % cve |
1161 | 1475 | subprocess.call(cmd) | ||
1162 | 1476 | self.num_added += 1 | ||
1163 | 1477 | return dst | ||
1166 | 1478 | 1368 | ||
1167 | 1479 | def unembargo_cve(self, cve): | 1369 | def unembargo_cve(self, cve): |
1168 | 1480 | # unembargo a cve now public | 1370 | # unembargo a cve now public |
diff --git a/scripts/check-syntax b/scripts/check-syntax
index 9501d58..234c71a 100755
--- a/scripts/check-syntax
+++ b/scripts/check-syntax
@@ -279,7 +279,7 @@ if len(args) == 0:
         with open(opt.filelist) as fh:
             for line in fh:
                 for dir in check_dirs:
-                    if line.startswith("%s/CVE-" % dir) or line.startswith("%s/00boilerplate" % dir):
+                    if line.startswith("%s/CVE-" % dir):
                         args += [line.rstrip()]
     elif opt.modified:
         if opt.debug:
@@ -299,11 +299,7 @@ if len(args) == 0:
             args += [filename]
     else:
         for dir in check_dirs:
-            # XXX might want to separate out boilerplate checks
-            # as they might need to be less restrictive than regular CVE
-            # file checks
-            for cve in sorted(glob.glob("%s/CVE-*" % dir) +
-                              glob.glob("%s/00boilerplate.*" % dir)):
+            for cve in sorted(glob.glob("%s/CVE-*" % dir)):
                 args += [cve]
 else:
     all_files = False
@@ -404,7 +400,7 @@ for cve in args:
         cve_okay = False

     # verify candidate field matches the CVE file name
-    if "stdin" not in cve and "boilerplate" not in cve and not data["Candidate"] == cve:
+    if "stdin" not in cve and not data["Candidate"] == cve:
         filename = srcmap["Candidate"][0]
         linenum = srcmap["Candidate"][1]
         print(
@@ -424,10 +420,9 @@ for cve in args:
             # place the generated error message on this release's line etc
             nearby_rel = list(listed_releases - missing_releases)[0]
             for rel in missing_releases:
-                # only warn on active CVEs but don't warn on boilerplate entries missing external
-                # releases as this is not supported
+                # only warn on active CVEs
                 if is_active(cve) and \
-                   ("boilerplate" not in cve or rel not in cve_lib.external_releases) and \
+                   rel not in cve_lib.external_releases and \
                    rel in source and pkg in source[rel]:
                     filename = srcmap["pkgs"][pkg][nearby_rel][0]
                     linenum = srcmap["pkgs"][pkg][nearby_rel][1]
@@ -658,10 +653,8 @@ for cve in args:
         )
         cve_okay = False

-    # Verify priority for any CVE with a supported package and when this is
-    # not boilerplate
+    # Verify priority for any CVE with a supported package
     if (
-        "boilerplate" not in cve and
         len(supported)
         and (is_active(cve) or is_embargoed(cve))
         and ("Priority" not in data or data["Priority"] not in cve_lib.priorities)
@@ -777,10 +770,10 @@ for cve in args:
         cve_okay = False

     # Either PublicDate or CRD must be set to something
-    if ("boilerplate" not in cve and
-        ("PublicDate" not in data or data["PublicDate"] == "") and (
-            "CRD" not in data or data["CRD"] == ""
-    )):
+    if (
+        "PublicDate" not in data or data["PublicDate"] == "" and
+        "CRD" not in data or data["CRD"] == ""
+    ):
         key = "PublicDate" if "PublicDate" in srcmap else "CRD"
         filename = srcmap[key][0]
         linenum = srcmap[key][1]
diff --git a/scripts/check-syntax-fixup b/scripts/check-syntax-fixup
index 24bf608..a38ad1e 100755
--- a/scripts/check-syntax-fixup
+++ b/scripts/check-syntax-fixup
@@ -147,11 +147,6 @@ for line in args.infile:
     # remove this hard-coded hack one-day...
     if rel in cve_lib.external_releases or \
        (rel == "trusty/esm" and "DOES exist" in msg):
-        # ignore boilerplate files for external_releases
-        if "boilerplate" in cve:
-            # print unhandled lines
-            print(line, file=sys.stderr)
-            continue
         cve = os.path.join(
             cve_lib.get_external_subproject_cve_dir(rel), os.path.basename(cve)
         )
diff --git a/scripts/cve-mode.el b/scripts/cve-mode.el
index b7413b0..0992e4c 100644
--- a/scripts/cve-mode.el
+++ b/scripts/cve-mode.el
@@ -333,7 +333,7 @@ Queries umt for PACKAGE as well as looking in overlay files."
 (defun cve-mode-insert-package (package)
   "Add PACKAGE as affected to this CVE.
 Calls out to active_edit to do the heavy lifting so that
-boilerplate entries are handled automatically."
+meta_lists/package-db.json entries are handled automatically."
   (interactive
    (list (completing-read "Package: " cve-mode--source-packages)))
   (let ((args (list "-c" (file-name-base (buffer-file-name))
@@ -683,8 +683,6 @@ cross boundaries of block literals."

 ;;;###autoload
 (add-to-list 'auto-mode-alist '("CVE-[[:digit:]]\\{4\\}-[[:digit:]]\\{4,\\}\\'" . cve-mode))
-;;;###autoload
-(add-to-list 'auto-mode-alist '("00boilerplate.*\\'" . cve-mode))

 (provide 'cve-mode)
 ;;; cve-mode.el ends here
diff --git a/scripts/cve.vim b/scripts/cve.vim
index bc89578..a13b158 100644
--- a/scripts/cve.vim
+++ b/scripts/cve.vim
@@ -7,7 +7,6 @@
 "   $ ln -s $UCT/scripts/cve.vim ~/.vim/syntax/cve.vim
 " Add to ~/.vimrc:
 " autocmd BufNewFile,BufRead CVE-[0-9][0-9][0-9][0-9]-[0-9N]\\\{4,\} set syntax=cve
-" autocmd BufNewFile,BufRead 00boilerplate.* set syntax=cve
 "
 " TODO:
 " - turn the release names into variables so we only have to update in one
diff --git a/scripts/cve_lib.py b/scripts/cve_lib.py
index fe8e4f5..60da772 100755
--- a/scripts/cve_lib.py
+++ b/scripts/cve_lib.py
@@ -23,6 +23,8 @@ import cache_urllib
 import json
 import yaml

+from functools import reduce
+
 def set_cve_dir(path):
     '''Return a path with CVEs in it. Specifically:
     - if 'path' has CVEs in it, return path
@@ -912,11 +914,33 @@ kernel_srcs = set(['linux',
 kernel_topic_branches = kernel_srcs.difference(['linux'])

 # for sanity, try to keep these in alphabetical order in the json file
-def load_package_info_overrides(list_dir):
-    package_info_overrides = dict()
-    with open(os.path.join(list_dir, "package_info_overrides.json")) as _file:
-        package_info_overrides = json.load(_file)
-    return package_info_overrides
+def load_package_db(dir=meta_dir):
+    pkg_db = {}
+    pkg_db_json = os.path.join(dir, "package-db.json")
+    try:
+        with open(pkg_db_json, "r", encoding='utf-8') as fp:
+            pkg_db = json.load(fp)
+        # add lookups based on aliases - we can't iterate over pkg_db and
+        # modify it so collect aliases then add them manually
+        alias_info = {}
+        for p in pkg_db:
+            try:
+                aliases = pkg_db[p]["aliases"]
+                if len(aliases) > 0:
+                    alias_info[p] = aliases
+            except KeyError:
+                pass
+        for p in alias_info.keys():
+            for a in alias_info[p]:
+                if a not in pkg_db:
+                    # use original info if already in pkg_db
+                    pkg_db[a] = pkg_db[p]
+    except FileNotFoundError:
+        # TODO: remove this exception handling once we have a package-db.json
+        # checked into UCT git since in that case this should never occur so it
+        # should be a fatal error if it is missing
+        pass
+    return pkg_db


 # "arch_list" is all the physical architectures buildable
@@ -1226,19 +1250,19 @@ def release_is_older_than(release_a, release_b):



-package_info_overrides = load_package_info_overrides(meta_dir)
+package_db = load_package_db()

 def lookup_package_override_title(source):
-    global package_info_overrides
-    res = package_info_overrides.get(source)
+    global package_db
+    res = package_db.get(source)
     if isinstance(res, dict):
         return(res.get("title"))

     return None

 def lookup_package_override_description(source):
-    global package_info_overrides
-    res = package_info_overrides.get(source)
+    global package_db
+    res = package_db.get(source)
     if isinstance(res, dict):
         return(res.get("description"))

@@ -1964,9 +1988,6 @@ def load_cve(cve, strict=False, srcmap=None):
         nonempty = ['Candidate']
         if strict:
             nonempty += ['PublicDate']
-        # boilerplate files are special and can (should?) be empty
-        if "boilerplate" in cve:
-            nonempty = []

         if field not in data or field not in fields_seen:
             msg += "%s: %d: missing field '%s'\n" % (cve, linenum, field)
@@ -2008,17 +2029,6 @@ def load_cve(cve, strict=False, srcmap=None):
         raise ValueError(msg.strip())
     return data

-def load_boilerplates(verbose=False):
-    boilerplates = dict()
-    prefix = "00boilerplate."
-    for bp in glob.glob(os.path.join(active_dir, prefix + "*")):
-        # TODO - should we differentiate on symlinks so we can tell which
-        # are the "real" primary ones and which ones are secondary?
-        name = bp[bp.find(prefix) + len(prefix):]
-        info = load_cve(bp)
-        boilerplates.setdefault(name, info)
-    return boilerplates
-
 def load_all(cves, uems, rcves=[]):
     table = dict()
     priority = dict()
@@ -2865,3 +2875,23 @@ def parse_cvss(cvss):
     js['baseMetricV3']['exploitabilityScore'] = round(exploitability * 10) / 10
     js['baseMetricV3']['impactScore'] = round(impact * 10) / 10
     return js
+
+def wordwrap(text, width):
+    """
+    A word-wrap function that preserves existing line breaks
+    and most spaces in the text. Expects that existing line
+    breaks are posix newlines (\n).
+    """
+    return reduce(lambda line, word, width=width:
+                  '%s%s%s' %
+                  (line,
+                   ' \n'[(len(line) - line.rfind('\n') - 1 + len(word.split('\n', 1)[0]) >= width)],
+                   word),
+                  text.split(' ')
+                  )
+
+def wrap_text(text, width=75):
+    """
+    Wrap text to width chars wide.
+    """
+    return wordwrap(text, width).replace(' \n', '\n')
diff --git a/scripts/dup-status-for-pkg b/scripts/dup-status-for-pkg
index fc32055..c15d669 100755
--- a/scripts/dup-status-for-pkg
+++ b/scripts/dup-status-for-pkg
@@ -24,7 +24,7 @@ fi
 for pkg in "$@"; do
     echo "Duplicating status from ${src} to ${dst} for ${pkg} where was DNE for ${dst}"
     # find files to change
-    for file in $(grep -l '^'"${dst}_${pkg}"': DNE' active/00boilerplate* active/CVE-* retired/CVE-*); do
+    for file in $(grep -l '^'"${dst}_${pkg}"': DNE' active/CVE-* retired/CVE-*); do
         status=$(grep '^'"${src}_${pkg}"': ' "${file}" | cut -f2- -d' ')
         if [ ! -z "${status}" ]; then
             echo "Duplicating status ${status} from ${src} to ${dst} in ${file}"
diff --git a/scripts/kernel-triage-missing-break-fix b/scripts/kernel-triage-missing-break-fix
index e60cb38..e21a949 100755
--- a/scripts/kernel-triage-missing-break-fix
+++ b/scripts/kernel-triage-missing-break-fix
@@ -35,7 +35,7 @@ if [ -z "${debian_kernel_cve_tracker}" ] ; then
     echo
 fi

-_CVES=$(cd "$UCT" && grep -lr '^Patches_linux:' --exclude '*boilerplate*' active/ | xargs grep -L '^ break-fix:' | sort)
+_CVES=$(cd "$UCT" && grep -lr '^Patches_linux:' active/ | xargs grep -L '^ break-fix:' | sort)
 for _CVE in ${_CVES} ; do
     echo "${_CVE}"
     CVE="${_CVE##active/}"
diff --git a/scripts/release-cycle-devel-opens b/scripts/release-cycle-devel-opens
index fccd5af..cf08707 100755
--- a/scripts/release-cycle-devel-opens
+++ b/scripts/release-cycle-devel-opens
@@ -18,7 +18,7 @@ if ! [ -d embargoed/ ]; then
     echo "Unable to find embargoed directory, is it set up?" >&2
 fi

-for f in active/{CVE-,00boilerplate}* embargoed/CVE-* ; do
+for f in active/CVE-* embargoed/CVE-* ; do
     if egrep -q "^(#?)devel_" "$f" ; then
         echo "DEBUG: skipping $f"
         continue
diff --git a/scripts/release-cycle-released b/scripts/release-cycle-released
index b5cfc7a..c6d844f 100755
--- a/scripts/release-cycle-released
+++ b/scripts/release-cycle-released
@@ -22,7 +22,7 @@ if ! [ -d active/ ]; then
     exit 2
 fi

-for f in active/{00boilerplate,CVE-}*; do
+for f in active/CVE-*; do
     # ignore symlinks so we don't replace twice in the same underlying file
     if [ ! -L "$f" ]; then
         perl -pi -e 's/^((#?)devel_(.*))/${2}'"$REL"'_$3\n$1/g' "$f"
diff --git a/scripts/sync-from-eol.py b/scripts/sync-from-eol.py
index b2a020c..dd6d9eb 100755
--- a/scripts/sync-from-eol.py
+++ b/scripts/sync-from-eol.py
@@ -24,16 +24,12 @@ import sys
 import cve_lib
 import source_map

-import warnings
-warnings.filterwarnings('ignore', 'apt API not stable yet', FutureWarning)
-import apt

 parser = optparse.OptionParser()
 parser.add_option("-r", "--release", dest="release", default=None, help="release to modify")
 parser.add_option("-W", "--whole", dest="whole", help="End of life the whole release", action='store_true')
 parser.add_option("-U", "--universe", dest="universe", help="Modify packages in universe and multiverse", action='store_true')
 parser.add_option("-u", "--update", dest="update", help="Update CVEs with released package versions", action='store_true')
-parser.add_option("-b", "--include-boilerplate", help="Update boilerplate files", action='store_true')
 (opt, args) = parser.parse_args()

 if not opt.release:
@@ -52,16 +48,13 @@ pkgs = source_map.load(releases=[opt.release], skip_eol_releases=False)

 cves = glob.glob('%s/CVE-*' % cve_lib.active_dir)

-if opt.include_boilerplate:
-    cves += glob.glob('%s/00boilerplate.*' % cve_lib.active_dir)
-
 if os.path.islink('embargoed'):
     cves += glob.glob('embargoed/CVE-*')
     cves += glob.glob('embargoed/EMB-*')

 for filename in cves:
-    # we don't want to edit boilerplate symlinks as that will cause them
-    # to become unsymlinked
+    # we don't want to edit symlinks as that will cause them to become
+    # unsymlinked
     if os.path.islink(filename):
         continue

@@ -88,17 +81,13 @@ for filename in cves:
         and cve_lib.is_active_esm_release(cve_lib.get_orig_rel_name(opt.release))
     ):
         status = data['pkgs'][src][opt.release]
-        if cve.startswith('00boilerplate.'):
-            cve_lib.update_state(filename, src, opt.release, 'ignored', 'end of ESM support')
-        elif status[1] != '':
+        if status[1] != '':
             cve_lib.update_state(filename, src, opt.release, 'ignored', 'end of ESM support, was %s [%s]' % (status[0], status[1]))
         else:
             cve_lib.update_state(filename, src, opt.release, 'ignored', 'end of ESM support, was %s' % (status[0]))
     elif 'LTS' in cve_lib.release_name(opt.release):
         status = data['pkgs'][src][opt.release]
-        if cve.startswith('00boilerplate.'):
-            cve_lib.update_state(filename, src, opt.release, 'ignored', 'end of standard support')
-        elif status[1] != '':
+        if status[1] != '':
             cve_lib.update_state(filename, src, opt.release, 'ignored', 'end of standard support, was %s [%s]' % (status[0], status[1]))
         else:
             cve_lib.update_state(filename, src, opt.release, 'ignored', 'end of standard support, was %s' % (status[0]))
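The alias handling added in the cve_lib.py hunk above (load_package_db) can be illustrated standalone. This is a minimal sketch with made-up toy entries, not data from the real meta_lists/package-db.json:

```python
# Toy stand-in for the parsed package-db.json contents (hypothetical entries).
pkg_db = {
    "openjdk": {"title": "OpenJDK", "aliases": ["openjdk-10", "openjdk-11"]},
    "openjdk-11": {"title": "OpenJDK 11", "aliases": []},
}

# Collect aliases first - we can't add keys to pkg_db while iterating it.
alias_info = {}
for p in pkg_db:
    aliases = pkg_db[p].get("aliases", [])
    if len(aliases) > 0:
        alias_info[p] = aliases

# Add a top-level lookup entry per alias, keeping any pre-existing entry's
# own info rather than overwriting it.
for p in alias_info:
    for a in alias_info[p]:
        if a not in pkg_db:
            pkg_db[a] = pkg_db[p]
```

After this, a lookup by the alias "openjdk-10" returns the same info dict as the primary "openjdk" entry, while "openjdk-11" keeps its own pre-existing entry.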
One thing I caught in the most briefest of surface level reviews:

On Wed, Aug 03, 2022 at 07:05:15AM -0000, Alex Murray wrote:
> +# TODO - decide if we want to keep this - for now leave it out
> +
> +# also parse debian's embedded-code-copies and amalgate that into data
> +for f in ["../security-tracker/data/embedded-code-copies" ]:

If you do keep this, please use the `secure_testing_path` from the
~/.ubuntu-cve-tracker.conf config file (obtained via cve_lib.read_config())
instead of hard-coding a relative path and repo name when reading debian's
embedded-code-copies data.
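Roughly, the suggestion looks like the sketch below. The conf-file format and the read_config_sketch helper are assumptions standing in for cve_lib.read_config(), which may parse the file differently:

```python
import os
import re
import tempfile

def read_config_sketch(path):
    # Hypothetical stand-in for cve_lib.read_config(): parse shell-style
    # key="value" lines from a ~/.ubuntu-cve-tracker.conf style file.
    config = {}
    with open(path, encoding="utf-8") as fh:
        for line in fh:
            m = re.match(r'^\s*(\w+)\s*=\s*"?([^"#]*)"?\s*$', line)
            if m:
                config[m.group(1)] = m.group(2).strip()
    return config

# Demonstrate with a throwaway conf file rather than the real one.
with tempfile.NamedTemporaryFile("w", suffix=".conf", delete=False) as tf:
    tf.write('secure_testing_path="/tmp/security-tracker"\n')
    conf_path = tf.name

config = read_config_sketch(conf_path)
# Locate Debian's embedded-code-copies data under the configured checkout
# instead of a hard-coded "../security-tracker" relative path.
embedded = os.path.join(config["secure_testing_path"],
                        "data", "embedded-code-copies")
os.unlink(conf_path)
```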
Thanks.
--
Steve Beattie
<email address hidden>