Merge lp:~mrazik/jenkins-launchpad-plugin/stale-locks-take2 into lp:~private-ps-quality-team/jenkins-launchpad-plugin/trunk
Status: | Merged |
---|---|
Approved by: | Francis Ginther |
Approved revision: | 138 |
Merged at revision: | 112 |
Proposed branch: | lp:~mrazik/jenkins-launchpad-plugin/stale-locks-take2 |
Merge into: | lp:~private-ps-quality-team/jenkins-launchpad-plugin/trunk |
Diff against target: |
1526 lines (+820/-274), 11 files modified:
- jlp.config (+0/-3)
- jlp/__init__.py (+0/-2)
- jlp/commands/autoland.py (+1/-3)
- jlp/jenkinsutils.py (+315/-10)
- jlp/launchpadutils.py (+6/-13)
- jlp/mergeproposalreview.py (+0/-55)
- tests/test_MergeProposalReview.py (+0/-29)
- tests/test_autoland.py (+3/-24)
- tests/test_jenkinsutils.py (+484/-6)
- tests/test_launchpadutils.py (+11/-12)
- tests/test_socketlock.py (+0/-117)
To merge this branch: | bzr merge lp:~mrazik/jenkins-launchpad-plugin/stale-locks-take2 |
Related bugs: |
Reviewer | Review Type | Date Requested | Status |
---|---|---|---|
Francis Ginther | | | Approve |
PS Jenkins bot | continuous-integration | | Approve |
Review via email: mp+166452@code.launchpad.net
Commit message
This removes the locking mechanism and instead queries jenkins if a specific job is running (with specific params).
Description of the change
Take 2 on removing locks and querying jenkins instead.
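The mechanism can be sketched as follows: fetch a job's builds from the Jenkins JSON API and look for an in-progress build whose parameters include the merge proposal's URL. This is a minimal, self-contained sketch of the matching step only; the function names are illustrative, and the payload shape mirrors the Jenkins JSON API's `builds[...]{building, actions[parameters[name,value]]}` structure:

```python
def build_has_params(build, wanted):
    """Return True if every (name, value) pair in `wanted` appears among
    the build's parameters (Jenkins stores them under 'actions')."""
    found = {}
    for action in build.get('actions', []):
        for param in action.get('parameters', []):
            found[param.get('name')] = param.get('value')
    return all(found.get(name) == value for name, value in wanted.items())


def running_builds(builds, wanted):
    """Build numbers of builds that are still building and match `wanted`."""
    return [b['number'] for b in builds
            if b.get('building') and build_has_params(b, wanted)]
```

With `wanted = {'merge_proposal': mp_url}`, a non-empty result means a build for that MP is already in flight, which replaces the old lock-file check.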
PS Jenkins bot (ps-jenkins) wrote:
- 134. By mrazik: pep8 fixed
PS Jenkins bot (ps-jenkins) wrote:
PASSED: Continuous integration, rev:134
http://
Executed test runs:
SUCCESS: http://
Martin Mrazik (mrazik) wrote:
I accidentally removed test_socketlock and then added it back. There were no changes in that file.
Martin Mrazik (mrazik) wrote:
One thing that concerns me a bit is how much load this checking will generate. I'll probably need to do some testing first on a real jenkins (by changing the triggering job to use this branch, or something like that).
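One way to keep that load down, and what the branch's own requests do, is to ask Jenkins for only the fields the check needs via the JSON API's `tree` filter, rather than the full job payload. A sketch of building such a request URL (the base URL here is illustrative):

```python
def running_builds_url(jenkins_url, job):
    """Build a Jenkins JSON API URL restricted (via the 'tree' filter) to
    the build number, the 'building' flag and the build parameters, so the
    response stays small even for jobs with long build histories."""
    tree = 'builds[number,building,actions[parameters[name,value]]]'
    return '{}/job/{}/api/json?depth=1&tree={}'.format(jenkins_url, job, tree)
```

The same idea applies to the queue and downstream-project queries: each request names exactly the attributes it consumes.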
Francis Ginther (fginther) wrote:
I still need to review the tests, but I have the following comments (as discussed in our 1 on 1):
In trigger_ci_build() and trigger_al_build(), the
if launchpadutils.
continue
check should be the last check before proceeding to the trigger, as it is likely the most time- and resource-intensive check.
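The suggested ordering, cheap local checks first and the network-bound Jenkins query last, can be sketched generically as follows; the helper and its arguments are illustrative, not code from the branch:

```python
def should_trigger(mp, cheap_checks, expensive_checks):
    """Run inexpensive predicate checks before costly ones, so that a
    merge proposal rejected by a cheap check never incurs the expensive
    (network-bound) Jenkins query. Each check takes the MP and returns
    True if triggering may proceed."""
    for check in list(cheap_checks) + list(expensive_checks):
        if not check(mp):
            return False  # short-circuit: later checks never run
    return True
```

In the branch this corresponds to moving the testing_in_progress check after the vote and prerequisite checks in both trigger functions.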
- 135. By mrazik: moving testing_in_progress check to the very end of trigger_al_build/trigger_ci_build
PS Jenkins bot (ps-jenkins) wrote:
FAILED: Continuous integration, rev:135
http://
Executed test runs:
FAILURE: http://
- 136. By mrazik: fixed tests
PS Jenkins bot (ps-jenkins) wrote:
PASSED: Continuous integration, rev:136
http://
Executed test runs:
SUCCESS: http://
Martin Mrazik (mrazik) wrote:
The triggering job is now using this new branch. So far I haven't seen anything strange, and the build time of trigger-
There is one issue, though -- fasttrack merges. I just noticed a pointless merge and as far as I can see it is because the autolanding was triggered while the other generic-land (which I started manually due to some unrelated bug) was still running.
OTOH the problem is probably not introduced by this branch and explains why I'm seeing these pointless merges every now and then.
Martin Mrazik (mrazik) wrote:
Just for reference: this was the first trigger-
#8425 Jun 4, 2013 6:17:18 AM
Martin Mrazik (mrazik) wrote:
> There is one issue, though -- fasttrack merges. I just noticed a pointless
> merge and as far as I can see it is because the autolanding was triggered
> while the other generic-land (which I started manually due to some unrelated
> bug) was still running.
> OTOH the problem is probably not introduced by this branch and explains why
> I'm seeing these pointless merges every now and then.
I was wrong about this. It is a newly introduced regression. OTOH solving this would most likely add yet another level of complexity and:
1. even if it happens nothing really bad happens (just the pointless merge error)
2. it shouldn't be happening very often as the generic-land is usually finished in 15 mins (the time between two triggering operations)
Francis Ginther (fginther) wrote:
> > There is one issue, though -- fasttrack merges. I just noticed a pointless
> > merge and as far as I can see it is because the autolanding was triggered
> > while the other generic-land (which I started manually due to some unrelated
> > bug) was still running.
> > OTOH the problem is probably not introduced by this branch and explains why
> > I'm seeing these pointless merges every now and then.
>
> I was wrong about this. It is a newly introduced regression. OTOH solving this
> would most likely add yet another level of complexity and:
> 1. even if it happens nothing really bad happens (just the pointless merge
> error)
> 2. it shouldn't be happening very often as the generic-land is usually
> finished in 15 mins (the time between two triggering operations)
Right. The fasttrack MPs never call the autolanding job; they go straight to generic-land. As this new method looks for the MPs that were started from the autolanding job, it won't find a downstream build, due to the upstream check.
I agree that we are safe for now as long as the second generic-land does not change the state of the MP from Merged to something else.
Francis Ginther (fginther) wrote:
I think a test case is needed with a different MP in a downstream job in TestIsJobOrDown
The scenario is that a job was started with "http://
Some comments appear to be out of place:
- In is_job_
- is_job_
- In _get_build_
Other than that, this looks good.
Martin Mrazik (mrazik) wrote:
> I think a testcase is needed with a different MP in a downstream job in
> TestIsJobOrDown
> ('downstream_
> {'expected': False,
> 'job_params': {'merge_proposal': 'http://
> 'jenkins_data': {
> 'builds': [{
> 'building': True,
> 'actions': [{
> 'causes': [{
> 'upstreamProject': 'job',
> 'upstreamBuild': 22
> }]
> }],
> 'result': None}],
> 'upstreamProjects': [{
> 'name': 'job',
> 'builds': [
> {
> 'number': 22,
> 'actions': [{'parameters': [{
> 'name': 'merge_proposal',
> 'value': 'http://
> ]}]}}),
>
> The scenario is that job was started with "http://
> in a downstream build (like generic-land). A new MP, "http://
> checked, no matching builds should be found. I'm guessing at the correct
> scenario, it may not be quite right.
Added.
> Some comments appear to be out of place:
> - In is_job_
> request below" which is actually in is_downstream_
> - is_job_
> missing parameters and return value in the docstring.
> - In _get_build_
> does not belong.
Fixed.
PS Jenkins bot (ps-jenkins) wrote:
PASSED: Continuous integration, rev:138
http://
Executed test runs:
SUCCESS: http://
Francis Ginther (fginther) wrote:
Thanks for the updates, approve.
Preview Diff
1 | === modified file 'jlp.config' |
2 | --- jlp.config 2013-05-16 08:20:42 +0000 |
3 | +++ jlp.config 2013-06-05 09:44:54 +0000 |
4 | @@ -40,9 +40,6 @@ |
5 | #Usually you don't need to change this |
6 | launchpad_review_type: continuous-integration |
7 | |
8 | -# directory containing lockfiles for Launchpad merge proposals |
9 | -launchpadlocks_dir: /tmp/jenkins-launchpad-plugin/locks |
10 | - |
11 | #lock file that is being used to limit the number of parallel launchpad |
12 | #connections |
13 | lock_name: launchpad-trigger-lock |
14 | |
15 | === modified file 'jlp/__init__.py' |
16 | --- jlp/__init__.py 2013-05-16 08:23:40 +0000 |
17 | +++ jlp/__init__.py 2013-06-05 09:44:54 +0000 |
18 | @@ -89,8 +89,6 @@ |
19 | assert Branch # silence pyflakes |
20 | from .bzrrecipe import BzrRecipe |
21 | assert BzrRecipe # silence pyflakes |
22 | -from .mergeproposalreview import MergeProposalReview |
23 | -assert MergeProposalReview # silence pyflakes |
24 | from .launchpadagent import get_launchpad |
25 | assert get_launchpad # silence pyflakes |
26 | from .dputrunner import DputRunner |
27 | |
28 | === modified file 'jlp/commands/autoland.py' |
29 | --- jlp/commands/autoland.py 2013-05-17 15:53:01 +0000 |
30 | +++ jlp/commands/autoland.py 2013-06-05 09:44:54 +0000 |
31 | @@ -2,7 +2,7 @@ |
32 | from argparse import ArgumentParser |
33 | import argparse |
34 | from jlp.launchpadutils import build_state, LaunchpadVote |
35 | -from jlp import (launchpadutils, MergeProposalReview, Branch, |
36 | +from jlp import (launchpadutils, Branch, |
37 | DputRunner, jenkinsutils, get_launchpad, |
38 | get_config_option, logger) |
39 | from tarmac.exceptions import TarmacMergeError, BranchHasConflicts |
40 | @@ -156,8 +156,6 @@ |
41 | return 1 |
42 | finally: |
43 | target.cleanup() |
44 | - review = MergeProposalReview(mp) |
45 | - review.unlock() |
46 | |
47 | return 0 |
48 | |
49 | |
50 | === modified file 'jlp/jenkinsutils.py' |
51 | --- jlp/jenkinsutils.py 2013-05-20 11:44:39 +0000 |
52 | +++ jlp/jenkinsutils.py 2013-06-05 09:44:54 +0000 |
53 | @@ -6,7 +6,6 @@ |
54 | from textwrap import dedent |
55 | from . import get_json_jenkins |
56 | from lazr.restfulclient.errors import Unauthorized |
57 | -from .mergeproposalreview import MergeProposalReview |
58 | from jlp import logger, get_config_option |
59 | |
60 | |
61 | @@ -15,6 +14,315 @@ |
62 | urlparse.urlparse(url).path.replace('//', '/')) |
63 | |
64 | |
65 | +def _actions_has_param(actions, param, value): |
66 | + """ Internal method to see if actions json object (as returned by jenkins) |
67 | + includes a given parameter (and value). |
68 | + |
69 | + It returns True or False based on whether a parameter/value pair is |
70 | + included in the given jenkins job actions ("actions" is where jenkins |
71 | + stores the parameters) |
72 | + |
73 | + :param actions: json result (list of dictionaries) as returned by jenkins |
74 | + :param param: name of the jenkins job parameter to check |
75 | + (e.g. 'merge_proposal') |
76 | + :param value: value that we want to check |
77 | + |
78 | + Returns True/False. |
79 | + """ |
80 | + for action in actions: |
81 | + if 'parameters' not in action: |
82 | + continue |
83 | + for parameter in action['parameters']: |
84 | + if parameter['name'] == param and 'value' in parameter and \ |
85 | + parameter['value'] == value: |
86 | + return True |
87 | + return False |
88 | + |
89 | + |
90 | +def _actions_has_all_params(actions, job_params): |
91 | + """ Internal method to see if actions json object (as returned by jenkins) |
92 | + includes all given job_params. See also _actions_has_param. |
93 | + |
94 | + The main difference between this and _actions_has_param is that this |
95 | + method checks for all params in a given dictionary (job_params). |
96 | + |
97 | + :param actions: json result (list of dictionaries) as returned by jenkins |
98 | + :param job_params: dictionary of parameters to check in the form of: |
99 | + {'parameter': 'value', 'parameter2': 'value2'} |
100 | + |
101 | + Returns True/False. |
102 | + """ |
103 | + all_params = True |
104 | + for param in job_params: |
105 | + all_params = all_params and \ |
106 | + _actions_has_param(actions, param, job_params[param]) |
107 | + return all_params |
108 | + |
109 | + |
110 | +def get_running_builds(job, job_params={}): |
111 | + """ For a given jenkins job return a list of currently active builds |
112 | + :param job: name of the jenkins job |
113 | + :param job_params: dictionary of job parameters |
114 | + |
115 | + Returns an empty list if there is no active build with given |
116 | + parameters. If job_params is empty (default) it is assumed no check |
117 | + for job_params is required. |
118 | + """ |
119 | + |
120 | + json_jenkins = get_json_jenkins() |
121 | + jenkins_url = get_config_option('jenkins_url') |
122 | + request = \ |
123 | + '{}/job/{}/api/json'.format(jenkins_url, job) +\ |
124 | + '?depth=1&tree=builds[number,building,actions[parameters[name,value]]]' |
125 | + data = json_jenkins.get_json_data(request, append_api=False) |
126 | + if 'builds' not in data: |
127 | + return [] |
128 | + result = [] |
129 | + |
130 | + for build in data['builds']: |
131 | + if build['building'] and \ |
132 | + _actions_has_all_params(build['actions'], job_params): |
133 | + result.append(build['number']) |
134 | + return result |
135 | + |
136 | + |
137 | +def _get_build_actions(job, build): |
138 | + """Internal method. For a given job and build return the "actions" object |
139 | + as returned by jenkins. This object holds the list of parameters of a |
140 | + given build. |
141 | + |
142 | + :param: job: jenkins job name |
143 | + :param: build: jenkins build number |
144 | + """ |
145 | + |
146 | + json_jenkins = get_json_jenkins() |
147 | + jenkins_url = get_config_option('jenkins_url') |
148 | + |
149 | + job = json_jenkins.get_json_data('{}/job/{}/{}'.format( |
150 | + jenkins_url, job, build)) |
151 | + if 'actions' not in job: |
152 | + return [] |
153 | + return job['actions'] |
154 | + |
155 | + |
156 | +def _is_item_queued_by_upstream(item, upstream_job, upstream_params): |
157 | + """Internal method to check if a given queued item |
158 | + (as returned by jenkins; see the "$jenkins_url/queue/json/api" API) was |
159 | + queued by an upstream job with a given upstream_job parameters. |
160 | + |
161 | + The idea here is to check e.g. if generic_land (item) was queued by |
162 | + unity-autolanding (upstream_job) with {'merge_proposal': 'http://'} |
163 | + (upstream_params). If there is such a generic-land queued then this method |
164 | + returns True otherwise it returns False |
165 | + |
166 | + :param item: "item" as returned in the json jenkins reply (see the queue |
167 | + API) |
168 | + :param: upstream_job: name of the upstream_job |
169 | + :param upstream_params: dictionary of the upstream job params we are |
170 | + interested in |
171 | + """ |
172 | + upstream_project = None |
173 | + upstream_build = None |
174 | + |
175 | + if 'actions' not in item: |
176 | + return False |
177 | + |
178 | + for action in item['actions']: |
179 | + upstream_project = None |
180 | + if 'causes' not in action: |
181 | + continue |
182 | + for cause in action['causes']: |
183 | + if isinstance(cause, dict) and 'upstreamProject' in cause: |
184 | + upstream_project = cause['upstreamProject'] |
185 | + # if the upstreamProject differs we can return False right |
186 | + # away |
187 | + if upstream_project != upstream_job: |
188 | + return False |
189 | + |
190 | + upstream_build = cause['upstreamBuild'] |
191 | + if upstream_params: |
192 | + upstream_actions = _get_build_actions( |
193 | + upstream_project, upstream_build) |
194 | + if _actions_has_all_params(upstream_actions, |
195 | + upstream_params): |
196 | + return True |
197 | + else: |
198 | + return True |
199 | + return False |
200 | + |
201 | + |
202 | +def is_job_queued(job, queue=None, job_params={}, upstream_job=None, |
203 | + upstream_params={}): |
204 | + """ Return if a job is queued in a given jenkins queue. |
205 | + If upstream_job is supplied the job in queue must have been triggered |
206 | + by upstream_job. |
207 | + If upstream_params are given then the triggering upstream_job must |
208 | + include the given upstream_params (dictionary of the form of |
209 | + {'param1': 'value1'}). |
210 | + |
211 | + :param job: jenkins job name |
212 | + :param queue: jenkins queue (as returned by $jenkinsurl/queue/api/json) |
213 | + :param upstream_job: name of the upstream job that was supposed to |
214 | + queue job |
215 | + :param upstream_params: parameters to check in the upstream_job |
216 | + """ |
217 | + if queue is None: |
218 | + json_jenkins = get_json_jenkins() |
219 | + jenkins_url = get_config_option('jenkins_url') |
220 | + queue = json_jenkins.get_json_data('{}/queue'.format(jenkins_url)) |
221 | + |
222 | + for item in queue['items']: |
223 | + if 'task' in item and 'name' in item['task']: |
224 | + # we can ignore jobs that are different from the job we are |
225 | + # looking for |
226 | + if item['task']['name'] != job: |
227 | + continue |
228 | + |
229 | + # check if the required parameters are present |
230 | + if job_params and 'actions' in item and \ |
231 | + not _actions_has_all_params(item['actions'], job_params): |
232 | + continue |
233 | + |
234 | + # if we have a match and we don't need to check |
235 | + # for upstream_job then we are done |
236 | + if upstream_job is None: |
237 | + return True |
238 | + |
239 | + if _is_item_queued_by_upstream(item, |
240 | + upstream_job, |
241 | + upstream_params): |
242 | + return True |
243 | + |
244 | + return False |
245 | + |
246 | + |
247 | +def is_downstream_job_running(downstream_project, |
248 | + upstream_job, upstream_params): |
249 | + """Check if downstream job is running. |
250 | + It returns True only if the downstream_project was triggered by |
251 | + upstream_job and upstream_job has upstream_params (e.g. a given merge |
252 | + proposal). |
253 | + |
254 | + This is unfortunately not 100% reliable. For some reason the json request |
255 | + in (&tree=builds[number,url,...]) sometimes doesn't return what it should |
256 | + (maybe the jenkins needs to update some internal state). Once the |
257 | + relationship between jobs is somehow recorded the subsequent calls always |
258 | + return them correctly. |
259 | + |
260 | + :param downstream_project: name of the jenkins downstream_project (e.g. |
261 | + generic-land) |
262 | + :param upstream_job: name of the upstream_job (e.g. unity-autolanding) |
263 | + :param upstream_params: dictionary of the parameters (e.g. |
264 | + {'merge_proposal': 'http://...'}) |
265 | + """ |
266 | + json_jenkins = get_json_jenkins() |
267 | + jenkins_url = get_config_option('jenkins_url') |
268 | + json_request = \ |
269 | + '/api/json?depth=2&tree=builds[number,url,building,' + \ |
270 | + 'actions[causes[upstreamBuild,upstreamProject]]],' + \ |
271 | + 'upstreamProjects[name,builds[number,actions[parameters[name,value]]]]' |
272 | + |
273 | + request = '{}job/{}{}'.format( |
274 | + jenkins_url, downstream_project, json_request) |
275 | + |
276 | + data = json_jenkins.get_json_data(request, append_api=False) |
277 | + for build in data['builds']: |
278 | + if 'actions' not in build: |
279 | + continue |
280 | + for action in build['actions']: |
281 | + if 'causes' not in action: |
282 | + continue |
283 | + for cause in action['causes']: |
284 | + upstream_project = None |
285 | + if isinstance(cause, dict) and 'upstreamProject' in cause: |
286 | + upstream_project = cause['upstreamProject'] |
287 | + upstream_build = cause['upstreamBuild'] |
288 | + |
289 | + if build['building'] and upstream_project == upstream_job: |
290 | + #now we need to find the builds of the right upstream_job |
291 | + #in data['upstreamProjects'] |
292 | + builds = {} |
293 | + for upstream_project in data['upstreamProjects']: |
294 | + if upstream_project['name'] == upstream_job: |
295 | + builds = upstream_project['builds'] |
296 | + |
297 | + #find the upstream build in the data |
298 | + for upstreamBuild in builds: |
299 | + if upstreamBuild['number'] != upstream_build: |
300 | + continue |
301 | + if _actions_has_all_params(upstreamBuild['actions'], |
302 | + upstream_params): |
303 | + return True |
304 | + return False |
305 | + |
306 | + |
307 | +def is_job_or_downstream_building(job, job_params): |
308 | + """Check if a job (with given parameters) or its downstream job is |
309 | + currently building or not. |
310 | + |
311 | + Usually this will be called like this: |
312 | + is_job_or_downstream_building( |
313 | + 'my-project-ci', |
314 | + {'merge_proposal': 'http://my-merge-proposal/...'}) |
315 | + It will return True iff: |
316 | + - my-project-ci build is running or queued and it has the correct |
317 | + merge_proposal param |
318 | + - a downstream project (e.g. generic-land) is running (or queued) and |
319 | + my-project-ci which triggered the downstream has the right |
320 | + merge_proposal param |
321 | + |
322 | + This is unfortunately not 100% reliable because is_downstream_job_running |
323 | + is not 100% reliable. See the docstring there for more info. |
324 | + |
325 | + :param job: name of the jenkins job |
326 | + :param job_params: parameters of the jenkins job we want to check |
327 | + """ |
328 | + |
329 | + json_jenkins = get_json_jenkins() |
330 | + jenkins_url = get_config_option('jenkins_url') |
331 | + |
332 | + # get the jenkins queue. This is to prevent a case when a job is queued |
333 | + # while we are performing the other checks. It might be the other way |
334 | + # round (i.e. a job will get out of the queue and finish its execution) |
335 | + # and our result won't be correct. Doing it this way is however safer. The |
336 | + # worst thing that can happen is that our job will be scheduled with the |
337 | + # next poll |
338 | + queue = json_jenkins.get_json_data('{}/queue'.format(jenkins_url)) |
339 | + |
340 | + # first check if a job with a given name and given params isn't already |
341 | + # running |
342 | + if get_running_builds(job, job_params): |
343 | + logger.debug('{} is currently building'.format(job)) |
344 | + return True |
345 | + |
346 | + # next check if such job (including the params) is not queued |
347 | + if is_job_queued(job, queue, job_params=job_params): |
348 | + logger.debug('{} is currently queued'.format(job)) |
349 | + return True |
350 | + |
351 | + # now do similar checks like the above for the (directly) downstream |
352 | + # projects |
353 | + downstream_projects = get_downstream_projects( |
354 | + json_jenkins, '{}/job/{}'.format(jenkins_url, job)) |
355 | + |
356 | + for downstream_project in downstream_projects: |
357 | + # first check the queue; If the downstream_project is in the queue |
358 | + # and was triggered by job then we are done |
359 | + if is_job_queued(downstream_project, queue, upstream_job=job, |
360 | + upstream_params=job_params): |
361 | + logger.debug('Downstream project "{}" is queued.'.format( |
362 | + downstream_project)) |
363 | + return True |
364 | + |
365 | + # build is not queued. Lets check if it is not running |
366 | + if is_downstream_job_running(downstream_project, job, job_params): |
367 | + logger.debug('Downstream project "{}" is building'.format( |
368 | + downstream_project)) |
369 | + return True |
370 | + |
371 | + return False |
372 | + |
373 | + |
374 | def is_job_published(job): |
375 | if not job: |
376 | logger.debug('is_job_published: no job specified') |
377 | @@ -353,14 +661,11 @@ |
378 | else: |
379 | logger.debug('Doing a regular build') |
380 | |
381 | - review = MergeProposalReview(mp) |
382 | try: |
383 | logger.debug('Starting job: ' + str(jenkins_params)) |
384 | - review.lock(jenkins_job) |
385 | j.build_job(jenkins_job, jenkins_params) |
386 | except jenkins.JenkinsException: |
387 | logger.debug('Starting a job failed') |
388 | - review.unlock() |
389 | message = "ERROR: failed to start a jenkins job" |
390 | mp.createComment( |
391 | review_type=get_config_option('launchpad_review_type'), |
392 | @@ -498,12 +803,12 @@ |
393 | if not launchpadutils.user_allowed_to_trigger_jobs(mp.registrant): |
394 | continue |
395 | |
396 | - if launchpadutils.testing_in_progress(mp): |
397 | - continue |
398 | - |
399 | if launchpadutils.latest_candidate_validated(launchpad_user, mp): |
400 | continue |
401 | |
402 | + if launchpadutils.testing_in_progress(mp, jenkins_job): |
403 | + continue |
404 | + |
405 | ret = start_jenkins_job(launchpad_user, jenkins_url, jenkins_job, mp) |
406 | result = ret and result |
407 | return result |
408 | @@ -544,13 +849,13 @@ |
409 | launchpadutils.LaunchpadVote.NEEDS_FIXING) |
410 | continue |
411 | |
412 | - if launchpadutils.testing_in_progress(mp): |
413 | - continue |
414 | - |
415 | if launchpadutils.unapproved_prerequisite_exists(mp): |
416 | logger.debug('Unapproved prerequisite exists for ' + str(mp)) |
417 | continue |
418 | |
419 | + if launchpadutils.testing_in_progress(mp, jenkins_job): |
420 | + continue |
421 | + |
422 | logger.debug('Going to trigger merge: ' + str(mp.source_branch)) |
423 | |
424 | launchpad_user = lp_handle.people( |
425 | |
426 | === modified file 'jlp/launchpadutils.py' |
427 | --- jlp/launchpadutils.py 2013-05-10 14:10:52 +0000 |
428 | +++ jlp/launchpadutils.py 2013-06-05 09:44:54 +0000 |
429 | @@ -1,6 +1,5 @@ |
430 | import re |
431 | from jlp import logger, get_config_option |
432 | -from .mergeproposalreview import MergeProposalReview |
433 | import jenkinsutils |
434 | from . import get_json_jenkins |
435 | |
436 | @@ -42,8 +41,6 @@ |
437 | mp.createComment( |
438 | review_type=get_config_option('launchpad_review_type'), |
439 | subject=subject, content=message, vote=vote) |
440 | - review = MergeProposalReview(mp) |
441 | - review.unlock() |
442 | else: |
443 | mp.createComment( |
444 | review_type=get_config_option('launchpad_review_type'), |
445 | @@ -172,17 +169,17 @@ |
446 | return False |
447 | |
448 | |
449 | -def testing_in_progress(mp): |
450 | +def testing_in_progress(mp, jenkins_job): |
451 | """ Returns whether testing is in progress. |
452 | |
453 | - We are using a file (represented by MergeProposalReview class) to indicate |
454 | - a testing is in progress. This file is created when the jenkins job is |
455 | - started and cleaned up either by generic-update-mp job or generic-land. |
456 | + This method queries jenkins to find out if we are not running a similar job |
457 | + (or downstream such as generic_land) already. |
458 | |
459 | :param mp: launchpad merge proposal handle |
460 | + :param jenkins_job: name of the jenkins job that is going to be triggered |
461 | """ |
462 | - review = MergeProposalReview(mp) |
463 | - if review.is_locked(): |
464 | + if jenkinsutils.is_job_or_downstream_building( |
465 | + jenkins_job, job_params={'merge_proposal': mp.web_link}): |
466 | logger.debug('Skipping this MP. It is currently being ' + |
467 | 'tested by Jenkins.') |
468 | return True |
469 | @@ -279,8 +276,6 @@ |
470 | mp.createComment(review_type=get_config_option('launchpad_review_type'), |
471 | vote=LaunchpadVote.APPROVE, subject=state, |
472 | content=content) |
473 | - review = MergeProposalReview(mp) |
474 | - review.unlock() |
475 | |
476 | |
477 | def disapprove_mp(mp, revision, build_url, reason=None): |
478 | @@ -305,8 +300,6 @@ |
479 | mp.createComment(review_type=get_config_option('launchpad_review_type'), |
480 | vote=LaunchpadVote.NEEDS_FIXING, subject=subject, |
481 | content=content) |
482 | - review = MergeProposalReview(mp) |
483 | - review.unlock() |
484 | |
485 | |
486 | class UpdateMergeProposalException(Exception): |
487 | |
488 | === removed file 'jlp/mergeproposalreview.py' |
489 | --- jlp/mergeproposalreview.py 2013-05-10 14:10:52 +0000 |
490 | +++ jlp/mergeproposalreview.py 1970-01-01 00:00:00 +0000 |
491 | @@ -1,55 +0,0 @@ |
492 | -import os |
493 | -import hashlib |
494 | -from jlp import logger, get_config_option |
495 | - |
496 | - |
497 | -class MergeProposalReview(): |
498 | - _mp = None |
499 | - |
500 | - def __init__(self, mp): |
501 | - self._mp = mp |
502 | - |
503 | - @property |
504 | - def mp(self): |
505 | - """Merge proposal.""" |
506 | - |
507 | - return self._mp |
508 | - |
509 | - def _get_hash(self): |
510 | - return hashlib.sha224(self.mp.web_link).hexdigest() |
511 | - |
512 | - def get_lockfile(self): |
513 | - return '{dir}/{hash}'.format( |
514 | - dir=get_config_option('launchpadlocks_dir'), |
515 | - hash=self._get_hash()) |
516 | - |
517 | - def lock(self, job_name): |
518 | - locks_dir = get_config_option('launchpadlocks_dir') |
519 | - if not os.path.exists(locks_dir): |
520 | - os.makedirs(locks_dir) |
521 | - lockfile = open(self.get_lockfile(), 'w') |
522 | - lock_info = "{jenkins_job}\n{merge_proposal}\n" |
523 | - lock_info = lock_info.format(jenkins_job=job_name, |
524 | - merge_proposal=self.mp.web_link) |
525 | - lockfile.write(lock_info) |
526 | - lockfile.close() |
527 | - |
528 | - def unlock(self): |
529 | - try: |
530 | - os.remove(self.get_lockfile()) |
531 | - return True |
532 | - except OSError as error: |
533 | - if error.errno == 2: |
534 | - logger.debug("mp lock doesn't exists. Ignoring this unlock.") |
535 | - return False |
536 | - |
537 | - def get_jenkins_job(self): |
538 | - if self.is_locked(): |
539 | - lockfile = open(self.get_lockfile(), 'r') |
540 | - job_name = lockfile.readline().rstrip() |
541 | - return job_name |
542 | - else: |
543 | - return None |
544 | - |
545 | - def is_locked(self): |
546 | - return os.path.exists(self.get_lockfile()) |
547 | |
548 | === removed file 'tests/test_MergeProposalReview.py' |
549 | --- tests/test_MergeProposalReview.py 2013-04-26 11:24:46 +0000 |
550 | +++ tests/test_MergeProposalReview.py 1970-01-01 00:00:00 +0000 |
551 | @@ -1,29 +0,0 @@ |
552 | -import unittest |
553 | -from jlp import MergeProposalReview |
554 | -from mock import MagicMock |
555 | - |
556 | - |
557 | -class MergeProposalReviewTestCase(unittest.TestCase): |
558 | - review = None |
559 | - job_name = 'test-job' |
560 | - |
561 | - def setUp(self): |
562 | - mp = MagicMock() |
563 | - mp.web_link = 'http://my-example-url.com' |
564 | - self.review = MergeProposalReview(mp) |
565 | - |
566 | - def test_is_unlocked(self): |
567 | - self.review.unlock() |
568 | - self.assertFalse(self.review.is_locked()) |
569 | - |
570 | - def test_is_locked(self): |
571 | - self.review.lock(self.job_name) |
572 | - self.assertTrue(self.review.is_locked()) |
573 | - |
574 | - def test_get_job_name(self): |
575 | - self.review.lock(self.job_name) |
576 | - self.assertEquals(self.review.get_jenkins_job(), self.job_name) |
577 | - |
578 | - def test_get_job_name_with_unlocked_mp(self): |
579 | - self.review.unlock() |
580 | - self.assertEquals(self.review.get_jenkins_job(), None) |
581 | |
582 | === modified file 'tests/test_autoland.py' |
583 | --- tests/test_autoland.py 2013-05-17 15:53:01 +0000 |
584 | +++ tests/test_autoland.py 2013-06-05 09:44:54 +0000 |
585 | @@ -172,16 +172,10 @@ |
586 | self.branch_patch = patch('jlp.commands.autoland.Branch', new=branch) |
587 | self.target = target |
588 | self.Branch = self.branch_patch.start() |
589 | - self.MergeProposalReview = MagicMock() |
590 | - self.merge_proposal_review_patch = patch( |
591 | - 'jlp.commands.autoland.MergeProposalReview', |
592 | - new=MagicMock(return_value=self.MergeProposalReview)) |
593 | - self.merge_proposal_review_patch.start() |
594 | |
595 | def tearDown(self): |
596 | super(TestAutolandForMergeAndCommit, self).tearDown() |
597 | self.branch_patch.stop() |
598 | - self.merge_proposal_review_patch.stop() |
599 | |
600 | |
601 | class TestAutolandMergeAndCommit(TestAutolandForMergeAndCommit): |
602 | @@ -194,7 +188,6 @@ |
603 | self.assertTrue(self.target.merge.call_count == 1) |
604 | self.assertTrue(self.target.commit.call_count == 1) |
605 | self.assertTrue(close_bugs_mock.call_count == 1) |
606 | - self.MergeProposalReview.unlock.assert_called_once_with() |
607 | |
608 | |
609 | class TestAutolandVoting(TestAutolandForMergeAndCommit): |
610 | @@ -244,7 +237,6 @@ |
611 | self.assertTrue(self.target.merge.call_count == 1) |
612 | self.assertTrue(self.target.commit.call_count == 1) |
613 | self.assertTrue(close_bugs_mock.call_count == 1) |
614 | - self.MergeProposalReview.unlock.assert_called_once_with() |
615 | |
616 | |
617 | class TestAutolandWithMultipleDputs(TestAutolandForMergeAndCommit): |
618 | @@ -326,7 +318,6 @@ |
619 | self.assertTrue(self.target.merge.call_count == 1) |
620 | self.assertTrue(self.target.commit.call_count == 1) |
621 | self.assertTrue(close_bugs_mock.call_count == 1) |
622 | - self.MergeProposalReview.unlock.assert_called_once_with() |
623 | self.assertTrue(self.dputRunner.dput.call_count == 1) |
624 | |
625 | |
626 | @@ -347,15 +338,11 @@ |
627 | super(TestAutolandMergeAndCommitMergeConflict, self).tearDown() |
628 | self.branch_patch.stop() |
629 | |
630 | - @patch('jlp.commands.autoland.MergeProposalReview') |
631 | - def test_merge_and_commit_merge_conflict(self, MergeProposalReview): |
632 | + def test_merge_and_commit_merge_conflict(self): |
633 | sys_argv = ['autoland.py', '-r', 'PASSED', |
634 | '-v', '123', '-m', 'url'] |
635 | - review = MagicMock() |
636 | - MergeProposalReview.return_value = review |
637 | with patch('sys.argv', sys_argv): |
638 | self.assertTrue(autoland.autoland() == 1) |
639 | - self.assertEqual(review.unlock.call_count, 1) |
640 | |
641 | |
642 | class TestAutolandMergeAndCommitPushFailed(TestAutoland): |
643 | @@ -374,15 +361,11 @@ |
644 | super(TestAutolandMergeAndCommitPushFailed, self).tearDown() |
645 | self.branch_patch.stop() |
646 | |
647 | - @patch('jlp.commands.autoland.MergeProposalReview') |
648 | - def test_merge_and_commit_push_error(self, MergeProposalReview): |
649 | + def test_merge_and_commit_push_error(self): |
650 | sys_argv = ['autoland.py', '-r', 'PASSED', |
651 | '-v', '123', '-m', 'url'] |
652 | - review = MagicMock() |
653 | - MergeProposalReview.return_value = review |
654 | with patch('sys.argv', sys_argv): |
655 | self.assertTrue(autoland.autoland() == 1) |
656 | - self.assertEqual(review.unlock.call_count, 1) |
657 | |
658 | |
659 | class TestAutolandMergeAndCommitLockFailed(TestAutoland): |
660 | @@ -401,15 +384,11 @@ |
661 | super(TestAutolandMergeAndCommitLockFailed, self).tearDown() |
662 | self.branch_patch.stop() |
663 | |
664 | - @patch('jlp.commands.autoland.MergeProposalReview') |
665 | - def test_merge_and_commit_lock_failed(self, MergeProposalReview): |
666 | + def test_merge_and_commit_lock_failed(self): |
667 | sys_argv = ['autoland.py', '-r', 'PASSED', |
668 | '-v', '123', '-m', 'url'] |
669 | - review = MagicMock() |
670 | - MergeProposalReview.return_value = review |
671 | with patch('sys.argv', sys_argv): |
672 | self.assertTrue(autoland.autoland() == 1) |
673 | - self.assertEqual(review.unlock.call_count, 1) |
674 | |
675 | |
676 | class TestAutolandMergeAndCommitUnapprovedChange(TestAutoland): |
677 | |
678 | === modified file 'tests/test_jenkinsutils.py' |
679 | --- tests/test_jenkinsutils.py 2013-05-20 11:44:39 +0000 |
680 | +++ tests/test_jenkinsutils.py 2013-06-05 09:44:54 +0000 |
681 | @@ -70,6 +70,476 @@ |
682 | "%s" % (err.exception)) |
683 | |
684 | |
685 | +class TestIsJobOrDownstreamBuilding(unittest.TestCase): |
686 | + def setUp(self): |
687 | + self.get_config_option_patch = patch( |
688 | + 'jlp.jenkinsutils.get_config_option', new=lambda option: '') |
689 | + self.get_config_option_patch.start() |
690 | + self.get_json_jenkins = self.get_json_jenkins_patch = patch( |
691 | + 'jlp.jenkinsutils.get_json_jenkins') |
692 | + self.get_json_jenkins_patch.start() |
693 | + |
694 | + def tearDown(self): |
695 | + self.get_config_option_patch.stop() |
696 | + self.get_json_jenkins_patch.stop() |
697 | + |
698 | + @patch('jlp.jenkinsutils.get_running_builds', |
699 | + new=lambda job, job_params: [1]) |
700 | + def test_job_is_building(self): |
701 | + self.assertTrue( |
702 | + jenkinsutils.is_job_or_downstream_building('job', {})) |
703 | + |
704 | + @patch('jlp.jenkinsutils.get_running_builds', |
705 | + new=lambda job, job_params: []) |
706 | + @patch('jlp.jenkinsutils.is_job_queued', |
707 | + new=lambda job, queue, job_params: True) |
708 | + def test_job_is_queued(self): |
709 | + self.assertTrue( |
710 | + jenkinsutils.is_job_or_downstream_building('job', {})) |
711 | + |
712 | + @patch('jlp.jenkinsutils.get_running_builds', |
713 | + new=lambda x, job_params: []) |
714 | + @patch('jlp.jenkinsutils.get_downstream_projects', new=lambda x, z: [1]) |
715 | + def test_downstream_job_is_queued(self): |
716 | + def is_job_queued(job, queue, job_params={}, upstream_job=None, |
717 | + upstream_params={}): |
718 | + if upstream_job: |
719 | + return True |
720 | + return False |
721 | + with patch('jlp.jenkinsutils.is_job_queued', new=is_job_queued): |
722 | + self.assertTrue(jenkinsutils.is_job_or_downstream_building( |
723 | + 'job', {})) |
724 | + |
725 | + |
726 | +class TestIsJobOrDownstreamBuildingScenarios(TestWithScenarios): |
727 | + scenarios = [ |
728 | + ('downstream_job_building_no_params_check', |
729 | + {'expected': True, |
730 | + 'job_params': {}, |
731 | + 'jenkins_data': { |
732 | + 'builds': [{ |
733 | + 'building': True, |
734 | + 'actions': [{ |
735 | + 'causes': [{ |
736 | + 'upstreamProject': 'job', |
737 | + 'upstreamBuild': 22 |
738 | + }] |
739 | + }], |
740 | + 'result': None}], |
741 | + 'upstreamProjects': [{ |
742 | + 'name': 'job', |
743 | + 'builds': [{ |
744 | + 'number': 22, |
745 | + 'actions': [{'parameters': [{ |
746 | + 'name': 'merge_proposal', |
747 | + 'value': 'http://mp3'}]}]}]}]}}), |
748 | + ('downstream_job_building_with_params_check', |
749 | + {'expected': True, |
750 | + 'job_params': {'merge_proposal': 'http://mp'}, |
751 | + 'jenkins_data': { |
752 | + 'builds': [{ |
753 | + 'building': True, |
754 | + 'actions': [{ |
755 | + 'causes': [{ |
756 | + 'upstreamProject': 'job', |
757 | + 'upstreamBuild': 22 |
758 | + }] |
759 | + }], |
760 | + 'result': None}], |
761 | + 'upstreamProjects': [{ |
762 | + 'name': 'job', |
763 | + 'builds': [ |
764 | + { |
765 | + 'number': 23, |
766 | + 'actions': [{'parameters': [{ |
767 | + 'name': 'merge_proposal', |
768 | + 'value': 'http://different-mp'}]}]}, |
769 | + { |
770 | + 'number': 22, |
771 | + 'actions': [{'parameters': [{ |
772 | + 'name': 'merge_proposal', |
773 | + 'value': 'http://mp'}]}]} |
774 | + ]}]}}), |
775 | + ('downstream_job_building_with_different_mp', |
776 | + {'expected': False, |
777 | + 'job_params': {'merge_proposal': 'http://mp'}, |
778 | + 'jenkins_data': { |
779 | + 'builds': [{ |
780 | + 'building': True, |
781 | + 'actions': [{ |
782 | + 'causes': [{ |
783 | + 'upstreamProject': 'job', |
784 | + 'upstreamBuild': 22 |
785 | + }] |
786 | + }], |
787 | + 'result': None}], |
788 | + 'upstreamProjects': [{ |
789 | + 'name': 'job', |
790 | + 'builds': [ |
791 | + { |
792 | + 'number': 22, |
793 | + 'actions': [{'parameters': [{ |
794 | + 'name': 'merge_proposal', |
795 | + 'value': 'http://different-mp'}]}]} |
796 | + ]}]}}), |
797 | + |
798 | + ('no_action', |
799 | + {'expected': False, |
800 | + 'job_params': {}, |
801 | + 'jenkins_data': { |
802 | + 'builds': [{ |
803 | + 'result': None |
804 | + }]}}), |
805 | + ('different_upstream', |
806 | + {'expected': False, |
807 | + 'job_params': {}, |
808 | + 'jenkins_data': { |
809 | + 'builds': [{ |
810 | + 'building': True, |
811 | + 'actions': [{ |
812 | + 'causes': [{ |
813 | + 'upstreamProject': 'different_job', |
814 | + 'upstreamBuild': 23 |
815 | + }] |
816 | + }], |
817 | + 'result': None |
818 | + }]}}), |
819 | + ('no_causes', |
820 | + {'expected': False, |
821 | + 'job_params': {}, |
822 | + 'jenkins_data': { |
823 | + 'builds': [{ |
824 | + 'actions': [{ |
825 | + }], |
826 | + 'result': None |
827 | + }]}}), |
828 | + ('no_upstream', |
829 | + {'expected': False, |
830 | + 'job_params': {}, |
831 | + 'jenkins_data': { |
832 | + 'builds': [{ |
833 | + 'building': True, |
834 | + 'actions': [{ |
835 | + 'causes': [{ |
836 | + }] |
837 | + }], |
838 | + 'result': None |
839 | + }]}}), |
840 | + ] |
841 | + |
842 | + @patch('jlp.jenkinsutils.get_config_option', new=lambda x: '') |
843 | + @patch('jlp.jenkinsutils.get_running_builds', |
844 | + new=lambda job, job_params={}: []) |
845 | + @patch('jlp.jenkinsutils.get_downstream_projects', |
846 | + new=lambda x, z: ['downstream_job']) |
847 | + @patch('jlp.jenkinsutils.is_job_queued', |
848 | + new=lambda x, y, upstream_job=None, |
849 | + job_params={}, upstream_params={}: False) |
850 | + def test_is_job_or_downstream_building(self): |
851 | + |
852 | + jenkins = MagicMock() |
853 | + jenkins.get_json_data = lambda x, append_api=True: self.jenkins_data |
854 | + with patch('jlp.jenkinsutils.get_json_jenkins', |
855 | + new=MagicMock(return_value=jenkins)): |
856 | + self.assertEquals( |
857 | + jenkinsutils.is_job_or_downstream_building( |
858 | + 'job', self.job_params), |
859 | + self.expected) |
860 | + |
861 | + |
862 | +class TestGetRunningBuilds(TestWithScenarios): |
863 | + scenarios = [ |
864 | + ('single_job_running', |
865 | + {'expected': [1], |
866 | + 'job_params': {}, |
867 | + 'jenkins_data': { |
868 | + 'builds': [{ |
869 | + 'building': True, |
870 | + 'number': 1, |
871 | + 'actions': [] |
872 | + }]}}), |
873 | + ('single_job_running_different_params', |
874 | + {'expected': [], |
875 | + 'job_params': {'merge_proposal': 'http://mp'}, |
876 | + 'jenkins_data': { |
877 | + 'builds': [{ |
878 | + 'building': True, |
879 | + 'number': 1, |
880 | + 'actions': [{ |
881 | + 'parameters': [{ |
882 | + 'name': 'merge_proposal', |
883 | + 'value': 'http://different-mp' |
884 | + }]}] |
885 | + }]}}), |
886 | + ('single_job_running_same_params', |
887 | + {'expected': [1], |
888 | + 'job_params': {'merge_proposal': 'http://mp'}, |
889 | + 'jenkins_data': { |
890 | + 'builds': [{ |
891 | + 'building': True, |
892 | + 'number': 1, |
893 | + 'actions': [{ |
894 | + 'parameters': [{ |
895 | + 'name': 'merge_proposal', |
896 | + 'value': 'http://mp' |
897 | + }]}] |
898 | + }]}}), |
899 | + ('two_jobs_running', |
900 | + {'expected': [1, 2], |
901 | + 'job_params': {}, |
902 | + 'jenkins_data': { |
903 | + 'builds': [{ |
904 | + 'building': True, |
905 | + 'actions': [], |
906 | + 'number': 1}, |
907 | + {'building': True, |
908 | + 'actions': [], |
909 | + 'number': 2} |
910 | + ]}}), |
911 | + ('two_jobs_running_only_one_matches', |
912 | + {'expected': [2], |
913 | + 'job_params': {'merge_proposal': 'http://mp'}, |
914 | + 'jenkins_data': { |
915 | + 'builds': [{ |
916 | + 'building': True, |
917 | + 'actions': [], |
918 | + 'number': 1}, |
919 | + {'building': True, |
920 | + 'actions': [{ |
921 | + 'parameters': [{ |
922 | + 'name': 'merge_proposal', |
923 | + 'value': 'http://mp' |
924 | + }]}], |
925 | + 'number': 2} |
926 | + ]}}), |
927 | + ('one_job_running_out_of_two', |
928 | + {'expected': [2], |
929 | + 'job_params': {}, |
930 | + 'jenkins_data': { |
931 | + 'builds': [{ |
932 | + 'building': False, |
933 | + 'actions': [], |
934 | + 'number': 1}, |
935 | + {'building': True, |
936 | + 'actions': [], |
937 | + 'number': 2} |
938 | + ]}}), |
939 | + ('no_job_running', |
940 | + {'expected': [], |
941 | + 'job_params': {}, |
942 | + 'jenkins_data': { |
943 | + 'builds': [{ |
944 | + 'building': False, |
945 | + 'actions': [], |
946 | + 'number': 1}, |
947 | + {'building': False, |
948 | + 'actions': [], |
949 | + 'number': 2} |
950 | + ]}}), |
951 | + ('no_builds', |
952 | + {'expected': [], |
953 | + 'job_params': {}, |
954 | + 'jenkins_data': {}}), |
955 | + ] |
956 | + |
957 | + @patch('jlp.jenkinsutils.get_config_option', new=lambda x: '') |
958 | + def test_get_running_builds(self): |
959 | + |
960 | + jenkins = MagicMock() |
961 | + jenkins.get_json_data = lambda x, append_api=True: self.jenkins_data |
962 | + with patch('jlp.jenkinsutils.get_json_jenkins', |
963 | + new=MagicMock(return_value=jenkins)): |
964 | + self.assertEquals( |
965 | + jenkinsutils.get_running_builds('job', self.job_params), |
966 | + self.expected) |
967 | + |
968 | + |
969 | +class TestIsJobQueued(TestWithScenarios): |
970 | + scenarios = [ |
971 | + ('no_items', |
972 | + {'expected': False, |
973 | + 'upstream_job': None, |
974 | + 'job_params': {}, |
975 | + 'queue': {'items': []}}), |
976 | + ('different_job_in_queue', |
977 | + {'expected': False, |
978 | + 'upstream_job': None, |
979 | + 'job_params': {}, |
980 | + 'queue': {'items': [{ |
981 | + 'task': {'name': 'different-job'} |
982 | + }]}}), |
983 | + ('no_upstream_job_and_match', |
984 | + {'expected': True, |
985 | + 'upstream_job': None, |
986 | + 'job_params': {}, |
987 | + 'queue': {'items': [{ |
988 | + 'task': {'name': 'job'} |
989 | + }]}}), |
990 | + ('no_upstream_job_and_param_match', |
991 | + {'expected': True, |
992 | + 'upstream_job': None, |
993 | + 'job_params': {'merge_proposal': 'http://mp'}, |
994 | + 'queue': {'items': [{ |
995 | + 'task': {'name': 'job'}, |
996 | + 'actions': [{ |
997 | + 'parameters': [{ |
998 | + 'name': 'merge_proposal', |
999 | + 'value': 'http://mp'}]}] |
1000 | + }]}}), |
1001 | + ('no_upstream_job_and_param_mismatch', |
1002 | + {'expected': False, |
1003 | + 'upstream_job': None, |
1004 | + 'job_params': {'merge_proposal': 'http://mp'}, |
1005 | + 'queue': {'items': [{ |
1006 | + 'task': {'name': 'job'}, |
1007 | + 'actions': [{ |
1008 | + 'parameters': [{ |
1009 | + 'name': 'merge_proposal', |
1010 | + 'value': 'http://different-mp'}]}] |
1011 | + }]}}), |
1012 | + ('no_actions', |
1013 | + {'expected': False, |
1014 | + 'upstream_job': 'upstream-job', |
1015 | + 'job_params': {}, |
1016 | + 'queue': {'items': [{ |
1017 | + 'task': {'name': 'job'} |
1018 | + }]}}), |
1019 | + ('no_causes', |
1020 | + {'expected': False, |
1021 | + 'upstream_job': 'upstream-job', |
1022 | + 'job_params': {}, |
1023 | + 'queue': {'items': [{ |
1024 | + 'task': {'name': 'job'}, |
1025 | + 'actions': [{}] |
1026 | + }]}}), |
1027 | + ('triggered_by_upstream', |
1028 | + {'expected': True, |
1029 | + 'upstream_job': 'upstream-job', |
1030 | + 'job_params': {}, |
1031 | + 'queue': {'items': [{ |
1032 | + 'task': {'name': 'job'}, |
1033 | + 'actions': [{ |
1034 | + 'causes': [{ |
1035 | + 'upstreamProject': 'upstream-job', |
1036 | + 'upstreamBuild': 22, |
1037 | + }] |
1038 | + }] |
1039 | + }]}}), |
1040 | + ('triggered_by_upstream_with_the_right_mp', |
1041 | + {'expected': True, |
1042 | + 'upstream_job': 'upstream-job', |
1043 | + 'job_params': {}, |
1044 | + 'upstream_params': {'merge_proposal': 'http://mp'}, |
1045 | + 'get_build_actions': [{ |
1046 | + 'parameters': [{ |
1047 | + 'name': 'merge_proposal', |
1048 | + 'value': 'http://mp' |
1049 | + }]}], |
1050 | + 'queue': {'items': [{ |
1051 | + 'task': {'name': 'job'}, |
1052 | + 'actions': [{ |
1053 | + 'causes': [{ |
1054 | + 'upstreamProject': 'upstream-job', |
1055 | + 'upstreamBuild': 22, |
1056 | + }] |
1057 | + }] |
1058 | + }]}}), |
1059 | + ('triggered_by_upstream_with_no_params', |
1060 | + {'expected': False, |
1061 | + 'upstream_job': 'upstream-job', |
1062 | + 'job_params': {}, |
1063 | + 'upstream_params': {'merge_proposal': 'http://mp'}, |
1064 | + 'get_build_actions': [{}], |
1065 | + 'queue': {'items': [{ |
1066 | + 'task': {'name': 'job'}, |
1067 | + 'actions': [{ |
1068 | + 'causes': [{ |
1069 | + 'upstreamProject': 'upstream-job', |
1070 | + 'upstreamBuild': 22, |
1071 | + }] |
1072 | + }] |
1073 | + }]}}), |
1074 | + |
1075 | + ('triggered_by_upstream_but_different_mp', |
1076 | + {'expected': False, |
1077 | + 'upstream_job': 'upstream-job', |
1078 | + 'job_params': {}, |
1079 | + 'upstream_params': {'merge_proposal': 'http://mp'}, |
1080 | + 'get_build_actions': [{ |
1081 | + 'parameters': [{ |
1082 | + 'name': 'merge_proposal', |
1083 | + 'value': 'http://different-mp' |
1084 | + }]}], |
1085 | + 'queue': {'items': [{ |
1086 | + 'task': {'name': 'job'}, |
1087 | + 'actions': [{ |
1088 | + 'causes': [{ |
1089 | + 'upstreamProject': 'upstream-job', |
1090 | + 'upstreamBuild': 22, |
1091 | + }] |
1092 | + }] |
1093 | + }]}}), |
1094 | + ('triggered_by_different_upstream', |
1095 | + {'expected': False, |
1096 | + 'upstream_job': 'upstream-job', |
1097 | + 'job_params': {}, |
1098 | + 'queue': {'items': [{ |
1099 | + 'task': {'name': 'job'}, |
1100 | + 'actions': [{ |
1101 | + 'causes': [{ |
1102 | + 'upstreamProject': 'different-upstream-job', |
1103 | + 'upstreamBuild': 23 |
1104 | + }] |
1105 | + }] |
1106 | + }]}}), |
1107 | + ('no_queue_provided', |
1108 | + {'expected': False, |
1109 | + 'upstream_job': 'upstream-job', |
1110 | + 'job_params': {}, |
1111 | + 'queue': None}), |
1112 | + |
1113 | + ] |
1114 | + |
1115 | + @patch('jlp.jenkinsutils._get_build_actions') |
1116 | + @patch('jlp.jenkinsutils.get_json_jenkins') |
1117 | + def test_is_job_queued(self, get_json_jenkins, get_build_actions): |
1118 | + json_jenkins = MagicMock() |
1119 | + json_jenkins.get_json_data.return_value = {'items': []} |
1120 | + get_json_jenkins.return_value = json_jenkins |
1121 | + |
1122 | + if 'get_build_actions' in dir(self): |
1123 | + get_build_actions.return_value = self.get_build_actions |
1124 | + if 'upstream_params' in dir(self): |
1125 | + upstream_params = self.upstream_params |
1126 | + else: |
1127 | + upstream_params = {} |
1128 | + self.assertEquals( |
1129 | + jenkinsutils.is_job_queued( |
1130 | + 'job', queue=self.queue, job_params=self.job_params, |
1131 | + upstream_job=self.upstream_job, |
1132 | + upstream_params=upstream_params), |
1133 | + self.expected) |
1134 | + |
1135 | + |
1136 | +class TestGetBuildActions(unittest.TestCase): |
1137 | + @patch('jlp.jenkinsutils.get_json_jenkins') |
1138 | + def test_no_actions(self, get_json_jenkins): |
1139 | + json_jenkins = MagicMock() |
1140 | + json_jenkins.get_json_data.return_value = {} |
1141 | + get_json_jenkins.return_value = json_jenkins |
1142 | + self.assertEquals(jenkinsutils._get_build_actions('job', 23), |
1143 | + []) |
1144 | + |
1145 | + @patch('jlp.jenkinsutils.get_json_jenkins') |
1146 | + def test_actions_are_returned(self, get_json_jenkins): |
1147 | + actions = object() |
1148 | + json_jenkins = MagicMock() |
1149 | + json_jenkins.get_json_data.return_value = {'actions': actions} |
1150 | + get_json_jenkins.return_value = json_jenkins |
1151 | + ret = jenkinsutils._get_build_actions('job', 23) |
1152 | + self.assertTrue(actions == ret) |
1153 | + |
1154 | + |
1155 | class JenkinsUtilsGetDownstreamBuilds(unittest.TestCase): |
1156 | |
1157 | def setUp(self): |
1158 | @@ -663,7 +1133,8 @@ |
1159 | jenkinsutils.trigger_ci_build(lp_handle, [], 'job', 'url') |
1160 | self.assertEqual(start_jenkins_job.call_count, 0) |
1161 | |
1162 | - @patch('jlp.launchpadutils.testing_in_progress', new=lambda x: True) |
1163 | + @patch('jlp.launchpadutils.testing_in_progress', |
1164 | + new=lambda mp, jenkins_job: True) |
1165 | @patch('jlp.launchpadutils.user_allowed_to_trigger_jobs') |
1166 | @patch('jlp.jenkinsutils.start_jenkins_job') |
1167 | def test_trigger_ci_while_testing_in_progress(self, |
1168 | @@ -676,7 +1147,8 @@ |
1169 | |
1170 | @patch('jlp.launchpadutils.latest_candidate_validated', |
1171 | new=lambda x, y: True) |
1172 | - @patch('jlp.launchpadutils.testing_in_progress', new=lambda x: False) |
1173 | + @patch('jlp.launchpadutils.testing_in_progress', |
1174 | + new=lambda mp, jenkins_job: False) |
1175 | @patch('jlp.launchpadutils.user_allowed_to_trigger_jobs') |
1176 | @patch('jlp.jenkinsutils.start_jenkins_job') |
1177 | def test_trigger_ci_while_Revision_validated(self, |
1178 | @@ -690,7 +1162,8 @@ |
1179 | |
1180 | @patch('jlp.launchpadutils.latest_candidate_validated', |
1181 | new=lambda x, y: False) |
1182 | - @patch('jlp.launchpadutils.testing_in_progress', new=lambda x: False) |
1183 | + @patch('jlp.launchpadutils.testing_in_progress', |
1184 | + new=lambda mp, jenkins_job: False) |
1185 | @patch('jlp.launchpadutils.user_allowed_to_trigger_jobs') |
1186 | @patch('jlp.jenkinsutils.start_jenkins_job') |
1187 | def test_trigger_ci_while_Revision_not_validated(self, |
1188 | @@ -743,7 +1216,7 @@ |
1189 | @patch('jlp.launchpadutils.user_allowed_to_trigger_jobs', |
1190 | new=lambda x: True) |
1191 | @patch('jlp.launchpadutils.testing_in_progress', |
1192 | - new=lambda mp: True) |
1193 | + new=lambda mp, jenkins_job: True) |
1194 | @patch('jlp.jenkinsutils.start_jenkins_job') |
1195 | def test_trigger_al_while_testing_in_progress(self, start_jenkins_job): |
1196 | """When testing is in progress jenkins job must not be started""" |
1197 | @@ -756,7 +1229,7 @@ |
1198 | @patch('jlp.launchpadutils.unapproved_prerequisite_exists', |
1199 | new=lambda x: True) |
1200 | @patch('jlp.launchpadutils.testing_in_progress', |
1201 | - new=lambda x: False) |
1202 | + new=lambda x, y: False) |
1203 | @patch('jlp.launchpadutils.is_commit_message_set', |
1204 | new=lambda x, y, z: True) |
1205 | @patch('jlp.launchpadutils.user_allowed_to_trigger_jobs', |
1206 | @@ -777,7 +1250,7 @@ |
1207 | @patch('jlp.launchpadutils.unapproved_prerequisite_exists', |
1208 | new=lambda x: False) |
1209 | @patch('jlp.launchpadutils.testing_in_progress', |
1210 | - new=lambda x: False) |
1211 | + new=lambda x, y: False) |
1212 | @patch('jlp.launchpadutils.user_allowed_to_trigger_jobs', |
1213 | new=lambda x: True) |
1214 | @patch('jlp.jenkinsutils.start_jenkins_job') |
1215 | @@ -804,6 +1277,10 @@ |
1216 | self.start_jenkins_job_patch = patch( |
1217 | 'jlp.jenkinsutils.start_jenkins_job') |
1218 | self.start_jenkins_job_patch.start() |
1219 | + self.unapproved_prerequisite_exists_patch = patch( |
1220 | + 'jlp.launchpadutils.unapproved_prerequisite_exists', |
1221 | + new=lambda mp: False) |
1222 | + self.unapproved_prerequisite_exists_patch.start() |
1223 | #return True so we stop the flow; We just want to test the permissions |
1224 | #above and not the whole trigger_al_build method |
1225 | test_in_progress = MagicMock(return_value=True) |
1226 | @@ -817,6 +1294,7 @@ |
1227 | self.testing_in_progress_patch.stop() |
1228 | self.user_allowed_to_trigger_jobs_patch.stop() |
1229 | self.start_jenkins_job_patch.stop() |
1230 | + self.unapproved_prerequisite_exists_patch.stop() |
1231 | |
1232 | def test_trigger_al_with_trusted_reviewer(self): |
1233 | mp = MagicMock() |
1234 | |
1235 | === modified file 'tests/test_launchpadutils.py' |
1236 | --- tests/test_launchpadutils.py 2013-05-10 14:10:52 +0000 |
1237 | +++ tests/test_launchpadutils.py 2013-06-05 09:44:54 +0000 |
1238 | @@ -1,5 +1,5 @@ |
1239 | import unittest |
1240 | -from jlp import launchpadutils, MergeProposalReview, get_config_option |
1241 | +from jlp import launchpadutils, get_config_option |
1242 | from mock import MagicMock, patch |
1243 | from testscenarios import TestWithScenarios |
1244 | from textwrap import dedent |
1245 | @@ -301,27 +301,26 @@ |
1246 | self.assertTrue(ret) |
1247 | self._cleanup() |
1248 | |
1249 | + @patch('jlp.launchpadutils.jenkinsutils.is_job_or_downstream_building', |
1250 | + new=lambda jenkins_job, job_params: False) |
1251 | def test_testing_not_in_progress_approved(self): |
1252 | - """ If there is no MergeProposalReview lock then testing_in_progress |
1253 | + """ If there is no job running then testing_in_progress |
1254 | must return False. |
1255 | """ |
1256 | mp = MagicMock() |
1257 | mp.web_link = 'http://my-example-url.com' |
1258 | - review = MergeProposalReview(mp) |
1259 | - review.unlock() |
1260 | - self.assertFalse(launchpadutils.testing_in_progress(mp)) |
1261 | + self.assertFalse(launchpadutils.testing_in_progress(mp, 'job')) |
1262 | |
1263 | + @patch('jlp.launchpadutils.jenkinsutils.is_job_or_downstream_building', |
1264 | + new=lambda jenkins_job, job_params: True) |
1265 | def test_testing_not_in_progress_abstain(self): |
1266 | - """ If a MergeProposalReview lock exists then testing_in_progress |
1267 | - must return True and it must report the correct jenkins job which |
1268 | - holds the lock. |
1269 | + """ If there is job running |
1270 | + then testing_in_progress must return True and it must report the |
1271 | + correct jenkins job which holds the lock. |
1272 | """ |
1273 | mp = MagicMock() |
1274 | mp.web_link = 'http://my-example-url.com' |
1275 | - review = MergeProposalReview(mp) |
1276 | - review.lock('my-job') |
1277 | - self.assertTrue(launchpadutils.testing_in_progress(mp)) |
1278 | - self.assertEquals(review.get_jenkins_job(), 'my-job') |
1279 | + self.assertTrue(launchpadutils.testing_in_progress(mp, 'job')) |
1280 | |
1281 | |
1282 | class TestLatestReview(unittest.TestCase): |
1283 | |
1284 | === added file 'tests/test_socketlock.py' |
1285 | --- tests/test_socketlock.py 1970-01-01 00:00:00 +0000 |
1286 | +++ tests/test_socketlock.py 2013-06-05 09:44:54 +0000 |
1287 | @@ -0,0 +1,117 @@ |
1288 | +from testtools import TestCase |
1289 | +from testtools.matchers import Equals |
1290 | +from jlp import SocketLock |
1291 | +import socket |
1292 | +from lockfile import LockTimeout, AlreadyLocked |
1293 | +from multiprocessing import Process |
1294 | +from autopilot.matchers import Eventually |
1295 | +import sys |
1296 | +import os |
1297 | +import time |
1298 | +from jlp import get_config_option |
1299 | + |
1300 | +LOCK_STATUS_ACQUIRED = 0 |
1301 | +LOCK_STATUS_TIMEOUT = 1 |
1302 | +LOCK_STATUS_ALREADYLOCKED = 2 |
1303 | + |
1304 | + |
1305 | +def run_lock_process(timeout=None, sleeptime=5, kill_process=False): |
1306 | + try: |
1307 | + with SocketLock(get_config_option('lock_name'), timeout=timeout): |
1308 | + time.sleep(sleeptime) |
1309 | + if kill_process: |
1310 | + os.kill(os.getpid(), 9) |
1311 | + except LockTimeout: |
1312 | + sys.exit(LOCK_STATUS_TIMEOUT) |
1313 | + except AlreadyLocked: |
1314 | + sys.exit(LOCK_STATUS_ALREADYLOCKED) |
1315 | + sys.exit(LOCK_STATUS_ACQUIRED) |
1316 | + |
1317 | + |
1318 | +def is_locked(): |
1319 | + try: |
1320 | + mysocket = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM) |
1321 | + mysocket.bind('\0' + get_config_option('lock_name')) |
1322 | + mysocket.close() |
1323 | + return False |
1324 | + except socket.error: |
1325 | + return True |
1326 | + |
1327 | + |
1328 | +class TestLocking(TestCase): |
1329 | + |
1330 | + def setUp(self): |
1331 | + super(TestLocking, self).setUp() |
1332 | + self.assertThat(is_locked(), Equals(False)) |
1333 | + |
1334 | + def test_lock_can_be_acquired(self): |
1335 | + """Create a process and check the lock exists """ |
1336 | + lock_process = Process(target=run_lock_process, args=()) |
1337 | + lock_process.start() |
1338 | + self.assertThat(is_locked, Eventually(Equals(True))) |
1339 | + lock_process.join() |
1340 | + self.assertThat(lock_process.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1341 | + |
1342 | + def test_lock_timeouts(self): |
1343 | + """Create a process that holds the lock for 20s. Then create another |
1344 | + process that timeouts after trying to acquire the lock for 1s. |
1345 | + Make sure the timeout happens.""" |
1346 | + |
1347 | + # first create a process that holds the lock for 10 seconds |
1348 | + holder = Process(target=run_lock_process, args=(120, 20)) |
1349 | + holder.start() |
1350 | + #give the holder some time to acquire the lock |
1351 | + self.assertThat(is_locked, Eventually(Equals(True))) |
1352 | + #now create a process that will try to acquire the lock with 1s |
1353 | + #timeout |
1354 | + acquirer = Process(target=run_lock_process, args=(1, 5)) |
1355 | + acquirer.start() |
1356 | + acquirer.join() |
1357 | + self.assertThat(acquirer.exitcode, Equals(LOCK_STATUS_TIMEOUT)) |
1358 | + holder.join() |
1359 | + self.assertThat(holder.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1360 | + |
1361 | + def test_consecutive_holders(self): |
1362 | + """spawn 5 process each holding the lock for 1s and make sure |
1363 | + they all exit with LOCK_STATUS_ACQUIRED""" |
1364 | + holders = [] |
1365 | + for i in range(0, 5): |
1366 | + holders.append(Process(target=run_lock_process, args=(120, 1))) |
1367 | + holders[i].start() |
1368 | + for i in range(0, 5): |
1369 | + holders[i].join() |
1370 | + for i in range(0, 5): |
1371 | + self.assertThat(holders[i].exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1372 | + |
1373 | + def test_double_locking(self): |
1374 | + """spawn 1 holder and then 5 process trying to lock the same lock |
1375 | + with 0 timeout. They should all exit with |
1376 | + LOCK_STATUS_ALREADYLOCKED""" |
1377 | + holder = Process(target=run_lock_process, args=(120, 20)) |
1378 | + holder.start() |
1379 | + #give the holder some time to acquire the lock |
1380 | + self.assertThat(is_locked, Eventually(Equals(True))) |
1381 | + acquirers = [] |
1382 | + for i in range(0, 5): |
1383 | + acquirers.append(Process(target=run_lock_process, args=(0, 1))) |
1384 | + acquirers[i].start() |
1385 | + for i in range(0, 5): |
1386 | + acquirers[i].join() |
1387 | + for i in range(0, 5): |
1388 | + self.assertThat(acquirers[i].exitcode, |
1389 | + Equals(LOCK_STATUS_ALREADYLOCKED)) |
1390 | + holder.join() |
1391 | + self.assertThat(holder.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1392 | + |
1393 | + def test_nonrunning_holder(self): |
1394 | + """Create a lock which is not released properly and check if the |
1395 | + consequtive process can reclaim it.""" |
1396 | + for i in range(0, 5): |
1397 | + holder = Process(target=run_lock_process, args=(120, 1, True)) |
1398 | + holder.start() |
1399 | + holder.join() |
1400 | + self.assertThat(-9, Equals(holder.exitcode)) |
1401 | + holder = Process(target=run_lock_process, args=(0, 1)) |
1402 | + holder.start() |
1403 | + holder.join() |
1404 | + self.assertThat(holder.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1405 | |
1406 | === removed file 'tests/test_socketlock.py' |
1407 | --- tests/test_socketlock.py 2013-05-10 14:10:52 +0000 |
1408 | +++ tests/test_socketlock.py 1970-01-01 00:00:00 +0000 |
1409 | @@ -1,117 +0,0 @@ |
1410 | -from testtools import TestCase |
1411 | -from testtools.matchers import Equals |
1412 | -from jlp import SocketLock |
1413 | -import socket |
1414 | -from lockfile import LockTimeout, AlreadyLocked |
1415 | -from multiprocessing import Process |
1416 | -from autopilot.matchers import Eventually |
1417 | -import sys |
1418 | -import os |
1419 | -import time |
1420 | -from jlp import get_config_option |
1421 | - |
1422 | -LOCK_STATUS_ACQUIRED = 0 |
1423 | -LOCK_STATUS_TIMEOUT = 1 |
1424 | -LOCK_STATUS_ALREADYLOCKED = 2 |
1425 | - |
1426 | - |
1427 | -def run_lock_process(timeout=None, sleeptime=5, kill_process=False): |
1428 | - try: |
1429 | - with SocketLock(get_config_option('lock_name'), timeout=timeout): |
1430 | - time.sleep(sleeptime) |
1431 | - if kill_process: |
1432 | - os.kill(os.getpid(), 9) |
1433 | - except LockTimeout: |
1434 | - sys.exit(LOCK_STATUS_TIMEOUT) |
1435 | - except AlreadyLocked: |
1436 | - sys.exit(LOCK_STATUS_ALREADYLOCKED) |
1437 | - sys.exit(LOCK_STATUS_ACQUIRED) |
1438 | - |
1439 | - |
1440 | -def is_locked(): |
1441 | - try: |
1442 | - mysocket = socket.socket(socket.AF_UNIX, socket.SOCK_DGRAM) |
1443 | - mysocket.bind('\0' + get_config_option('lock_name')) |
1444 | - mysocket.close() |
1445 | - return False |
1446 | - except socket.error: |
1447 | - return True |
1448 | - |
1449 | - |
1450 | -class TestLocking(TestCase): |
1451 | - |
1452 | - def setUp(self): |
1453 | - super(TestLocking, self).setUp() |
1454 | - self.assertThat(is_locked(), Equals(False)) |
1455 | - |
1456 | - def test_lock_can_be_acquired(self): |
1457 | - """Create a process and check the lock exists """ |
1458 | - lock_process = Process(target=run_lock_process, args=()) |
1459 | - lock_process.start() |
1460 | - self.assertThat(is_locked, Eventually(Equals(True))) |
1461 | - lock_process.join() |
1462 | - self.assertThat(lock_process.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1463 | - |
1464 | - def test_lock_timeouts(self): |
1465 | - """Create a process that holds the lock for 20s. Then create another |
1466 | - process that timeouts after trying to acquire the lock for 1s. |
1467 | - Make sure the timeout happens.""" |
1468 | - |
1469 | - # first create a process that holds the lock for 10 seconds |
1470 | - holder = Process(target=run_lock_process, args=(120, 20)) |
1471 | - holder.start() |
1472 | - #give the holder some time to acquire the lock |
1473 | - self.assertThat(is_locked, Eventually(Equals(True))) |
1474 | - #now create a process that will try to acquire the lock with 1s |
1475 | - #timeout |
1476 | - acquirer = Process(target=run_lock_process, args=(1, 5)) |
1477 | - acquirer.start() |
1478 | - acquirer.join() |
1479 | - self.assertThat(acquirer.exitcode, Equals(LOCK_STATUS_TIMEOUT)) |
1480 | - holder.join() |
1481 | - self.assertThat(holder.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1482 | - |
1483 | - def test_consecutive_holders(self): |
1484 | - """spawn 5 process each holding the lock for 1s and make sure |
1485 | - they all exit with LOCK_STATUS_ACQUIRED""" |
1486 | - holders = [] |
1487 | - for i in range(0, 5): |
1488 | - holders.append(Process(target=run_lock_process, args=(120, 1))) |
1489 | - holders[i].start() |
1490 | - for i in range(0, 5): |
1491 | - holders[i].join() |
1492 | - for i in range(0, 5): |
1493 | - self.assertThat(holders[i].exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1494 | - |
1495 | - def test_double_locking(self): |
1496 | - """spawn 1 holder and then 5 process trying to lock the same lock |
1497 | - with 0 timeout. They should all exit with |
1498 | - LOCK_STATUS_ALREADYLOCKED""" |
1499 | - holder = Process(target=run_lock_process, args=(120, 20)) |
1500 | - holder.start() |
1501 | - #give the holder some time to acquire the lock |
1502 | - self.assertThat(is_locked, Eventually(Equals(True))) |
1503 | - acquirers = [] |
1504 | - for i in range(0, 5): |
1505 | - acquirers.append(Process(target=run_lock_process, args=(0, 1))) |
1506 | - acquirers[i].start() |
1507 | - for i in range(0, 5): |
1508 | - acquirers[i].join() |
1509 | - for i in range(0, 5): |
1510 | - self.assertThat(acquirers[i].exitcode, |
1511 | - Equals(LOCK_STATUS_ALREADYLOCKED)) |
1512 | - holder.join() |
1513 | - self.assertThat(holder.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
1514 | - |
1515 | - def test_nonrunning_holder(self): |
1516 | - """Create a lock which is not released properly and check if the |
1517 | - consequtive process can reclaim it.""" |
1518 | - for i in range(0, 5): |
1519 | - holder = Process(target=run_lock_process, args=(120, 1, True)) |
1520 | - holder.start() |
1521 | - holder.join() |
1522 | - self.assertThat(-9, Equals(holder.exitcode)) |
1523 | - holder = Process(target=run_lock_process, args=(0, 1)) |
1524 | - holder.start() |
1525 | - holder.join() |
1526 | - self.assertThat(holder.exitcode, Equals(LOCK_STATUS_ACQUIRED)) |
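The socket-based lock whose tests are removed above is replaced, per the commit message, by asking Jenkins directly whether a job is already running with the same parameters. A minimal sketch of that idea, written in Python 3 against the standard Jenkins JSON remote API; the helper names (`build_matches`, `job_is_running`) are hypothetical and not part of this branch:

```python
# Sketch: query Jenkins instead of holding a local socket lock.
# Assumes the standard Jenkins JSON remote API (/api/json with a `tree`
# filter); `build_matches` and `job_is_running` are illustrative names.
import json
import urllib.request


def build_matches(build, params):
    """Return True if `build` is currently running and its parameters
    include every name/value pair in `params`."""
    if not build.get('building'):
        return False
    for action in build.get('actions', []):
        if 'parameters' in action:
            found = {p['name']: p['value'] for p in action['parameters']}
            if all(found.get(k) == v for k, v in params.items()):
                return True
    return False


def job_is_running(jenkins_url, job_name, params):
    """Ask Jenkins whether any build of `job_name` is running with
    the given parameters."""
    url = ('%s/job/%s/api/json?tree='
           'builds[building,actions[parameters[name,value]]]'
           % (jenkins_url.rstrip('/'), job_name))
    with urllib.request.urlopen(url) as resp:
        data = json.load(resp)
    return any(build_matches(b, params) for b in data.get('builds', []))
```

As the review comments note, each such check is an extra HTTP round-trip to the Jenkins master, so the load it generates should be measured on a real instance before relying on it.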
FAILED: Continuous integration, rev:133
http://jenkins.qa.ubuntu.com/job/jenkins-launchpad-plugin-ci/7/
Executed test runs:
FAILURE: http://jenkins.qa.ubuntu.com/job/jenkins-launchpad-plugin-saucy-amd64-ci/8/console
Click here to trigger a rebuild:
http://s-jenkins:8080/job/jenkins-launchpad-plugin-ci/7/rebuild