Merge lp:~maddevelopers/mg5amcnlo/NLO_with_1.4 into lp:~maddevelopers/mg5amcnlo/NLO
| Status | Merged |
|---|---|
| Merged at revision | 220 |
| Proposed branch | lp:~maddevelopers/mg5amcnlo/NLO_with_1.4 |
| Merge into | lp:~maddevelopers/mg5amcnlo/NLO |

Diff against target: 9682 lines (+2863/-3324), 78 files modified
Template/Cards/run_card.dat (+1/-1) Template/HTML/sortable.js (+22/-11) Template/Source/gen_ximprove.f (+1/-2) Template/SubProcesses/cuts.f (+1/-1) Template/SubProcesses/reweight.f (+10/-6) Template/bin/generate_events (+48/-17) Template/bin/internal/Gridpack/compile (+4/-0) Template/bin/internal/Gridpack/gridrun (+5/-3) Template/bin/internal/Gridpack/run.sh (+11/-7) Template/bin/internal/collect_decay_widths.py (+0/-126) Template/bin/internal/create_matching_plots.sh (+12/-10) Template/bin/internal/put_banner (+3/-1) Template/bin/internal/run_delphes (+3/-1) Template/bin/internal/store (+0/-153) Template/bin/internal/store4grid (+0/-145) Template/bin/madevent (+7/-2) Template/makefile (+0/-1) UpdateNotes.txt (+71/-40) aloha/aloha_fct.py (+76/-0) aloha/aloha_lib.py (+6/-0) aloha/aloha_object.py (+1/-1) aloha/aloha_writers.py (+46/-37) aloha/create_aloha.py (+29/-37) bin/mg5 (+1/-0) input/mg5_configuration.txt (+4/-2) madgraph/VERSION (+2/-2) madgraph/core/base_objects.py (+4/-4) madgraph/core/helas_objects.py (+2/-2) madgraph/interface/.mg5_logging.conf (+5/-2) madgraph/interface/__init__.py (+1/-0) madgraph/interface/cmd_interface.py (+221/-126) madgraph/interface/coloring_logging.py (+52/-0) madgraph/interface/extended_cmd.py (+122/-30) madgraph/interface/launch_ext_program.py (+29/-14) madgraph/interface/madevent_interface.py (+971/-439) madgraph/iolibs/export_v4.py (+11/-9) madgraph/iolibs/files.py (+5/-2) madgraph/iolibs/helas_call_writers.py (+13/-16) madgraph/iolibs/misc.py (+16/-9) madgraph/loop/loop_helas_objects.py (+15/-8) madgraph/various/banner.py (+166/-26) madgraph/various/cluster.py (+113/-81) madgraph/various/gen_crossxhtml.py (+234/-138) madgraph/various/process_checks.py (+3/-3) madgraph/various/sum_html.py (+60/-11) models/RS/RS_UFO.log (+9/-4) models/RS/__init__.py (+3/-3) models/RS/coupling_orders.py (+5/-6) models/RS/couplings.py (+2/-2) models/RS/lorentz.py (+4/-4) models/RS/parameters.py (+23/-7) models/RS/particles.py (+54/-53) 
models/RS/vertices.py (+2/-2) models/check_param_card.py (+50/-1) models/import_ufo.py (+154/-32) models/mssm/restrict_default.dat (+0/-1) models/uutt_tch_4fermion/__init__.py (+0/-22) models/uutt_tch_4fermion/coupling_orders.py (+0/-19) models/uutt_tch_4fermion/couplings.py (+0/-122) models/uutt_tch_4fermion/function_library.py (+0/-54) models/uutt_tch_4fermion/lorentz.py (+0/-79) models/uutt_tch_4fermion/object_library.py (+0/-245) models/uutt_tch_4fermion/parameters.py (+0/-237) models/uutt_tch_4fermion/particles.py (+0/-384) models/uutt_tch_4fermion/vertices.py (+0/-354) models/uutt_tch_4fermion/write_param_card.py (+0/-65) tests/acceptance_tests/test_cmd.py (+19/-18) tests/acceptance_tests/test_cmd_madevent.py (+60/-20) tests/input_files/test_mssm_generation (+1/-0) tests/unit_tests/core/test_base_objects.py (+3/-3) tests/unit_tests/core/test_drawing.py (+33/-33) tests/unit_tests/iolibs/test_export_v4.py (+3/-3) tests/unit_tests/loop/test_loop_diagram_generation.py (+1/-1) tests/unit_tests/various/test_4fermion_models.py (+14/-12) tests/unit_tests/various/test_aloha.py (+5/-5) tests/unit_tests/various/test_check_param_card.py (+4/-0) tests/unit_tests/various/test_import_ufo.py (+3/-3) tests/unit_tests/various/test_write_param.py (+4/-4) |
To merge this branch: bzr merge lp:~maddevelopers/mg5amcnlo/NLO_with_1.4
Related bugs: none listed
| Reviewer | Review Type | Date Requested | Status |
|---|---|---|---|
| Valentin Hirschi | Approve | | |

Review via email: mp+91162@code.launchpad.net
Commit message
Description of the change
Merge with the latest version of 1.4.0.
Update the aloha code for loops.
Hi Valentin,
This is a merge with the beta version of the code. This version has some unit tests that fail, but fewer than in the current NLO branch, so in principle this doesn't add any additional trouble.
It includes a new way to deal with the tags, as you asked. This branch doesn't fix the drawing problem; we can Skype about that if you want.
Cheers,
Olivier
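The tag handling mentioned above shows up in the preview diff as a plain assignment in run_card.dat, e.g. `tag_1 = run_tag ! name of the run`. As a rough illustration only, here is a minimal sketch of extracting that value, assuming this simple `value = name ! comment` card format; this is not MadGraph's actual run_card parser:

```python
# Minimal sketch (not MadGraph's actual parser): read the value assigned
# to run_tag from a run_card.dat-style text block.
import re

def read_run_tag(card_text):
    """Return the value assigned to run_tag, or None if absent."""
    for line in card_text.splitlines():
        # Drop the trailing "! ..." comment, then match "<value> = run_tag".
        code = line.split('!', 1)[0]
        m = re.match(r'\s*(\S+)\s*=\s*run_tag\s*$', code)
        if m:
            return m.group(1)
    return None

card = " tag_1 = run_tag ! name of the run\n"
print(read_run_tag(card))  # tag_1
```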
- 215. By Olivier Mattelaer: correct some of the unit_test failures
- 216. By Olivier Mattelaer: fix a couple of unit tests
- 217. By mattelaer-olivier: try to merge
- 218. By mattelaer-olivier: acceptance tests working
- 219. By mattelaer-olivier: only 3 tests didn't pass
Olivier Mattelaer (olivier-mattelaer) wrote:
Hi Valentin, could you run the tests on your computer (just as a safety check) and then officially approve the merge?
As soon as this is done, I'll merge this version into the NLO branch.
- 220. By Olivier Mattelaer: merge with NLO branch
Valentin Hirschi (valentin-hirschi) wrote:
All unit and acceptance tests pass on my computer, so I approve the merge.
Preview Diff
1 | === modified file 'Template/Cards/run_card.dat' |
2 | --- Template/Cards/run_card.dat 2011-12-05 08:47:14 +0000 |
3 | +++ Template/Cards/run_card.dat 2012-02-16 18:25:20 +0000 |
4 | @@ -20,7 +20,7 @@ |
5 | #********************************************************************* |
6 | # Tag name for the run (one word) * |
7 | #********************************************************************* |
8 | - fermi = run_tag ! name of the run |
9 | + tag_1 = run_tag ! name of the run |
10 | #********************************************************************* |
11 | # Run to generate the grid pack * |
12 | #********************************************************************* |
13 | |
14 | === modified file 'Template/HTML/sortable.js' |
15 | --- Template/HTML/sortable.js 2011-12-06 16:10:41 +0000 |
16 | +++ Template/HTML/sortable.js 2012-02-16 18:25:20 +0000 |
17 | @@ -2,17 +2,14 @@ |
18 | Table sorting script by Joost de Valk, check it out at http://www.joostdevalk.nl/code/sortable-table/. |
19 | Based on a script from http://www.kryogenix.org/code/browser/sorttable/. |
20 | Distributed under the MIT license: http://www.kryogenix.org/code/browser/licence.html . |
21 | +Modify by the MG team for compatibility issue |
22 | |
23 | Copyright (c) 1997-2007 Stuart Langridge, Joost de Valk. |
24 | |
25 | Version 1.5.7 |
26 | */ |
27 | |
28 | -/* You can change these values */ |
29 | -var image_path = "http://www.joostdevalk.nl/code/sortable-table/"; |
30 | -var image_up = "arrowup.gif"; |
31 | -var image_down = "arrowdown.gif"; |
32 | -var image_none = "arrownone.gif"; |
33 | +/* You can change these values */ |
34 | var europeandate = true; |
35 | var alternate_row_colors = true; |
36 | |
37 | @@ -30,6 +27,18 @@ |
38 | thisTbl = tbls[ti]; |
39 | if (((' '+thisTbl.className+' ').indexOf("sortable") != -1) && (thisTbl.id)) { |
40 | ts_makeSortable(thisTbl); |
41 | + // make it sortable according to the second column |
42 | + if (thisTbl.tHead && thisTbl.tHead.rows.length > 0) { |
43 | + var firstRow = thisTbl.tHead.rows[thisTbl.tHead.rows.length-1]; |
44 | + } else { |
45 | + var firstRow = thisTbl.rows[0]; |
46 | + } |
47 | + |
48 | + for (var ci=0;ci<firstRow.cells[1].childNodes.length;ci++) { |
49 | + if (firstRow.cells[1].childNodes[ci].tagName && firstRow.cells[1].childNodes[ci].tagName.toLowerCase() == 'a') var lnk = firstRow.cells[1].childNodes[ci]; |
50 | + } |
51 | + ts_resortTable(lnk, 1); //order by cross-section |
52 | + ts_resortTable(lnk, 1); //order by cross-section |
53 | } |
54 | } |
55 | } |
56 | @@ -50,7 +59,7 @@ |
57 | var cell = firstRow.cells[i]; |
58 | var txt = ts_getInnerText(cell); |
59 | if (cell.className != "unsortable" && cell.className.indexOf("unsortable") == -1) { |
60 | - cell.innerHTML = '<a href="#" class="sortheader" onclick="ts_resortTable(this, '+i+');return false;">'+txt+'<span class="sortarrow"> <img src="'+ image_path + image_none + '" alt="↓"/></span></a>'; |
61 | + cell.innerHTML = '<a href="#" class="sortheader" onclick="ts_resortTable(this, '+i+');return false;">'+txt+'<span class="sortarrow"></span></a>'; |
62 | } |
63 | } |
64 | if (alternate_row_colors) { |
65 | @@ -106,6 +115,7 @@ |
66 | if (itm.match(/^\d\d[\/\.-]\d\d[\/\.-]\d\d\d{2}?$/)) sortfn = ts_sort_date; |
67 | if (itm.match(/^-?[£$€Û¢´]\d/)) sortfn = ts_sort_numeric; |
68 | if (itm.match(/^-?(\d+[,\.]?)+(E[-+][\d]+)?%?$/)) sortfn = ts_sort_numeric; |
69 | + if (itm.match(/^-?(\d+[,\.]?\d+e[-+][\d]+)?%?$/)) sortfn = ts_sort_numeric; |
70 | if (itm.match(/^-?(\d+[,\.]?)+([\d]+)?%?$/)) sortfn = ts_sort_numeric; |
71 | SORT_COLUMN_INDEX = column; |
72 | var firstRow = new Array(); |
73 | @@ -130,11 +140,11 @@ |
74 | } |
75 | newRows.sort(sortfn); |
76 | if (span.getAttribute("sortdir") == 'down') { |
77 | - ARROW = ' <img src="'+ image_path + image_down + '" alt="↓"/>'; |
78 | + ARROW = ' ↓'; |
79 | newRows.reverse(); |
80 | span.setAttribute('sortdir','up'); |
81 | } else { |
82 | - ARROW = ' <img src="'+ image_path + image_up + '" alt="↑"/>'; |
83 | + ARROW = ' ↑'; |
84 | span.setAttribute('sortdir','down'); |
85 | } |
86 | // We appendChild rows that already exist to the tbody, so it moves them rather than creating new ones |
87 | @@ -154,7 +164,7 @@ |
88 | for (var ci=0;ci<allspans.length;ci++) { |
89 | if (allspans[ci].className == 'sortarrow') { |
90 | if (getParent(allspans[ci],"table") == getParent(lnk,"table")) { // in the same table as us? |
91 | - allspans[ci].innerHTML = ' <img src="'+ image_path + image_none + '" alt="↓"/>'; |
92 | + allspans[ci].innerHTML = ''; |
93 | } |
94 | } |
95 | } |
96 | @@ -235,7 +245,7 @@ |
97 | } |
98 | function ts_sort_numeric(a,b) { |
99 | var aa = ts_getInnerText(a.cells[SORT_COLUMN_INDEX]); |
100 | - aa = clean_num(aa); |
101 | + aa = clean_num(aa); |
102 | var bb = ts_getInnerText(b.cells[SORT_COLUMN_INDEX]); |
103 | bb = clean_num(bb); |
104 | return compare_numeric(aa,bb); |
105 | @@ -269,6 +279,7 @@ |
106 | } |
107 | return 1; |
108 | } |
109 | + |
110 | function addEvent(elm, evType, fn, useCapture) |
111 | // addEvent and removeEvent |
112 | // cross-browser event handling for IE5+, NS6 and Mozilla |
113 | @@ -285,7 +296,7 @@ |
114 | } |
115 | } |
116 | function clean_num(str) { |
117 | - str = str.replace(new RegExp(/[^-?0-9.]/g),""); |
118 | + str = str.replace(new RegExp(/[^-?0-9.eE]/g),""); |
119 | return str; |
120 | } |
121 | function trim(s) { |
122 | |
123 | === renamed file 'Template/Events/banner_header.txt' => 'Template/Source/banner_header.txt' |
124 | === modified file 'Template/Source/gen_ximprove.f' |
125 | --- Template/Source/gen_ximprove.f 2011-11-18 17:35:49 +0000 |
126 | +++ Template/Source/gen_ximprove.f 2012-02-16 18:25:20 +0000 |
127 | @@ -196,9 +196,8 @@ |
128 | $ i,nevents,gname,nhel_refine) |
129 | else |
130 | open(unit=25,file='../results.dat',status='old',err=199) |
131 | + read(25,*) xtot |
132 | write(*,'(a,e12.3)') 'Reading total xsection ',xtot |
133 | - read(25,*) xtot |
134 | - write(*,'(e12.3)') xtot |
135 | 199 close(25) |
136 | if (gridpack) then |
137 | call write_gen_grid(err_goal,dble(ngran),i,nevents,gname, |
138 | |
139 | === modified file 'Template/SubProcesses/cuts.f' |
140 | --- Template/SubProcesses/cuts.f 2011-11-10 06:06:00 +0000 |
141 | +++ Template/SubProcesses/cuts.f 2012-02-16 18:25:20 +0000 |
142 | @@ -300,7 +300,7 @@ |
143 | endif |
144 | endif |
145 | if (mmnl.gt.0d0.or.mmnlmax.lt.1d5)then |
146 | - if(SumDot(ptemp,ptemp2,1d0).lt.mmnl.or.SumDot(ptemp, ptemp2,1d0).gt.mmnlmax) then |
147 | + if(dsqrt(SumDot(ptemp,ptemp2,1d0)).lt.mmnl.or.dsqrt(SumDot(ptemp, ptemp2,1d0)).gt.mmnlmax) then |
148 | if(debug) write (*,*) 'lepton invariant mass -> fails' |
149 | passcuts=.false. |
150 | return |
151 | |
152 | === modified file 'Template/SubProcesses/reweight.f' |
153 | --- Template/SubProcesses/reweight.f 2011-11-16 10:55:44 +0000 |
154 | +++ Template/SubProcesses/reweight.f 2012-02-16 18:25:20 +0000 |
155 | @@ -723,6 +723,9 @@ |
156 | integer ISPROC |
157 | DOUBLE PRECISION PD(0:MAXPROC) |
158 | COMMON /SubProc/ PD, ISPROC |
159 | + INTEGER SUBDIAG(MAXSPROC),IB(2) |
160 | + COMMON/TO_SUB_DIAG/SUBDIAG,IB |
161 | + data IB/1,2/ |
162 | c q2bck holds the central q2fact scales |
163 | real*8 q2bck(2) |
164 | common /to_q2bck/q2bck |
165 | @@ -825,7 +828,7 @@ |
166 | endif |
167 | c Set x values for the two sides, for IS Sudakovs |
168 | do i=1,2 |
169 | - xnow(i)=xbk(i) |
170 | + xnow(i)=xbk(ib(i)) |
171 | enddo |
172 | if(btest(mlevel,3))then |
173 | write(*,*) 'Set x values to ',xnow(1),xnow(2) |
174 | @@ -999,11 +1002,12 @@ |
175 | $ ' to: ',sqrt(pt2pdf(imocl(n))) |
176 | else if(pt2pdf(idacl(n,i)).lt.q2now |
177 | $ .and.isjet(ipdgcl(idacl(n,i),igraphs(1),iproc))) then |
178 | - pdfj1=pdg2pdf(abs(lpp(j)),ipdgcl(idacl(n,i), |
179 | - $ igraphs(1),iproc)*sign(1,ibeam(j)),xnow(j),sqrt(q2now)) |
180 | - pdfj2=pdg2pdf(abs(lpp(j)),ipdgcl(idacl(n,i), |
181 | - $ igraphs(1),iproc)*sign(1,ibeam(j)),xnow(j), |
182 | - $ sqrt(pt2pdf(idacl(n,i)))) |
183 | + pdfj1=pdg2pdf(abs(lpp(IB(j))),ipdgcl(idacl(n,i), |
184 | + $ igraphs(1),iproc)*sign(1,lpp(IB(j))), |
185 | + $ xnow(j),sqrt(q2now)) |
186 | + pdfj2=pdg2pdf(abs(lpp(IB(j))),ipdgcl(idacl(n,i), |
187 | + $ igraphs(1),iproc)*sign(1,lpp(IB(j))), |
188 | + $ xnow(j),sqrt(pt2pdf(idacl(n,i)))) |
189 | if(pdfj2.lt.1d-10)then |
190 | c Scale too low for heavy quark |
191 | rewgt=0d0 |
192 | |
193 | === modified file 'Template/bin/generate_events' |
194 | --- Template/bin/generate_events 2011-11-17 20:45:50 +0000 |
195 | +++ Template/bin/generate_events 2012-02-16 18:25:20 +0000 |
196 | @@ -27,25 +27,23 @@ |
197 | pjoin = os.path.join |
198 | sys.path.append(pjoin(root_path,'bin','internal')) |
199 | import madevent_interface as ME |
200 | -################################################################################ |
201 | -## EXECUTABLE |
202 | -################################################################################ |
203 | -if '__main__' == __name__: |
204 | - # Check that python version is valid |
205 | - if not (sys.version_info[0] == 2 or sys.version_info[1] > 5): |
206 | - sys.exit('MadEvent works with python 2.6 or higher (but not python 3.X).\n\ |
207 | + |
208 | +if not (sys.version_info[0] == 2 or sys.version_info[1] > 5): |
209 | + sys.exit('MadEvent works with python 2.6 or higher (but not python 3.X).\n\ |
210 | Please upgrade your version of python.') |
211 | - |
212 | - # MadEvent is sensitive to the initial directory. |
213 | - # So go to the main directory |
214 | - os.chdir(root_path) |
215 | - |
216 | + |
217 | + |
218 | +def set_configuration(): |
219 | + import coloring_logging |
220 | logging.config.fileConfig(os.path.join(root_path, 'bin', 'internal', 'me5_logging.conf')) |
221 | logging.root.setLevel(logging.INFO) |
222 | logging.getLogger('madevent').setLevel(logging.INFO) |
223 | - logging.getLogger('madgraph').setLevel(logging.INFO) |
224 | - |
225 | - argument = sys.argv |
226 | + logging.getLogger('madgraph').setLevel(logging.INFO) |
227 | + |
228 | + |
229 | +def treat_old_argument(argument): |
230 | + """Have the MG4 behavior for this script""" |
231 | + |
232 | try: |
233 | mode = int(argument[1]) |
234 | except: |
235 | @@ -69,7 +67,7 @@ |
236 | except: |
237 | name = raw_input('enter run name\n') |
238 | |
239 | - launch = ME.MadEventCmd(me_dir=os.getcwd()) |
240 | + launch = ME.MadEventCmd(me_dir=root_path) |
241 | |
242 | |
243 | if mode == 1: |
244 | @@ -81,7 +79,40 @@ |
245 | else: |
246 | launch.run_cmd('generate_events -f %s' % name) |
247 | |
248 | - launch.run_cmd('quit') |
249 | + launch.run_cmd('quit') |
250 | + |
251 | + |
252 | + |
253 | + |
254 | + |
255 | + |
256 | + |
257 | + |
258 | +################################################################################ |
259 | +## EXECUTABLE |
260 | +################################################################################ |
261 | +if '__main__' == __name__: |
262 | + # Check that python version is valid |
263 | + |
264 | + set_configuration() |
265 | + argument = sys.argv |
266 | + try: |
267 | + if '-h' in argument or '--help' in argument: |
268 | + launch = ME.MadEventCmd(me_dir=root_path) |
269 | + launch.exec_cmd('help generate_events') |
270 | + elif len(argument) > 1 and argument[1] in ['0', '1', '2']: |
271 | + treat_old_argument(argument) |
272 | + else: |
273 | + open('/tmp/mg5tmp.cmd','w').write('generate_events %s' % ' '.join(argument[1:])) |
274 | + os.system('%s/madevent /tmp/mg5tmp.cmd' % pjoin(root_path, 'bin')) |
275 | + except KeyboardInterrupt: |
276 | + pass |
277 | + except ME.MadEventAlreadyRunning, error: |
278 | + logging.error(str(error)) |
279 | + sys.exit() |
280 | + |
281 | + if os.path.exists(pjoin(root_path, 'RunWeb')): |
282 | + os.remove(pjoin(root_path, 'RunWeb')) |
283 | |
284 | # reconfigure path for the web |
285 | #if len(argument) == 5: |
286 | |
287 | === modified file 'Template/bin/internal/Gridpack/compile' |
288 | --- Template/bin/internal/Gridpack/compile 2011-07-26 16:15:01 +0000 |
289 | +++ Template/bin/internal/Gridpack/compile 2012-02-16 18:25:20 +0000 |
290 | @@ -45,6 +45,10 @@ |
291 | unset lhapdf |
292 | fi |
293 | |
294 | +# Clean the previous compilation |
295 | +cd Source |
296 | +make clean |
297 | +cd - |
298 | # |
299 | # Now let shell know where to find important executables |
300 | # |
301 | |
302 | === modified file 'Template/bin/internal/Gridpack/gridrun' |
303 | --- Template/bin/internal/Gridpack/gridrun 2011-12-06 16:10:41 +0000 |
304 | +++ Template/bin/internal/Gridpack/gridrun 2012-02-16 18:25:20 +0000 |
305 | @@ -48,7 +48,7 @@ |
306 | |
307 | # Check if optimize mode is (and should be) activated |
308 | if __debug__ and not options.debug and \ |
309 | - (not os.path.exists(os.path.join(root_path,'../..', 'bin','create_release.py')) or options.web): |
310 | + (not os.path.exists(os.path.join(root_path,'../..', 'bin','create_release.py'))): |
311 | subprocess.call([sys.executable] + ['-O'] + sys.argv) |
312 | sys.exit() |
313 | |
314 | @@ -84,10 +84,12 @@ |
315 | cmd_line = cmd_interface.GridPackCmd(me_dir=root_path, nb_event=args[0], seed=args[1]) |
316 | except KeyboardInterrupt: |
317 | print 'Quit on KeyboardInterrupt' |
318 | - |
319 | + |
320 | +print 'DONE' |
321 | + |
322 | try: |
323 | # Remove lock file |
324 | - os.remove(os.path.join(root_path,os.pardir, 'RunWeb')) |
325 | + os.remove(os.path.join(root_path, 'RunWeb')) |
326 | if cmd_line.history[-1] not in ['EOF','quit','exit']: |
327 | cmd_line.results.store_result() |
328 | except: |
329 | |
330 | === modified file 'Template/bin/internal/Gridpack/run.sh' |
331 | --- Template/bin/internal/Gridpack/run.sh 2011-10-31 16:33:58 +0000 |
332 | +++ Template/bin/internal/Gridpack/run.sh 2012-02-16 18:25:20 +0000 |
333 | @@ -57,20 +57,24 @@ |
334 | echo "Now generating $num_events events with random seed $seed and granularity $gran" |
335 | |
336 | |
337 | -if [[ ! -x ./madevent/bin/internal/gridrun ]]; then |
338 | +if [[ ! -x ./madevent/bin/gridrun ]]; then |
339 | echo "Error: gridrun script not found !" |
340 | exit |
341 | else |
342 | cd ./madevent |
343 | - ./bin/internal/gridrun.py $num_events $seed |
344 | -fi |
345 | - |
346 | -if [[ ! -e ./Events/unweighted_events.lhe ]]; then |
347 | + ./bin/gridrun $num_events $seed |
348 | +fi |
349 | + |
350 | +if [[ -e ./Events/GridRun_${seed}/unweighted_events.lhe.gz ]]; then |
351 | + gunzip ./Events/GridRun_${seed}/unweighted_events.lhe.gz |
352 | +fi |
353 | + |
354 | +if [[ ! -e ./Events/GridRun_${seed}/unweighted_events.lhe ]]; then |
355 | echo "Error: event file not found !" |
356 | exit |
357 | else |
358 | - echo "Moving events from madevent/Events/unweighted_events.lhe to events.lhe" |
359 | - mv ./Events/unweighted_events.lhe ../events.lhe |
360 | + echo "Moving events from events.lhe" |
361 | + mv ./Events/GridRun_${seed}/unweighted_events.lhe ../events.lhe |
362 | cd .. |
363 | fi |
364 | |
365 | |
366 | === removed file 'Template/bin/internal/collect_decay_widths.py' |
367 | --- Template/bin/internal/collect_decay_widths.py 2011-11-22 08:36:23 +0000 |
368 | +++ Template/bin/internal/collect_decay_widths.py 1970-01-01 00:00:00 +0000 |
369 | @@ -1,126 +0,0 @@ |
370 | -#!/usr/bin/python |
371 | -# |
372 | -# Collect the decay widths and calculate BRs for all particles, and put in param_card form |
373 | - |
374 | -import glob |
375 | -import optparse |
376 | -import os |
377 | -import re |
378 | - |
379 | -root_path = os.path.abspath(os.path.join(os.path.dirname(__file__), |
380 | - os.path.pardir, os.path.pardir)) |
381 | - |
382 | -# Main routine |
383 | -if __name__ == "__main__": |
384 | - |
385 | - # Find available runs |
386 | - run_names = glob.glob(os.path.join(root_path, "Events", "*_banner.txt")) |
387 | - run_names = [os.path.basename(name) for name in run_names] |
388 | - run_names = [re.match("(.*)_banner.txt", name).group(1) for name in run_names] |
389 | - usage = "usage: %prog run_name" |
390 | - usage +="\n where run_name in (%s)" % ",".join(run_names) |
391 | - parser = optparse.OptionParser(usage=usage) |
392 | - (options, args) = parser.parse_args() |
393 | - if len(args) != 1 or args[0] not in run_names: |
394 | - if len(run_names) == 1: |
395 | - args = run_names |
396 | - else: |
397 | - parser.error("Please give a run name") |
398 | - exit |
399 | - run_name = args[0].strip() |
400 | - |
401 | - print "Collecting results for run ", run_name |
402 | - |
403 | - # Start collecting results |
404 | - subproc_dirs = \ |
405 | - open(os.path.join(root_path, "SubProcesses", "subproc.mg")).read().split('\n') |
406 | - |
407 | - particle_dict = {} |
408 | - for subdir in subproc_dirs[:-1]: |
409 | - subdir = os.path.join(root_path, "SubProcesses", subdir) |
410 | - leshouche = open(os.path.join(subdir, 'leshouche.inc')).read().split('\n')[0] |
411 | - particles = re.search("/([\d,-]+)/", leshouche) |
412 | - if not particles: |
413 | - continue |
414 | - particles = [int(p) for p in particles.group(1).split(',')] |
415 | - results = \ |
416 | - open(os.path.join(subdir, run_name + '_results.dat')).read().split('\n')[0] |
417 | - result = float(results.strip().split(' ')[0]) |
418 | - try: |
419 | - particle_dict[particles[0]].append([particles[1:], result]) |
420 | - except KeyError: |
421 | - particle_dict[particles[0]] = [[particles[1:], result]] |
422 | - |
423 | - |
424 | - # Open the param_card.dat and insert the calculated decays and BRs |
425 | - param_card_file = open(os.path.join(root_path, 'Cards', 'param_card.dat')) |
426 | - param_card = param_card_file.read().split('\n') |
427 | - param_card_file.close() |
428 | - |
429 | - decay_lines = [] |
430 | - line_number = 0 |
431 | - # Read and remove all decays from the param_card |
432 | - while line_number < len(param_card): |
433 | - line = param_card[line_number] |
434 | - if line.lower().startswith('decay'): |
435 | - # Read decay if particle in particle_dict |
436 | - # DECAY 6 1.455100e+00 |
437 | - line = param_card.pop(line_number) |
438 | - line = line.split() |
439 | - particle = 0 |
440 | - if int(line[1]) not in particle_dict: |
441 | - try: # If formatting is wrong, don't want this particle |
442 | - particle = int(line[1]) |
443 | - width = float(line[2]) |
444 | - except: |
445 | - particle = 0 |
446 | - # Read BRs for this decay |
447 | - line = param_card[line_number] |
448 | - while line.startswith('#') or line.startswith(' '): |
449 | - line = param_card.pop(line_number) |
450 | - if not particle or line.startswith('#'): |
451 | - line=param_card[line_number] |
452 | - continue |
453 | - # 6.668201e-01 3 5 2 -1 |
454 | - line = line.split() |
455 | - try: # Remove BR if formatting is wrong |
456 | - partial_width = float(line[0])*width |
457 | - decay_products = [int(p) for p in line[2:2+int(line[1])]] |
458 | - except: |
459 | - line=param_card[line_number] |
460 | - continue |
461 | - try: |
462 | - particle_dict[particle].append([decay_products, partial_width]) |
463 | - except KeyError: |
464 | - particle_dict[particle] = [[decay_products, partial_width]] |
465 | - line=param_card[line_number] |
466 | - if particle and particle not in particle_dict: |
467 | - # No decays given, only total width |
468 | - particle_dict[particle] = [[[], width]] |
469 | - else: # Not decay |
470 | - line_number += 1 |
471 | - # Clean out possible remaining comments at the end of the card |
472 | - while not param_card[-1] or param_card[-1].startswith('#'): |
473 | - param_card.pop(-1) |
474 | - |
475 | - # Append calculated and read decays to the param_card |
476 | - param_card.append("#\n#*************************") |
477 | - param_card.append("# Decay widths *") |
478 | - param_card.append("#*************************") |
479 | - for key in sorted(particle_dict.keys()): |
480 | - width = sum([r for p,r in particle_dict[key]]) |
481 | - param_card.append("#\n# PDG Width") |
482 | - param_card.append("DECAY %i %e" % (key, width)) |
483 | - if not width: |
484 | - continue |
485 | - if particle_dict[key][0][0]: |
486 | - param_card.append("# BR NDA ID1 ID2 ...") |
487 | - brs = [[val[1]/width, val[0]] for val in particle_dict[key] if val[1]] |
488 | - for val in sorted(brs, reverse=True): |
489 | - param_card.append(" %e %i %s" % (val[0], len(val[1]), |
490 | - " ".join([str(v) for v in val[1]]))) |
491 | - output_name = os.path.join(root_path, run_name + "_param_card.dat") |
492 | - decay_table = open(output_name, 'w') |
493 | - decay_table.write("\n".join(param_card) + "\n") |
494 | - print "Results written to ", output_name |
495 | - |
496 | |
497 | === modified file 'Template/bin/internal/create_matching_plots.sh' |
498 | --- Template/bin/internal/create_matching_plots.sh 2011-11-18 23:12:02 +0000 |
499 | +++ Template/bin/internal/create_matching_plots.sh 2012-02-16 18:25:20 +0000 |
500 | @@ -12,25 +12,27 @@ |
501 | export PATH=$ROOTSYS/bin:$PATH |
502 | fi |
503 | donerun=0 |
504 | -if [[ -e $1_pythia_events.tree.gz ]];then |
505 | +if [[ -e $1/$2_pythia_events.tree.gz ]];then |
506 | donerun=1 |
507 | - echo gunzip $1_pythia_events.tree.gz |
508 | - gunzip -c $1_pythia_events.tree.gz > events.tree |
509 | - cp $1_pythia_xsecs.tree xsecs.tree |
510 | + echo gunzip $1/$2_pythia_events.tree.gz |
511 | + gunzip -c $1/$2_pythia_events.tree.gz > events.tree |
512 | + cp $1/$2_pythia_xsecs.tree xsecs.tree |
513 | fi |
514 | + |
515 | if [[ ! -e events.tree || ! -e xsecs.tree ]];then |
516 | echo "No events.tree or xsecs.tree files found" |
517 | exit |
518 | fi |
519 | echo Running root |
520 | -root -q -b ../bin/internal/read_tree_files.C |
521 | +root -q -b ../bin/internal/read_tree_files.C &> /dev/null |
522 | echo Creating plots |
523 | -root -q -b ../bin/internal/create_matching_plots.C |
524 | -mv pythia.root $1_pythia.root |
525 | -if [[ ! -d $1_pythia ]];then |
526 | - mkdir $1_pythia |
527 | +root -q -b ../bin/internal/create_matching_plots.C &> /dev/null |
528 | +mv pythia.root $1/$2_pythia.root |
529 | + |
530 | +if [[ ! -d ../HTML/$1/plots_pythia_$2 ]];then |
531 | + mkdir ../HTML/$1/plots_pythia_$2 |
532 | fi |
533 | -for i in DJR*.eps; do mv $i $1_pythia/${i%.*}.ps;done |
534 | +for i in DJR*.eps; do mv $i ../HTML/$1/plots_pythia_$2/${i%.*}.ps;done |
535 | if [[ donerun -eq 1 ]];then |
536 | rm events.tree xsecs.tree |
537 | fi |
538 | |
539 | === modified file 'Template/bin/internal/put_banner' |
540 | --- Template/bin/internal/put_banner 2011-06-29 14:15:21 +0000 |
541 | +++ Template/bin/internal/put_banner 2012-02-16 18:25:20 +0000 |
542 | @@ -19,12 +19,14 @@ |
543 | else |
544 | c=$1 |
545 | fi |
546 | + |
547 | + |
548 | if [[ -d Events ]]; then |
549 | cd Events |
550 | # Put all the info in a long banner |
551 | # First the header |
552 | # $B$ begin_banner $B$ !this is a tag for MadWeight. Don't edit this line |
553 | - cat banner_header.txt >& b.txt |
554 | + cat ../Source/banner_header.txt >& b.txt |
555 | # $E$ begin_banner $E$ !this is a tag for MadWeight. Don't edit this line |
556 | # Now the version information |
557 | # grab the info first |
558 | |
559 | === modified file 'Template/bin/internal/run_delphes' |
560 | --- Template/bin/internal/run_delphes 2011-12-06 16:10:41 +0000 |
561 | +++ Template/bin/internal/run_delphes 2012-02-16 18:25:20 +0000 |
562 | @@ -7,7 +7,8 @@ |
563 | |
564 | delphesdir=$1 |
565 | run=$2 |
566 | -tag=$3 |
567 | +tag=$3 |
568 | +cross=$4 |
569 | |
570 | main=`pwd` |
571 | |
572 | @@ -38,6 +39,7 @@ |
573 | if [ -e delphes_events.lhco ]; then |
574 | # write the delphes banner |
575 | sed -e "s/^/#/g" ${run}/${run}_${tag}_banner.txt > ${run}/${tag}_delphes_events.lhco |
576 | + echo "## Integrated weight (pb) : ${cross}" >> ${run}/${tag}_delphes_events.lhco |
577 | cat delphes_events.lhco >> ${run}/${tag}_delphes_events.lhco |
578 | rm -f delphes_events.lhco |
579 | fi |
580 | |
581 | === removed file 'Template/bin/internal/store' |
582 | --- Template/bin/internal/store 2011-12-05 08:47:14 +0000 |
583 | +++ Template/bin/internal/store 1970-01-01 00:00:00 +0000 |
584 | @@ -1,153 +0,0 @@ |
585 | -#!/bin/bash |
586 | -# |
587 | -# Need to launch as store RUN_NAME TAG_NAME |
588 | -# |
589 | -# |
590 | -if [[ ! -d ./bin ]]; then |
591 | - cd ../ |
592 | - if [[ ! -d ./bin ]]; then |
593 | - echo "Error: store must be executed from the main, or bin directory" |
594 | - exit |
595 | - fi |
596 | -fi |
597 | -if [[ ! -d SubProcesses ]]; then |
598 | - echo "Error: SubProcesses directory not found" |
599 | - exit |
600 | -fi |
601 | -cd SubProcesses |
602 | -if [[ "$1" == "" ]]; then |
603 | - echo 'Enter you must specify a name to store files under. (eg store TeV)' |
604 | - exit |
605 | -fi |
606 | -#$i=1; |
607 | -#$max_store = 99; |
608 | -#while(-e "SubProcesses/".$t.$i."_results.html" && $i < $max_store){ |
609 | -# $i++; |
610 | -#} |
611 | - |
612 | -if [[ -e $1_results.html ]]; then |
613 | - rm -f $1_results.html |
614 | -fi |
615 | -sed s/results.html/$1_results.html/g results.html | sed s/log.txt/$1_log.txt/g | sed s/results.dat/$1_results.dat/g | sed s/plots.html/$1_plots.html/g | sed s/pyplots.html/$1_pyplots.html/g > $1_results.html |
616 | -mv -f results.dat $1_results.dat >& /dev/null |
617 | -for i in `cat subproc.mg` ; do |
618 | - cd $i |
619 | - echo $i |
620 | - if [[ -e $1_results.html ]]; then |
621 | - rm -f $1_results.html |
622 | - fi |
623 | - mv -f results.dat $1_results.dat >& /dev/null |
624 | - sed s/results.html/$1_results.html/g results.html | sed s/log.txt/$1_log.txt/g | sed s/results.dat/$1_results.dat/g > $1_results.html |
625 | - for k in G* ; do |
626 | - if [[ ! -d $k ]]; then |
627 | - continue |
628 | - fi |
629 | - cd $k |
630 | - if [[ -e events.lhe ]]; then |
631 | - rm -f events.lhe |
632 | -# cp -f events.lhe $1_events.lhe |
633 | -# rm -f $1_events.lhe.gz >& /dev/null |
634 | -# gzip $1_events.lhe |
635 | - fi |
636 | - for j in results.dat log.txt ; do |
637 | - if [[ -e $j ]]; then |
638 | - mv -f $j $1_$j |
639 | - fi |
640 | - done |
641 | - for j in ftn25 ftn99 ; do |
642 | - if [[ -e $j ]]; then |
643 | - mv -f $j $1_$j |
644 | - rm -f $1_$j.gz >&/dev/null |
645 | - gzip $1_$j |
646 | - fi |
647 | - done |
648 | - cd ../ |
649 | - done |
650 | - cd ../ |
651 | -done |
652 | -cd .. |
653 | -./bin/internal/gen_cardhtml-pl |
654 | -if [[ ! -e Events/$1 ]]; |
655 | -if [[ -e Events/events.lhe ]]; then |
656 | - mv -f Events/events.lhe Events/$1_events.lhe |
657 | - rm -f Events/$1_events.lhe.gz >& /dev/null |
658 | - gzip -f Events/$1_events.lhe |
659 | -fi |
660 | -if [[ -e Events/banner.txt ]]; then |
661 | - mv -f Events/banner.txt Events/$1_banner.txt |
662 | -fi |
663 | -if [[ -e Events/unweighted_events.lhe ]]; then |
664 | - mv -f Events/unweighted_events.lhe Events/$1_unweighted_events.lhe |
665 | - rm -f Events/$1_unweighted_events.lhe.gz >& /dev/null |
666 | - gzip -f Events/$1_unweighted_events.lhe |
667 | -fi |
668 | -cd Events |
669 | -if [[ -e plots.html ]]; then |
670 | - mv -f plots.html $1_plots.html |
671 | -fi |
672 | -if [[ -e pyplots.html ]]; then |
673 | - mv -f pyplots.html $1_pyplots.html |
674 | -fi |
675 | -if [[ -e ./$1/events.list ]]; then |
676 | - sed s/unweighted_events.root/$1_unweighted_events.root/g ./$1/events.list > ./$1/events.list.temp |
677 | - mv -f ./$1/events.list.temp ./$1/events.list |
678 | -fi |
679 | -if [[ -e unweighted_events.root ]]; then |
680 | - mv -f unweighted_events.root $1_unweighted_events.root |
681 | -fi |
682 | -if [[ -e pythia_events.hep ]]; then |
683 | - mv -f pythia_events.hep $1_pythia_events.hep |
684 | - gzip -f $1_pythia_events.hep |
685 | - mv -f pythia.log $1_pythia.log |
686 | -fi |
687 | - |
688 | -if [[ -e pythia_events.lhe ]]; then |
689 | -# write the pythia banner |
690 | - echo \<LesHouchesEvents version=\"1.0\"\> > temp.lhe |
691 | - echo \<\!-- >> temp.lhe |
692 | - echo "# Warning\! Never use this file for detector studies\!" >> temp.lhe |
693 | - echo --\> >> temp.lhe |
694 | - cat $1_banner.txt >> temp.lhe |
695 | - sed /'<LesHouchesEvents version=\"1.0\">'/s//'-->'/ pythia_events.lhe >> temp.lhe |
696 | - rm -f pythia_events.lhe |
697 | - mv -f temp.lhe $1_pythia_events.lhe |
698 | - gzip -f $1_pythia_events.lhe |
699 | -fi |
700 | - |
701 | -if [[ -e pythia_lhe_events.root ]]; then |
702 | - mv -f pythia_lhe_events.root $1_pythia_lhe_events.root |
703 | -fi |
704 | - |
705 | -if [[ -e events.tree ]]; then |
706 | - mv -f events.tree $1_pythia_events.tree |
707 | - gzip -f $1_pythia_events.tree |
708 | - mv -f xsecs.tree $1_pythia_xsecs.tree |
709 | - mv -f beforeveto.tree $1_pythia_beforeveto.tree |
710 | - gzip -f $1_pythia_beforeveto.tree |
711 | -fi |
712 | -if [[ -e pgs_events.lhco ]]; then |
713 | - mv -f pgs_events.lhco $1_pgs_events.lhco |
714 | - gzip -f $1_pgs_events.lhco |
715 | - if [ -e olympics.log ];then |
716 | - mv -f olympics.log $1_pgs.log |
717 | - else |
718 | - mv -f pgs.log $1_pgs.log |
719 | - fi |
720 | - mv -f pgs_uncleaned_events.lhco $1_pgs_uncleaned_events.lhco |
721 | - gzip -f $1_pgs_uncleaned_events.lhco |
722 | -fi |
723 | -if [[ -e pgs_events.root ]]; then |
724 | - mv -f pgs_events.root $1_pgs_events.root |
725 | -fi |
726 | -if [[ -e pythia_events.root ]]; then |
727 | - mv -f pythia_events.root $1_pythia_events.root |
728 | -fi |
729 | -if [ -e delphes_events.lhco ]; then |
730 | - mv -f delphes_events.lhco $1_delphes_events.lhco |
731 | - gzip -f $1_delphes_events.lhco |
732 | - mv -f delphes_run.log $1_delphes.log |
733 | -fi |
734 | -if [ -e delphes.root ]; then |
735 | - mv -f delphes.root $1_delphes_events.root |
736 | -fi |
737 | -cd .. |
738 | |
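The removed `store` script above renamed every run artifact with a user-chosen run prefix (`$1`) and gzipped the large event files; this bookkeeping now lives in the Python madevent interface. A minimal sketch of the renaming scheme — the function name and the file list are illustrative, not the actual interface API:

```python
import gzip
import os
import shutil

def store_run(run_name, event_dir='.'):
    """Sketch of the per-run renaming scheme the removed `store` script
    implemented: prefix run artifacts with the run name and gzip the
    large event files (file list shortened for illustration)."""
    for fname in ('results.dat', 'log.txt', 'plots.html'):
        path = os.path.join(event_dir, fname)
        if os.path.exists(path):
            os.rename(path, os.path.join(event_dir, '%s_%s' % (run_name, fname)))
    # Event files are renamed and then compressed, as the script did
    lhe = os.path.join(event_dir, 'events.lhe')
    if os.path.exists(lhe):
        target = os.path.join(event_dir, '%s_events.lhe.gz' % run_name)
        with open(lhe, 'rb') as src, gzip.open(target, 'wb') as dst:
            shutil.copyfileobj(src, dst)
        os.remove(lhe)
```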
739 | === removed file 'Template/bin/internal/store4grid' |
740 | --- Template/bin/internal/store4grid 2011-10-26 01:11:44 +0000 |
741 | +++ Template/bin/internal/store4grid 1970-01-01 00:00:00 +0000 |
742 | @@ -1,145 +0,0 @@ |
743 | -#!/bin/bash |
744 | -# |
745 | -# First we need to get into the main directory |
746 | -# |
747 | -if [[ ! -d ./bin ]]; then |
748 | - cd ../ |
749 | - if [[ ! -d ./bin ]]; then |
750 | - echo "Error: store must be executed from the main, or bin directory" |
751 | - exit |
752 | - fi |
753 | -fi |
754 | -if [[ ! -d SubProcesses ]]; then |
755 | - echo "Error: SubProcesses directory not found" |
756 | - exit |
757 | -fi |
758 | -cd SubProcesses |
759 | -if [[ "$1" == "" ]]; then |
760 | - echo 'Enter you must specify a name to store files under. (eg store TeV)' |
761 | - exit |
762 | -fi |
763 | -#$i=1; |
764 | -#$max_store = 99; |
765 | -#while(-e "SubProcesses/".$t.$i."_results.html" && $i < $max_store){ |
766 | -# $i++; |
767 | -#} |
768 | - |
769 | -if [[ -e $1_results.html ]]; then |
770 | - rm -f $1_results.html |
771 | -fi |
772 | -sed s/results.html/$1_results.html/g results.html | sed s/log.txt/$1_log.txt/g | sed s/results.dat/$1_results.dat/g | sed s/plots.html/$1_plots.html/g | sed s/pyplots.html/$1_pyplots.html/g > $1_results.html |
773 | -mv -f results.dat $1_results.dat >& /dev/null |
774 | -for i in `cat subproc.mg` ; do |
775 | - cd $i |
776 | - echo $i |
777 | - if [[ -e $1_results.html ]]; then |
778 | - rm -f $1_results.html |
779 | - fi |
780 | - mv -f results.dat $1_results.dat >& /dev/null |
781 | - sed s/results.html/$1_results.html/g results.html | sed s/log.txt/$1_log.txt/g | sed s/results.dat/$1_results.dat/g > $1_results.html |
782 | - for k in G* ; do |
783 | - if [[ ! -d $k ]]; then |
784 | - continue |
785 | - fi |
786 | - cd $k |
787 | - if [[ -e events.lhe ]]; then |
788 | - rm -f events.lhe |
789 | -# cp -f events.lhe $1_events.lhe |
790 | -# rm -f $1_events.lhe.gz >& /dev/null |
791 | -# gzip $1_events.lhe |
792 | - fi |
793 | - for j in results.dat log.txt ; do |
794 | - if [[ -e $j ]]; then |
795 | - mv -f $j $1_$j |
796 | - fi |
797 | - done |
798 | - for j in ftn26 ; do |
799 | - if [[ -e $j ]]; then |
800 | - mv -f $j $1_$j |
801 | - rm -f $1_$j.gz >&/dev/null |
802 | - gzip $1_$j |
803 | - fi |
804 | - done |
805 | - cd ../ |
806 | - done |
807 | - cd ../ |
808 | -done |
809 | -cd .. |
810 | -./bin/internal/gen_cardhtml-pl |
811 | -if [[ -e Events/events.lhe ]]; then |
812 | - mv -f Events/events.lhe Events/$1_events.lhe |
813 | - rm -f Events/$1_events.lhe.gz >& /dev/null |
814 | - gzip -f Events/$1_events.lhe |
815 | -fi |
816 | -if [[ -e Events/unweighted_events.lhe ]]; then |
817 | - ./bin/internal/extract_banner-pl Events/unweighted_events.lhe Events/$1_banner.txt |
818 | - mv -f Events/unweighted_events.lhe Events/$1_unweighted_events.lhe |
819 | - rm -f Events/$1_unweighted_events.lhe.gz >& /dev/null |
820 | - gzip -f Events/$1_unweighted_events.lhe |
821 | -fi |
822 | -cd Events |
823 | -if [[ -e plots.html ]]; then |
824 | - mv -f plots.html $1_plots.html |
825 | -fi |
826 | -if [[ -e pyplots.html ]]; then |
827 | - mv -f pyplots.html $1_pyplots.html |
828 | -fi |
829 | -if [[ -e ./$1/events.list ]]; then |
830 | - sed s/unweighted_events.root/$1_unweighted_events.root/g ./$1/events.list > ./$1/events.list.temp |
831 | - mv -f ./$1/events.list.temp ./$1/events.list |
832 | -fi |
833 | -if [[ -e unweighted_events.root ]]; then |
834 | - mv -f unweighted_events.root $1_unweighted_events.root |
835 | -fi |
836 | -if [[ -e pythia_events.hep ]]; then |
837 | - mv -f pythia_events.hep $1_pythia_events.hep |
838 | - gzip -f $1_pythia_events.hep |
839 | - mv -f pythia.log $1_pythia.log |
840 | -# pythia_card.dat |
841 | - echo "<MGPythiaCard>" >>$1_banner.txt |
842 | - if [[ -e ../Cards/pythia_card.dat ]]; then cat ../Cards/pythia_card.dat >> $1_banner.txt; fi |
843 | - echo "</MGPythiaCard>" >>$1_banner.txt |
844 | -fi |
845 | - |
846 | -if [[ -e pythia_events.lhe ]]; then |
847 | -# write the pythia banner |
848 | - echo \<LesHouchesEvents version=\"1.0\"\> > temp.lhe |
849 | - echo \<\!-- >> temp.lhe |
850 | - cat $1_banner.txt >> temp.lhe |
851 | - sed /'<LesHouchesEvents version=\"1.0\">'/s//'-->'/ pythia_events.lhe >> temp.lhe |
852 | - rm -f pythia_events.lhe |
853 | - mv -f temp.lhe $1_pythia_events.lhe |
854 | - gzip -f $1_pythia_events.lhe |
855 | -fi |
856 | - |
857 | -if [[ -e pythia_lhe_events.root ]]; then |
858 | - mv -f pythia_lhe_events.root $1_pythia_lhe_events.root |
859 | -fi |
860 | - |
861 | -if [[ -e events.tree ]]; then |
862 | - mv -f events.tree $1_pythia_events.tree |
863 | - gzip -f $1_pythia_events.tree |
864 | - mv -f xsecs.tree $1_pythia_xsecs.tree |
865 | - mv -f beforeveto.tree $1_pythia_beforeveto.tree |
866 | - gzip -f $1_pythia_beforeveto.tree |
867 | -fi |
868 | -if [[ -e pgs_events.lhco ]]; then |
869 | -# pgs_card.dat |
870 | - echo "<MGPGSCard>" >>$1_banner.txt |
871 | - if [[ -e ../Cards/pgs_card.dat ]]; then cat ../Cards/pgs_card.dat >> $1_banner.txt; fi |
872 | - echo "</MGPGSCard>" >>$1_banner.txt |
873 | -# write the pgs banner |
874 | - sed -e "s/^/#/g" $1_banner.txt > temp.dat |
875 | - cat pgs_events.lhco >> temp.dat |
876 | - rm -f pgs_events.lhco |
877 | - mv -f temp.dat $1_pgs_events.lhco |
878 | - gzip -f $1_pgs_events.lhco |
879 | - mv -f pgs.log $1_pgs.log |
880 | -fi |
881 | -if [[ -e pgs_events.root ]]; then |
882 | - mv -f pgs_events.root $1_pgs_events.root |
883 | -fi |
884 | -if [[ -e pythia_events.root ]]; then |
885 | - mv -f pythia_events.root $1_pythia_events.root |
886 | -fi |
887 | -cd .. |
888 | |
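Both removed scripts embed the run banner into the Pythia LHE file with a sed trick: emit a fresh `<LesHouchesEvents>` opening tag, open an XML comment, append the banner, then replace the original opening tag inside the event file with `-->` so the banner becomes a comment. The same transformation can be sketched string-based in Python (for illustration only):

```python
def wrap_lhe_with_banner(banner_text, lhe_text):
    """Reproduce the sed trick of the removed store scripts: prepend a
    new opening tag plus '<!--', insert the banner, and turn the file's
    original opening tag into '-->' to close the comment."""
    header = '<LesHouchesEvents version="1.0">\n<!--\n'
    # Replace only the first occurrence, as the sed substitution did
    body = lhe_text.replace('<LesHouchesEvents version="1.0">', '-->', 1)
    return header + banner_text + body
```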
889 | === modified file 'Template/bin/madevent' |
890 | --- Template/bin/madevent 2011-10-14 22:45:25 +0000 |
891 | +++ Template/bin/madevent 2012-02-16 18:25:20 +0000 |
892 | @@ -53,6 +53,7 @@ |
893 | # Check if optimize mode is (and should be) activated |
894 | if __debug__ and not options.debug and \ |
895 | (not os.path.exists(os.path.join(root_path,'../..', 'bin','create_release.py')) or options.web): |
896 | + print 'launch in debug mode' |
897 | subprocess.call([sys.executable] + ['-O'] + sys.argv) |
898 | sys.exit() |
899 | |
900 | @@ -101,6 +102,7 @@ |
901 | |
902 | # Set logging level according to the logging level given by options |
903 | #logging.basicConfig(level=vars(logging)[options.logging]) |
904 | +import internal.coloring_logging |
905 | try: |
906 | if __debug__ and options.logging == 'INFO': |
907 | options.logging = 'DEBUG' |
908 | @@ -142,6 +144,9 @@ |
909 | except KeyboardInterrupt: |
910 | print 'writting history and directory and quit on KeyboardInterrupt' |
911 | pass |
912 | +except cmd_interface.MadEventAlreadyRunning, error: |
913 | + logging.error(str(error)) |
914 | + logging.error(str(error)) |
915 | + sys.exit() |
915 | try: |
916 | readline.set_history_length(100) |
917 | if not os.path.exists(os.path.join(os.environ['HOME'], '.mg5')): |
918 | @@ -151,9 +156,9 @@ |
919 | pass |
920 | |
921 | try: |
922 | + if cmd_line.history[-1] not in ['EOF','quit','exit']: |
923 | + cmd_line.results.store_result() |
924 | # Remove lock file |
925 | os.remove(os.path.join(root_path,os.pardir, 'RunWeb')) |
926 | - if cmd_line.history[-1] not in ['EOF','quit','exit']: |
927 | - cmd_line.results.store_result() |
928 | except: |
929 | pass |
930 | \ No newline at end of file |
931 | |
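The `madevent` change above adds handling for `MadEventAlreadyRunning` and makes sure results are stored before the `RunWeb` lock file is removed. A minimal, hypothetical sketch of such a lock-file guard (names are illustrative, not the actual interface code):

```python
import os

def acquire_run_lock(lock_path):
    """Hypothetical sketch of a RunWeb-style lock: refuse to start a
    second interface in the same process directory."""
    if os.path.exists(lock_path):
        raise RuntimeError('A MadEvent session is already running here')
    open(lock_path, 'w').close()

def release_run_lock(lock_path):
    """Remove the lock; ignore the error if it is already gone."""
    try:
        os.remove(lock_path)
    except OSError:
        pass
```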
932 | === modified file 'Template/makefile' |
933 | --- Template/makefile 2011-09-13 02:02:28 +0000 |
934 | +++ Template/makefile 2012-02-16 18:25:20 +0000 |
935 | @@ -15,7 +15,6 @@ |
936 | gridpack.tar: |
937 | mv Events Events1 |
938 | mkdir Events |
939 | - cp Events1/banner_header.txt Events/ |
940 | mkdir madevent |
941 | cp bin/internal/Gridpack/run.sh ./ |
942 | cp bin/internal/Gridpack/* bin/ |
943 | |
944 | === modified file 'UpdateNotes.txt' |
945 | --- UpdateNotes.txt 2011-12-07 00:15:33 +0000 |
946 | +++ UpdateNotes.txt 2012-02-16 18:25:20 +0000 |
947 | @@ -1,23 +1,51 @@ |
948 | Update notes for MadGraph 5 (in reverse time order) |
949 | |
950 | -1.4.b6 (1/12/11): OM: New structure for madevent script (./bin/madevent) |
951 | - allowing to have an interface similar to MG5 |
952 | - for running madevent. |
953 | - (accessible from MG5 via launch [DIR] -i) |
954 | - This script replaces most of the earlier scripts in |
955 | - ./bin/ (such as survey/refine/combine/run_XXXX/ |
956 | - multi_run/plot) |
957 | - The script generate_events still exists but now calls |
958 | - ./bin/madevent. |
959 | - OM: MadEvent checks that the param_card is coherent with the |
960 | - restriction used during the model generation. |
961 | +1.4.b10 (19/01/12): OM: New interface to control the madevent run. |
962 | + It is accessible via |
963 | + 1) (from madevent output) ./bin/madevent |
964 | + 2) (from MG5 command line) launch [MADEVENT_PATH] -i |
965 | + This interface replaces various scripts such as refine, |
966 | + survey, combine, run_..., rm_run, ... |
967 | + The script generate_events still exists but now calls |
968 | + ./bin/madevent. |
969 | OM: For the MSSM model, convert param_card to SLHA1. This card is |
970 | converted to SLHA2 during the MadEvent run since the UFO |
971 | model uses SLHA2. This allows the use of Pythia 6, |
972 | as well as a coherent definition for the flavor. |
973 | + JA+OM: For decay width computations, the launch command, in |
974 | + addition to computing the widths, creates a new param_card |
975 | + with the widths set to the associated values, and with the |
976 | + associated branching ratios (useful for Pythia). |
977 | + NOTE: This param_card makes sense for future runs ONLY if all |
978 | + relevant decays are generated. |
979 | + EXAMPLE: (after launch bin/mg5): |
980 | + import model sm-full |
981 | + generate t > b w+ |
982 | + define all = p b b~ l+ l- ta+ ta- vl vl~ |
983 | + add process w+ > all all |
984 | + add process z > all all |
985 | + define v = z w+ w- |
986 | + add process h > all all |
987 | + add process h > v v, v > all all |
988 | + output |
989 | + launch |
990 | + OM: change output pythia8 syntax: If a path is specified this |
991 | + is considered as the output directory. |
992 | + OM: Change the path of the madevent output files. This allows |
993 | + running pythia/pgs/delphes multiple times for the same set |
994 | + of events (with different parameters). |
995 | + OM: MadEvent checks that the param_card is coherent with the |
996 | + restriction used during the model generation. |
997 | + OM: Restricted models will now also force opposite numbers to match |
998 | + (helpful for constraining rotation matrices). |
999 | OM: Change the import command. It's now allowed to omit the |
1000 | type of import. The type is guessed automatically. |
1001 | This is NOT allowed on the web. |
1002 | + OM: Add a check that the fermion flow is coherent with the |
1003 | + Lorentz structure associated with the vertex. |
1004 | + OM: Add a check that the color representation is coherent. |
1005 | + This allows detecting/fixing various problems linked to new |
1006 | + models created by FR and SARAH. |
1007 | OM: Change the default fortran compiler to gfortran. |
1008 | OM: Add the possibility to force which fortran compiler will |
1009 | be used, either via the configuration file or via the set |
1010 | @@ -40,40 +68,43 @@ |
1011 | their associated weight (for automatic order restriction) |
1012 | d) display couplings now returns the list of all couplings |
1013 | with the associated expression |
1014 | - OM: New Python script for the creation of the crossx.html/results.html page |
1015 | - Requiring less disk access for the generation of the file. |
1016 | + e) display interactions PART1 [PART2] [PART3] ... |
1017 | + display all interactions containing at least the particles given as arguments |
1018 | + OM: New Python script for the creation of the various html pages. |
1019 | + This requires less disk access for the generation of the files. |
1020 | OM: Modify error treatment, especially for Invalid commands |
1021 | and Configuration problems. |
1022 | - JA: Add new run mode, calculate_decay_widths, which calculates |
1023 | - decay widths for processes of type A > B C ... and inserts |
1024 | - decay widths and branching ratios into the param_card. |
1025 | - Example of use (after launch bin/mg5): |
1026 | - import model sm-full |
1027 | - generate t > b w+ |
1028 | - define all = p b b~ l+ l- ta+ ta- vl vl~ |
1029 | - add process w+ > all all |
1030 | - add process z > all all |
1031 | - define v = z w+ w- |
1032 | - add process h > all all |
1033 | - add process h > v v, v > all all |
1034 | - output |
1035 | - launch -i |
1036 | - calculate_decay_widths |
1037 | - JA: Ensure that we get zero cross section if we have |
1038 | + JA: Ensure that we get zero cross section if we have |
1039 | non-parton initial states with proton/antiproton beams |
1040 | - OM: Improve cluster support. MadEvent now supports PBS/Condor/SGE |
1041 | - Thanks to Arian Abrahantes for the SGE implementation. |
1042 | - OM: Improve the parralel suite and change the release script to run |
1043 | - some of the parralel suite. This ensures more stability of the |
1044 | - code for the future release. |
1045 | - |
1046 | - Thanks to Johan Alwall, Sho Iwamoto for all the important testing/bug reports |
1047 | + OM: Improve cluster support. MadEvent now supports PBS/Condor/SGE |
1048 | + Thanks to Arian Abrahantes for the SGE implementation. |
1049 | + OM: Improve auto-completion (better output/handling of multi-line input/...) |
1050 | + OM: Improve the parallel suite and change the release script to run |
1051 | + some of the parallel suite. This ensures more stability of the |
1052 | + code for future releases. |
1053 | + |
1054 | + |
1055 | + Thanks to Johan Alwall and Sho Iwamoto for all the important |
1056 | + testing/bug reports. |
1057 | + |
1058 | +1.3.32 (21/12/11) JA: Fixed a bug in the PDF reweighting routine, |
1059 | + which caused skewed eta distributions for |
1060 | + matched samples with pdfwgt=T. Thanks to Giulio |
1061 | + Lenzi and the CMS MC generation team for finding this. |
1062 | + |
1063 | +1.3.31 (29/11/11) OM: Fix an overflow bug in RAMBO (affects standalone |
1064 | + output only) |
1065 | + PdA (via OM): Change RS model (add a width to the spin2) |
1066 | + OM: Fix a bug in the cuts associated with the allowed mass of all |
1067 | + neutrinos+leptons (thanks to Brock Tweedie for finding it) |
1068 | + OM: Remove some limitations on particle names |
1069 | + |
1070 | |
1071 | 1.3.30 (18/11/11) OM: Fix a bug for the instalation of pythia-pgs on a 64 bit |
1072 | - UNIX machine. |
1073 | - OM: If ROOTSYS is define but root in the PATH, add it |
1074 | - automatically in create_matching_plots.sh |
1075 | - This is require for the UIUC cluster. |
1076 | + UNIX machine. |
1077 | + OM: If ROOTSYS is defined but root is not in the PATH, add it |
1078 | + automatically in create_matching_plots.sh. |
1079 | + This is required for the UIUC cluster. |
1080 | |
1081 | 1.3.29 (16/11/11) OM: Fixed particle identities in the Feynman diagram drawing |
1082 | JA: Fixed bug in pdf reweighting when external LHAPDF is used. |
1083 | |
1084 | === added file 'aloha/aloha_fct.py' |
1085 | --- aloha/aloha_fct.py 1970-01-01 00:00:00 +0000 |
1086 | +++ aloha/aloha_fct.py 2012-02-16 18:25:20 +0000 |
1087 | @@ -0,0 +1,76 @@ |
1088 | +################################################################################ |
1089 | +# |
1090 | +# Copyright (c) 2012 The ALOHA Development team and Contributors |
1091 | +# |
1092 | +# This file is a part of the ALOHA project, an application which |
1093 | +# automatically generates HELAS ROUTINES |
1094 | +# |
1095 | +# It is subject to the ALOHA license which should accompany this |
1096 | +# distribution. |
1097 | +# |
1098 | +# |
1099 | +################################################################################ |
1100 | +from aloha.aloha_object import * |
1101 | +import cmath |
1102 | + |
1103 | +class WrongFermionFlow(Exception): |
1104 | + pass |
1105 | + |
1106 | +################################################################################ |
1107 | +## CHECK FLOW VALIDITY OF A LORENTZ STRUCTURE |
1108 | +################################################################################ |
1109 | +def check_flow_validity(expression, nb_fermion): |
1110 | + """Check that the fermion flow follows the UFO convention |
1111 | + 1) Only one flow is defined and is 1 -> 2, 3 -> 4, ... |
1112 | + 2) that 1/3/... are on the left side of any Gamma matrices |
1113 | + """ |
1114 | + |
1115 | + assert nb_fermion != 0 and (nb_fermion % 2) == 0 |
1116 | + |
1117 | + # Need to expand the expression in order to have a simple sum of expression |
1118 | + expr = eval(expression) |
1119 | + expr = expr.simplify() |
1120 | + #expr is now a valid AddVariable object if they are a sum or |
1121 | + if expr.vartype != 1: # not AddVariable |
1122 | + expr = [expr] # put in a list to allow comparison |
1123 | + |
1124 | + for term in expr: |
1125 | + if term.vartype == 0: # Single object |
1126 | + if not term.spin_ind in [[1,2], [2,1]]: |
1127 | + raise WrongFermionFlow, 'Fermions should be the first particles of any interaction' |
1128 | + if isinstance(term, (Gamma, Gamma5, Sigma)): |
1129 | + if not term.spin_ind == [2,1]: |
1130 | + raise WrongFermionFlow, 'Not coherent Incoming/outcoming fermion flow' |
1131 | + |
1132 | + elif term.vartype == 2: # product of object |
1133 | + link, rlink = {}, {} |
1134 | + for obj in term: |
1135 | + if not obj.spin_ind: |
1136 | + continue |
1137 | + ind1, ind2 = obj.spin_ind |
1138 | + if isinstance(obj, (Gamma, Gamma5, Sigma)): |
1139 | + if (ind1 in range(1, nb_fermion+1) and ind1 % 2 == 1) or \ |
1140 | + (ind2 in range(2, nb_fermion+1) and ind2 % 2 == 0 ): |
1141 | + raise WrongFermionFlow, 'Not coherent Incoming/outcoming fermion flow' |
1142 | + if ind1 not in link.keys(): |
1143 | + link[ind1] = ind2 |
1144 | + else: |
1145 | + rlink[ind1] = ind2 |
1146 | + if ind2 not in link.keys(): |
1147 | + link[ind2] = ind1 |
1148 | + else: |
1149 | + rlink[ind2] = ind1 |
1150 | + for i in range(1, nb_fermion,2): |
1151 | + old = [] |
1152 | + pos = i |
1153 | + while 1: |
1154 | + old.append(pos) |
1155 | + if pos in link.keys() and link[pos] not in old: |
1156 | + pos = link[pos] |
1157 | + elif pos in rlink.keys() and rlink[pos] not in old: |
1158 | + pos = rlink[pos] |
1159 | + elif pos != i+1: |
1160 | + raise WrongFermionFlow, 'Not coherent Incoming/outcoming fermion flow' |
1161 | + elif pos == i+1: |
1162 | + break |
1163 | + |
1164 | |
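The `check_flow_validity` routine added above walks the chain of spin indices connected by Gamma-like objects and demands that each odd external index 1, 3, ... ends on its partner 2, 4, .... The traversal can be illustrated on plain index pairs; this sketch strips away the ALOHA objects and keeps only the link-following logic:

```python
def follows_ufo_flow(links, nb_fermion):
    """Simplified illustration of the traversal in check_flow_validity.
    `links` is a list of (ind1, ind2) spin-index edges; internal
    (summed) indices may appear. Returns True if every odd external
    index 1, 3, ... chains through the edges to its partner 2, 4, ..."""
    link, rlink = {}, {}
    for ind1, ind2 in links:
        # Each index can carry at most two connections: one in `link`,
        # a second one in `rlink`, exactly as in the original routine.
        if ind1 not in link:
            link[ind1] = ind2
        else:
            rlink[ind1] = ind2
        if ind2 not in link:
            link[ind2] = ind1
        else:
            rlink[ind2] = ind1
    for i in range(1, nb_fermion, 2):
        seen, pos = [], i
        while True:
            seen.append(pos)
            if pos in link and link[pos] not in seen:
                pos = link[pos]
            elif pos in rlink and rlink[pos] not in seen:
                pos = rlink[pos]
            elif pos != i + 1:
                return False  # chain ends away from the partner index
            else:
                break
    return True
```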
1165 | === modified file 'aloha/aloha_lib.py' |
1166 | --- aloha/aloha_lib.py 2011-08-09 18:34:24 +0000 |
1167 | +++ aloha/aloha_lib.py 2012-02-16 18:25:20 +0000 |
1168 | @@ -1715,3 +1715,9 @@ |
1169 | |
1170 | def __str__(self): |
1171 | return str(self.value) |
1172 | + |
1173 | + |
1174 | + |
1175 | + |
1176 | + |
1177 | + |
1178 | |
1179 | === modified file 'aloha/aloha_object.py' |
1180 | --- aloha/aloha_object.py 2011-08-09 18:34:24 +0000 |
1181 | +++ aloha/aloha_object.py 2012-02-16 18:25:20 +0000 |
1182 | @@ -478,7 +478,7 @@ |
1183 | (3,0): 0, (3,1): 0, (3,2): 0, (3,3): 1} |
1184 | |
1185 | def __init__(self, spin1, spin2, prefactor=1): |
1186 | - if spin1 < spin2: |
1187 | + if spin1 > spin2: |
1188 | aloha_lib.LorentzObject.__init__(self,[], [spin1, spin2], prefactor) |
1189 | else: |
1190 | aloha_lib.LorentzObject.__init__(self,[], [spin2, spin1], prefactor) |
1191 | |
1192 | === modified file 'aloha/aloha_writers.py' |
1193 | --- aloha/aloha_writers.py 2012-02-03 10:07:01 +0000 |
1194 | +++ aloha/aloha_writers.py 2012-02-16 18:25:20 +0000 |
1195 | @@ -23,7 +23,7 @@ |
1196 | def __init__(self, abstract_routine, dirpath): |
1197 | |
1198 | |
1199 | - name = get_routine_name(abstract_routine.name, abstract_routine.outgoing) |
1200 | + name = get_routine_name(abstract = abstract_routine) |
1201 | if dirpath: |
1202 | self.dir_out = dirpath |
1203 | self.out_path = os.path.join(dirpath, name + self.extension) |
1204 | @@ -39,7 +39,8 @@ |
1205 | self.comment = abstract_routine.infostr |
1206 | self.offshell = abstract_routine.outgoing |
1207 | self.symmetries = abstract_routine.symmetries |
1208 | - self.loop_routine = abstract_routine.loop |
1209 | + self.tag = abstract_routine.tag |
1210 | + self.loop_routine = 'L' in self.tag |
1211 | |
1212 | #prepare the necessary object |
1213 | self.collect_variables() # Look for the different variables |
1214 | @@ -517,7 +518,7 @@ |
1215 | else: |
1216 | sym = None # deactivate symetry |
1217 | |
1218 | - name = combine_name(self.abstractname, lor_names, offshell) |
1219 | + name = combine_name(self.abstractname, lor_names, offshell, self.tag) |
1220 | |
1221 | |
1222 | # write header |
1223 | @@ -535,21 +536,8 @@ |
1224 | text += ' double complex TMP(%s)\n integer i' % self.type_to_size[spin] |
1225 | |
1226 | # Define which part of the routine should be called |
1227 | - addon = '' |
1228 | - if 'C' in self.namestring: |
1229 | - short_name, addon = name.split('C',1) |
1230 | - if addon.split('_')[0].isdigit(): |
1231 | - addon = 'C' +self.namestring.split('C',1)[1] |
1232 | - elif all([n.isdigit() for n in addon.split('_')[0].split('C')]): |
1233 | - addon = 'C' +self.namestring.split('C',1)[1] |
1234 | - else: |
1235 | - addon = '_%s' % self.offshell |
1236 | - else: |
1237 | - addon = '_%s' % self.offshell |
1238 | - # StartValentin 30.01.2012 |
1239 | - if 'L' in self.namestring and self.LOOP_MODE: |
1240 | - addon = 'L'+addon |
1241 | - # EndValentin |
1242 | + addon = ''.join(self.tag) + '_%s' % self.offshell |
1243 | + |
1244 | # how to call the routine |
1245 | if not offshell: |
1246 | main = 'vertex' |
1247 | @@ -603,21 +591,38 @@ |
1248 | |
1249 | return text |
1250 | |
1251 | -def get_routine_name(name,outgoing): |
1252 | +def get_routine_name(name=None, outgoing=None, tag=None, abstract=None): |
1253 | """ build the name of the aloha function """ |
1254 | |
1255 | - return '%s_%s' % (name, outgoing) |
1256 | - |
1257 | -def combine_name(name, other_names, outgoing): |
1258 | + assert (name and outgoing) or abstract |
1259 | + |
1260 | + if tag is None: |
1261 | + tag = abstract.tag |
1262 | + |
1263 | + if name is None: |
1264 | + name = abstract.name + ''.join(tag) |
1265 | + |
1266 | + if outgoing is None: |
1267 | + outgoing = abstract.outgoing |
1268 | + |
1269 | + |
1270 | + return '%s_%s' % (name, outgoing) |
1271 | + |
1272 | +def combine_name(name, other_names, outgoing, tag=None): |
1273 | """ build the name for combined aloha function """ |
1274 | |
1275 | + |
1276 | + |
1277 | # Two possible scheme FFV1C1_2_X or FFV1__FFV2C1_X |
1278 | # If they are all in FFVX scheme then use the first |
1279 | p=re.compile('^(?P<type>[FSVT]+)(?P<id>\d+)') |
1280 | routine = '' |
1281 | if p.search(name): |
1282 | base, id = p.search(name).groups() |
1283 | - routine = name |
1284 | + if tag is not None: |
1285 | + routine = name + ''.join(tag) |
1286 | + else: |
1287 | + routine = name |
1288 | for s in other_names: |
1289 | try: |
1290 | base2,id2 = p.search(s).groups() |
1291 | @@ -632,18 +637,22 @@ |
1292 | if routine: |
1293 | return routine +'_%s' % outgoing |
1294 | |
1295 | - addon = '' |
1296 | - if 'C' in name: |
1297 | - short_name, addon = name.split('C',1) |
1298 | - try: |
1299 | - addon = 'C' + str(int(addon)) |
1300 | - except: |
1301 | - addon = '' |
1302 | - else: |
1303 | - name = short_name |
1304 | + if tag is not None: |
1305 | + addon = ''.join(tag) |
1306 | + else: |
1307 | + addon = '' |
1308 | + if 'C' in name: |
1309 | + short_name, addon = name.split('C',1) |
1310 | + try: |
1311 | + addon = 'C' + str(int(addon)) |
1312 | + except: |
1313 | + addon = '' |
1314 | + else: |
1315 | + name = short_name |
1316 | |
1317 | return '_'.join((name,) + tuple(other_names)) + addon + '_%s' % outgoing |
1318 | |
1319 | + |
1320 | class ALOHAWriterForCPP(WriteALOHA): |
1321 | """Routines for writing out helicity amplitudes as C++ .h and .cc files.""" |
1322 | |
1323 | @@ -860,7 +869,7 @@ |
1324 | def write_combined_h(self, lor_names, offshell=None, compiler_cmd=True): |
1325 | """Return the content of the .h file linked to multiple lorentz call.""" |
1326 | |
1327 | - name = combine_name(self.abstractname, lor_names, offshell) |
1328 | + name = combine_name(self.abstractname, lor_names, offshell, self.tag) |
1329 | text= '' |
1330 | if compiler_cmd: |
1331 | text = '#ifndef '+ name + '_guard\n' |
1332 | @@ -912,7 +921,7 @@ |
1333 | if offshell is None: |
1334 | offshell = self.offshell |
1335 | |
1336 | - name = combine_name(self.abstractname, lor_names, offshell) |
1337 | + name = combine_name(self.abstractname, lor_names, offshell, self.tag) |
1338 | |
1339 | text = '' |
1340 | if compiler_cmd: |
1341 | @@ -1032,7 +1041,7 @@ |
1342 | if offshell is None: |
1343 | offshell = self.offshell |
1344 | |
1345 | - name = combine_name(self.abstractname, lor_names, offshell) |
1346 | + name = combine_name(self.abstractname, lor_names, offshell, self.tag) |
1347 | |
1348 | h_text = self.write_combined_h(lor_names, offshell, **opt) |
1349 | cc_text = self.write_combined_cc(lor_names, offshell, sym=True, **opt) |
1350 | @@ -1246,7 +1255,7 @@ |
1351 | offshell = self.offshell |
1352 | else: |
1353 | sym = None |
1354 | - name = combine_name(self.abstractname, lor_names, offshell) |
1355 | + name = combine_name(self.abstractname, lor_names, offshell, self.tag) |
1356 | |
1357 | # write head - momenta - body - foot |
1358 | text = '' |
1359 | @@ -1314,4 +1323,4 @@ |
1360 | text += self.write_combined(lor_names, mode, elem) |
1361 | |
1362 | return text |
1363 | - |
1364 | \ No newline at end of file |
1365 | + |
1366 | |
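The refactoring above replaces the ad-hoc `'C' in name` string surgery with an explicit `tag` list carried by the abstract routine: conjugation and loop markers such as `C1` and `L` are simply joined onto the lorentz name before the outgoing index. A sketch of the resulting naming convention (an illustrative helper, not the actual `get_routine_name`):

```python
def routine_name(base, tag, outgoing):
    """Build an ALOHA-style routine name by appending the tag markers
    (e.g. ['C1', 'L']) to the lorentz name, then the outgoing index."""
    return '%s%s_%s' % (base, ''.join(tag), outgoing)
```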
1367 | === modified file 'aloha/create_aloha.py' |
1368 | --- aloha/create_aloha.py 2012-02-03 10:07:01 +0000 |
1369 | +++ aloha/create_aloha.py 2012-02-16 18:25:20 +0000 |
1370 | @@ -58,7 +58,8 @@ |
1371 | self.infostr = infostr |
1372 | self.symmetries = [] |
1373 | self.combined = [] |
1374 | - self.loop = False # Check if we need the denominator |
1375 | + #self.loop = False # Check if we need the denominator |
1376 | + self.tag = [] |
1377 | |
1378 | def add_symmetry(self, outgoing): |
1379 | """ add an outgoing """ |
1380 | @@ -158,10 +159,10 @@ |
1381 | # = C_ac G_bc (-1) C_bd = C_ac G_bc C_db |
1382 | self.routine_kernel = \ |
1383 | C(new_id, old_id + 1) * self.routine_kernel * C(new_id + 1, old_id) |
1384 | - self.name += 'C' |
1385 | |
1386 | - if pair: |
1387 | - self.name += str(pair) |
1388 | +# self.name += 'C' |
1389 | +# if pair: |
1390 | +# self.name += str(pair) |
1391 | self.conjg.append(pair) |
1392 | |
1393 | |
1394 | @@ -170,9 +171,12 @@ |
1395 | """ define a simple output for this AbstractRoutine """ |
1396 | |
1397 | infostr = str(self.lorentz_expr) |
1398 | - return AbstractRoutine(self.expr, self.outgoing, self.spins, self.name, \ |
1399 | + output = AbstractRoutine(self.expr, self.outgoing, self.spins, self.name, \ |
1400 | infostr) |
1401 | - |
1402 | + |
1403 | + output.tag += ['C%s' % pair for pair in self.conjg] |
1404 | + return output |
1405 | + |
1406 | def change_sign_for_outcoming_fermion(self): |
1407 | """change the sign of P for outcoming fermion in order to |
1408 | correct the mismatch convention between HELAS and FR""" |
1409 | @@ -187,8 +191,7 @@ |
1410 | momentum_pattern = re.compile(r'\bP\(([\+\-\d]+),(%s)\)' % '|'.join(flip_sign)) |
1411 | lorentz_expr = momentum_pattern.sub(r'P(\1,\2, -1)', self.lorentz_expr) |
1412 | return lorentz_expr |
1413 | - |
1414 | - |
1415 | + |
1416 | def compute_aloha_high_kernel(self, mode, factorize=True): |
1417 | """compute the abstract routine associate to this mode """ |
1418 | |
1419 | @@ -437,11 +440,12 @@ |
1420 | def set(self, lorentzname, outgoing, abstract_routine, loop=False): |
1421 | """ add in the dictionary """ |
1422 | |
1423 | - if loop and not abstract_routine.loop: |
1424 | + if loop and not 'L' in abstract_routine.tag: |
1425 | abstract_routine = copy.copy(abstract_routine) |
1426 | - abstract_routine.loop = True |
1427 | - abstract_routine.name += 'L' |
1428 | - |
1429 | + abstract_routine.tag.append('L') |
1430 | + |
1431 | + |
1432 | + |
1433 | self[(lorentzname, outgoing)] = abstract_routine |
1434 | |
1435 | def compute_all(self, save=True, wanted_lorentz = []): |
1436 | @@ -504,11 +508,10 @@ |
1437 | |
1438 | def compute_subset(self, data): |
1439 | """ create the requested ALOHA routine. |
1440 | - data should be a list of tuple (lorentz, conjugate, outgoing, loop) |
1441 | - if loop is not given it's suppose to be False. |
1442 | - conjugate should be a tuple with the pair number to conjugate. |
1443 | - outgoing a tuple of the requested routines.""" |
1444 | - |
1445 | + data should be a list of tuples (lorentz, tag, outgoing) |
1446 | + tag should be the list of special tags (like conjugation of a pair) |
1447 | + to apply to the object """ |
1448 | + |
1449 | # Search identical particles in the vertices in order to avoid |
1450 | #to compute identical contribution |
1451 | self.look_for_symmetries() |
1452 | @@ -518,14 +521,12 @@ |
1453 | request = {} |
1454 | |
1455 | #Check Loop status |
1456 | - aloha_writers.WriteALOHA.LOOP_MODE = any([len(d)==4 for d in data]) |
1457 | + aloha_writers.WriteALOHA.LOOP_MODE = any(['L' in tag for l,tag,out in data]) |
1458 | # Add loop attribut for those which are not defined |
1459 | |
1460 | - for i, d in enumerate(data): |
1461 | - if len(d) == 3: |
1462 | - data[i] = data[i] + (False,) |
1463 | - |
1464 | - for list_l_name, conjugate, outgoing, loop in data: |
1465 | + for list_l_name, tag, outgoing in data: |
1466 | + conjugate = tuple([int(c[1:]) for c in tag if c.startswith('C')]) |
1467 | + loop = ('L' in tag) |
1468 | for l_name in list_l_name: |
1469 | try: |
1470 | request[l_name][conjugate].append((outgoing, loop)) |
1471 | @@ -565,20 +566,10 @@ |
1472 | routines_loop=outgoing_loop ) |
1473 | |
1474 | # Build mutiple lorentz call |
1475 | - for list_l_name, conjugate, outgoing, loop in data: |
1476 | + for list_l_name, tag, outgoing in data: |
1477 | if len(list_l_name) >1: |
1478 | - lorentzname = list_l_name[0] |
1479 | - # StartValentin 30.01.2012 |
1480 | - others=list(list_l_name[1:]) |
1481 | - for c in conjugate: |
1482 | - lorentzname += 'C%s' % c |
1483 | - if loop: |
1484 | - lorentzname += 'L' |
1485 | -# for i, name in enumerate(others): |
1486 | -# others[i]=name+'L' |
1487 | -# self[(lorentzname, outgoing)].add_combine(list_l_name[1:]) |
1488 | - # EndValentin 30.01.2012 |
1489 | - self[(lorentzname, outgoing)].add_combine(tuple(others)) |
1490 | + lorentzname = list_l_name[0] + ''.join(tag) |
1491 | + self[(lorentzname, outgoing)].add_combine(list_l_name[1:]) |
1492 | |
1493 | def compute_aloha(self, builder, symmetry=None, routines=None, routines_loop=None): |
1494 | """ define all the AbstractRoutine linked to a given lorentz structure |
1495 | @@ -604,7 +595,8 @@ |
1496 | else: |
1497 | wavefunction = builder.compute_routine(outgoing) |
1498 | #Store the information |
1499 | - self.set(name, outgoing, wavefunction) |
1500 | + realname = name + ''.join(wavefunction.tag) |
1501 | + self.set(realname, outgoing, wavefunction) |
1502 | |
1503 | # Creates the Loop routines |
1504 | for outgoing in routines_loop: |
1505 | |
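The `create_aloha.py` hunks above replace the old `(lorentz, conjugate, outgoing, loop)` tuples with `(lorentz, tag, outgoing)`, where the tag is a list of strings such as `C1` (conjugation on pair 1) or `L` (loop routine). A minimal standalone sketch of that convention, mirroring the decoding done in `compute_subset` (the helper names are hypothetical, not part of the diff):

```python
# Sketch of the tag convention introduced in this diff: routine metadata
# such as conjugation pairs and loop status is carried as a list of string
# tags instead of separate tuple fields.
def decode_tag(tag):
    """Split an ALOHA tag list into conjugated pair numbers and a loop flag."""
    conjugate = tuple(int(t[1:]) for t in tag if t.startswith('C'))
    loop = 'L' in tag
    return conjugate, loop

def routine_name(lorentz, tag):
    """Build the routine name by appending the tags, e.g. FFV1 + ['C1', 'L']."""
    return lorentz + ''.join(tag)
```

This matches the diff's `conjugate = tuple([int(c[1:]) for c in tag if c.startswith('C')])` and the `name + ''.join(wavefunction.tag)` naming in `compute_aloha`.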
1506 | === modified file 'bin/mg5' |
1507 | --- bin/mg5 2011-09-13 02:02:28 +0000 |
1508 | +++ bin/mg5 2012-02-16 18:25:20 +0000 |
1509 | @@ -60,6 +60,7 @@ |
1510 | |
1511 | import logging |
1512 | import logging.config |
1513 | +import madgraph.interface.coloring_logging |
1514 | |
1515 | try: |
1516 | import readline |
1517 | |
1518 | === modified file 'input/mg5_configuration.txt' |
1519 | --- input/mg5_configuration.txt 2012-01-05 12:58:34 +0000 |
1520 | +++ input/mg5_configuration.txt 2012-02-16 18:25:20 +0000 |
1521 | @@ -14,7 +14,9 @@ |
1522 | ################################################################################ |
1523 | # |
1524 | # This File contains some configuration variable for MadGraph/MadEvent |
1525 | -# This File is use if the file ~/.mg5/mg5_configuration.txt is not present. |
1526 | +# This File is use if the file ~/.mg5/mg5_configuration.txt is NOT present. |
1527 | +# If you place this files in ~/.mg5/mg5_configuration.txt then all path should |
1528 | +# be absolute. |
1529 | # |
1530 | ################################################################################ |
1531 | |
1532 | @@ -39,7 +41,7 @@ |
1533 | |
1534 | # Prefered Fortran Compiler |
1535 | # If None: try to find g77 or gfortran on the system |
1536 | -fortran_compiler = gfortran |
1537 | +fortran_compiler = None |
1538 | |
1539 | |
1540 | ################################################################################ |
1541 | |
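The configuration hunks above set `fortran_compiler = None` (meaning auto-detect), and the matching `set_configuration` change further down maps any `"none"` value to Python `None`. A minimal sketch of that parsing behaviour, under the assumption that the file format is simple `name = value` lines with `#` comments (the function name is illustrative, not the code's own):

```python
# Sketch of mg5_configuration.txt-style parsing: '#' starts a comment and a
# value of "None" (case-insensitive) means "unset / auto-detect", as the
# set_configuration hunk in this diff does.
def parse_config(text):
    config = {}
    for line in text.splitlines():
        line = line.split('#', 1)[0].strip()   # drop comments
        if '=' not in line:
            continue
        name, value = [x.strip() for x in line.split('=', 1)]
        config[name] = None if value.lower() == 'none' else value
    return config
```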
1542 | === modified file 'madgraph/VERSION' |
1543 | --- madgraph/VERSION 2011-12-07 00:15:33 +0000 |
1544 | +++ madgraph/VERSION 2012-02-16 18:25:20 +0000 |
1545 | @@ -1,3 +1,3 @@ |
1546 | -version = 1.4.0.beta_6 |
1547 | -date = 2011-12-1 |
1548 | +version = 1.4.0.beta_10 |
1549 | +date = 2012-01-19 |
1550 | |
1551 | |
1552 | === modified file 'madgraph/core/base_objects.py' |
1553 | --- madgraph/core/base_objects.py 2011-12-19 17:55:05 +0000 |
1554 | +++ madgraph/core/base_objects.py 2012-02-16 18:25:20 +0000 |
1555 | @@ -80,7 +80,8 @@ |
1556 | |
1557 | if name not in self.keys(): |
1558 | raise self.PhysicsObjectError, \ |
1559 | - "%s is not a valid property for this object" % name |
1560 | + "%s is not a valid property for this object: %s" % \ |
1561 | + (name, self.__class__.__name__) |
1562 | |
1563 | return True |
1564 | |
1565 | @@ -228,9 +229,8 @@ |
1566 | """Filter for valid particle property values.""" |
1567 | |
1568 | if name in ['name', 'antiname']: |
1569 | - # Must start with a letter, followed by letters, digits, |
1570 | - # - and + only |
1571 | - p = re.compile('\A[a-zA-Z]+[\w]*[\-\+]*~?\Z') |
1572 | + # Forbid special character but +-~_ |
1573 | + p=re.compile('''^[\w\-\+~_]+$''') |
1574 | if not p.match(value): |
1575 | raise self.PhysicsObjectError, \ |
1576 | "%s is not a valid particle name" % value |
1577 | |
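The `base_objects.py` hunk relaxes the particle-name filter: the old pattern required a leading letter, while the new one accepts any word characters plus `-`, `+` and `~` anywhere in the name. A small sketch contrasting the two patterns taken directly from the diff:

```python
import re

# Old and new particle-name validation patterns from this diff. The old
# pattern demanded a leading letter with '-'/'+' and '~' only at the end;
# the new one allows word characters, '-', '+', '~' and '_' anywhere.
old_pattern = re.compile(r'\A[a-zA-Z]+[\w]*[\-\+]*~?\Z')
new_pattern = re.compile(r'^[\w\-\+~_]+$')

def is_valid_name(name, pattern=new_pattern):
    return bool(pattern.match(name))
```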
1578 | === modified file 'madgraph/core/helas_objects.py' |
1579 | --- madgraph/core/helas_objects.py 2012-01-05 12:58:34 +0000 |
1580 | +++ madgraph/core/helas_objects.py 2012-02-16 18:25:20 +0000 |
1581 | @@ -3649,10 +3649,10 @@ |
1582 | return self.generate_color_amplitudes(self['color_basis'],self['diagrams']) |
1583 | |
1584 | def get_used_lorentz(self): |
1585 | - """Return a list of (lorentz_name, conjugate, outgoing) with |
1586 | + """Return a list of (lorentz_name, conjugate_tag, outgoing) with |
1587 | all lorentz structures used by this HelasMatrixElement.""" |
1588 | |
1589 | - return [(tuple(wa.get('lorentz')), tuple(wa.get_conjugate_index()), |
1590 | + return [(tuple(wa.get('lorentz')), tuple(['C%s' % w for w in wa.get_conjugate_index()]), |
1591 | wa.find_outgoing_number()) for wa in \ |
1592 | self.get_all_wavefunctions() + self.get_all_amplitudes() \ |
1593 | if wa.get('interaction_id') != 0] |
1594 | |
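The `get_used_lorentz` change above turns numeric conjugate-pair indices into string tags of the form `C<n>`, so the result feeds directly into the tag-based ALOHA interface. The conversion in isolation (a trivial sketch, with a hypothetical helper name):

```python
# Sketch of the conjugate-index-to-tag conversion from get_used_lorentz:
# numeric pair indices become 'C<n>' string tags.
def conjugate_tags(indices):
    return tuple('C%s' % i for i in indices)
```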
1595 | === modified file 'madgraph/interface/.mg5_logging.conf' |
1596 | --- madgraph/interface/.mg5_logging.conf 2011-07-26 16:15:01 +0000 |
1597 | +++ madgraph/interface/.mg5_logging.conf 2012-02-16 18:25:20 +0000 |
1598 | @@ -9,13 +9,16 @@ |
1599 | |
1600 | |
1601 | [formatter_cmdprint] |
1602 | -format: %(message)s |
1603 | +class: logging.ColorFormatter |
1604 | +format: $COLOR%(message)s$RESET |
1605 | |
1606 | [formatter_simple] |
1607 | +class: logging.ColorFormatter |
1608 | format: %(name)s: %(message)s |
1609 | |
1610 | [formatter_madgraph] |
1611 | -format: %(levelname)s: %(message)s |
1612 | +class: logging.ColorFormatter |
1613 | +format: $COLOR%(levelname)s: %(message)s $RESET |
1614 | |
1615 | [handler_cmdprint] |
1616 | class: StreamHandler |
1617 | |
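The logging config above declares `class: logging.ColorFormatter` and uses `$COLOR`/`$RESET` placeholders in the format strings; the formatter itself is added in `madgraph/interface/coloring_logging.py` later in this diff (its constants appear there, but the chunk cuts off before the class body). The following is therefore a sketch of the substitution mechanism, not the actual implementation: an assumed `format` method that replaces the placeholders with ANSI escape sequences chosen by log level.

```python
import logging

# Hedged sketch of a ColorFormatter compatible with the $COLOR/$RESET
# placeholders in the config above. The constants mirror those added in
# coloring_logging.py; the format() body is an assumption.
RESET_SEQ = "\033[0m"
COLOR_SEQ = "\033[1;%dm"
BLACK, RED, GREEN, YELLOW, BLUE = range(5)
COLORS = {'WARNING': RED, 'INFO': BLACK, 'DEBUG': BLUE,
          'CRITICAL': RED, 'ERROR': RED}

class ColorFormatter(logging.Formatter):
    def format(self, record):
        msg = logging.Formatter.format(self, record)
        color = COLOR_SEQ % (30 + COLORS.get(record.levelname, BLACK))
        return msg.replace('$COLOR', color).replace('$RESET', RESET_SEQ)
```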
1618 | === modified file 'madgraph/interface/__init__.py' |
1619 | --- madgraph/interface/__init__.py 2010-06-16 13:50:52 +0000 |
1620 | +++ madgraph/interface/__init__.py 2012-02-16 18:25:20 +0000 |
1621 | @@ -0,0 +1,1 @@ |
1622 | + |
1623 | |
1624 | === modified file 'madgraph/interface/cmd_interface.py' |
1625 | --- madgraph/interface/cmd_interface.py 2012-02-15 12:45:43 +0000 |
1626 | +++ madgraph/interface/cmd_interface.py 2012-02-16 18:25:20 +0000 |
1627 | @@ -239,20 +239,20 @@ |
1628 | " FILENAME") |
1629 | logger.info("-- imports file(s) in various formats") |
1630 | logger.info("") |
1631 | - logger.info(" import model MODEL[-RESTRICTION] [-modelname]:") |
1632 | + logger.info(" import model MODEL[-RESTRICTION] [--modelname]:") |
1633 | logger.info(" Import a UFO model.") |
1634 | logger.info(" MODEL should be a valid UFO model name") |
1635 | logger.info(" Model restrictions are specified by MODEL-RESTRICTION") |
1636 | logger.info(" with the file restrict_RESTRICTION.dat in the model dir.") |
1637 | logger.info(" By default, restrict_default.dat is used.") |
1638 | logger.info(" Specify model_name-full to get unrestricted model.") |
1639 | - logger.info(" -modelname keeps the original particle names for the model") |
1640 | + logger.info(" '--modelname' keeps the original particle names for the model") |
1641 | logger.info("") |
1642 | - logger.info(" import model_v4 MODEL [-modelname] :") |
1643 | + logger.info(" import model_v4 MODEL [--modelname] :") |
1644 | logger.info(" Import an MG4 model.") |
1645 | logger.info(" Model should be the name of the model") |
1646 | logger.info(" or the path to theMG4 model directory") |
1647 | - logger.info(" -modelname keeps the original particle names for the model") |
1648 | + logger.info(" '--modelname' keeps the original particle names for the model") |
1649 | logger.info("") |
1650 | logger.info(" import proc_v4 [PATH] :" ) |
1651 | logger.info(" Execute MG5 based on a proc_card.dat in MG4 format.") |
1652 | @@ -308,8 +308,9 @@ |
1653 | logger.info(" - If mode is standalone_cpp, create a standalone C++") |
1654 | logger.info(" directory in \"path\".") |
1655 | logger.info(" - If mode is pythia8, output all files needed to generate") |
1656 | - logger.info(" the processes using Pythia 8. Directory \"path\"") |
1657 | - logger.info(" should be a Pythia 8 main directory (v. 8.150 or later).") |
1658 | + logger.info(" the processes using Pythia 8. The files are written in") |
1659 | + logger.info(" the Pythia 8 directory (default).") |
1660 | + logger.info(" NOTE: The Pythia 8 directory is set in the ./input/mg5_configuration.txt") |
1661 | logger.info(" path: The path of the process directory.") |
1662 | logger.info(" If you put '.' as path, your pwd will be used.") |
1663 | logger.info(" If you put 'auto', an automatic directory PROC_XX_n will be created.") |
1664 | @@ -326,7 +327,7 @@ |
1665 | |
1666 | def help_check(self): |
1667 | |
1668 | - logger.info("syntax: check " + "|".join(self._check_opts) + " [param_card] process_definition") |
1669 | + logger.info("syntax: check [" + "|".join(self._check_opts) + "] [param_card] process_definition") |
1670 | logger.info("-- check a process or set of processes. Options:") |
1671 | logger.info("full: Perform all three checks described below:") |
1672 | logger.info(" permutation, gauge and lorentz_invariance.") |
1673 | @@ -382,12 +383,14 @@ |
1674 | def help_set(self): |
1675 | logger.info("syntax: set %s argument" % "|".join(self._set_options)) |
1676 | logger.info("-- set options for generation or output") |
1677 | - logger.info(" group_subprocesses True/False: ") |
1678 | - logger.info(" (default True) Smart grouping of subprocesses into ") |
1679 | + logger.info(" group_subprocesses True/False/Auto: ") |
1680 | + logger.info(" (default Auto) Smart grouping of subprocesses into ") |
1681 | logger.info(" directories, mirroring of initial states, and ") |
1682 | logger.info(" combination of integration channels.") |
1683 | logger.info(" Example: p p > j j j w+ gives 5 directories and 184 channels") |
1684 | logger.info(" (cf. 65 directories and 1048 channels for regular output)") |
1685 | + logger.info(" Auto means False for decay computation and True for") |
1686 | + logger.info(" collisions.") |
1687 | logger.info(" ignore_six_quark_processes multi_part_label") |
1688 | logger.info(" (default none) ignore processes with at least 6 of any") |
1689 | logger.info(" of the quarks given in multi_part_label.") |
1690 | @@ -461,9 +464,12 @@ |
1691 | syntax: display XXXXX |
1692 | """ |
1693 | |
1694 | - if len(args) < 1 or args[0] not in self._display_opts: |
1695 | - self.help_display() |
1696 | - raise self.InvalidCmd |
1697 | + if len(args) < 1: |
1698 | + self.help_display() |
1699 | + raise self.InvalidCmd, 'display requires an argument specifying what to display' |
1700 | + if args[0] not in self._display_opts: |
1701 | + self.help_display() |
1702 | + raise self.InvalidCmd, 'Invalid arguments for display command: %s' % args[0] |
1703 | |
1704 | if not self._curr_model: |
1705 | raise self.InvalidCmd("No model currently active, please import a model!") |
1706 | @@ -503,7 +509,7 @@ |
1707 | |
1708 | if len(args) < 2: |
1709 | self.help_check() |
1710 | - raise self.InvalidCmd("\"check\" requires an argument and a process.") |
1711 | + raise self.InvalidCmd("\"check\" requires a process.") |
1712 | |
1713 | param_card = None |
1714 | if os.path.isfile(args[1]): |
1715 | @@ -569,7 +575,10 @@ |
1716 | modelname = False |
1717 | if '-modelname' in args: |
1718 | args.remove('-modelname') |
1719 | - modelname = True |
1720 | + modelname = True |
1721 | + elif '--modelname' in args: |
1722 | + args.remove('--modelname') |
1723 | + modelname = True |
1724 | |
1725 | if not args: |
1726 | self.help_import() |
1727 | @@ -664,8 +673,9 @@ |
1728 | path = os.path.realpath(args[0]) |
1729 | else: |
1730 | raise self.InvalidCmd, '%s is not a valid directory' % args[0] |
1731 | - |
1732 | + |
1733 | mode = self.find_output_type(path) |
1734 | + |
1735 | args[0] = mode |
1736 | args.append(path) |
1737 | # inform where we are for future command |
1738 | @@ -714,7 +724,10 @@ |
1739 | |
1740 | if os.path.isfile(pjoin(include_path, 'Pythia.h')): |
1741 | return 'pythia8' |
1742 | - elif os.path.isdir(src_path): |
1743 | + elif not os.path.isdir(os.path.join(path, 'SubProcesses')): |
1744 | + raise self.InvalidCmd, '%s : Not a valid directory' % path |
1745 | + |
1746 | + if os.path.isdir(src_path): |
1747 | return 'standalone_cpp' |
1748 | elif os.path.isfile(pjoin(bin_path,'generate_events')): |
1749 | return 'madevent' |
1750 | @@ -751,8 +764,8 @@ |
1751 | self._set_options) |
1752 | |
1753 | if args[0] in ['group_subprocesses']: |
1754 | - if args[1] not in ['False', 'True']: |
1755 | - raise self.InvalidCmd('%s needs argument False or True' % \ |
1756 | + if args[1] not in ['False', 'True', 'Auto']: |
1757 | + raise self.InvalidCmd('%s needs argument False, True or Auto' % \ |
1758 | args[0]) |
1759 | if args[0] in ['ignore_six_quark_processes']: |
1760 | if args[1] not in self._multiparticles.keys(): |
1761 | @@ -835,12 +848,23 @@ |
1762 | if path == 'auto' and self._export_format in \ |
1763 | ['madevent', 'standalone', 'standalone_cpp']: |
1764 | self.get_default_path() |
1765 | - else: |
1766 | + elif path != 'auto': |
1767 | self._export_dir = path |
1768 | + elif path == 'auto': |
1769 | + if self.configuration['pythia8_path']: |
1770 | + self._export_dir = self.configuration['pythia8_path'] |
1771 | + else: |
1772 | + self._export_dir = '.' |
1773 | else: |
1774 | - # No valid path |
1775 | - self.get_default_path() |
1776 | - |
1777 | + if self._export_format != 'pythia8': |
1778 | + # No valid path |
1779 | + self.get_default_path() |
1780 | + else: |
1781 | + if self.configuration['pythia8_path']: |
1782 | + self._export_dir = self.configuration['pythia8_path'] |
1783 | + else: |
1784 | + self._export_dir = '.' |
1785 | + |
1786 | self._export_dir = os.path.realpath(self._export_dir) |
1787 | |
1788 | def get_default_path(self): |
1789 | @@ -882,8 +906,8 @@ |
1790 | auto_path = lambda i: pjoin(self.writing_dir, |
1791 | name_dir(i)) |
1792 | elif self._export_format == 'pythia8': |
1793 | - if self.pythia8_path: |
1794 | - self._export_dir = self.pythia8_path |
1795 | + if self.configuration['pythia8_path']: |
1796 | + self._export_dir = self.configuration['pythia8_path'] |
1797 | else: |
1798 | self._export_dir = '.' |
1799 | return |
1800 | @@ -1290,7 +1314,6 @@ |
1801 | |
1802 | def complete_set(self, text, line, begidx, endidx): |
1803 | "Complete the set command" |
1804 | - |
1805 | args = self.split_arg(line[0:begidx]) |
1806 | |
1807 | # Format |
1808 | @@ -1300,7 +1323,7 @@ |
1809 | |
1810 | if len(args) == 2: |
1811 | if args[1] in ['group_subprocesses']: |
1812 | - return self.list_completion(text, ['False', 'True']) |
1813 | + return self.list_completion(text, ['False', 'True', 'Auto']) |
1814 | |
1815 | elif args[1] in ['ignore_six_quark_processes']: |
1816 | return self.list_completion(text, self._multiparticles.keys()) |
1817 | @@ -1322,7 +1345,7 @@ |
1818 | |
1819 | def complete_import(self, text, line, begidx, endidx): |
1820 | "Complete the import command" |
1821 | - |
1822 | + |
1823 | args=self.split_arg(line[0:begidx]) |
1824 | |
1825 | # Format |
1826 | @@ -1334,68 +1357,68 @@ |
1827 | elif args[1] in self._import_formats: |
1828 | mode = args[1] |
1829 | else: |
1830 | + args.insert(1, 'all') |
1831 | mode = 'all' |
1832 | |
1833 | + |
1834 | + completion_categories = {} |
1835 | # restriction continuation (for UFO) |
1836 | - if mode in ['model', 'all'] and ('-' in args[-1] + text): |
1837 | + if mode in ['model', 'all'] and '-' in text: |
1838 | # deal with - in readline splitting (different on some computer) |
1839 | - if not GNU_SPLITTING: |
1840 | - prefix = '-'.join([part for part in text.split('-')[:-1]])+'-' |
1841 | - args.append(prefix) |
1842 | - text = text.split('-')[-1] |
1843 | - #model name |
1844 | - path = args[-1][:-1] # remove the final - for the model name |
1845 | + path = '-'.join([part for part in text.split('-')[:-1]]) |
1846 | + # remove the final - for the model name |
1847 | # find the different possibilities |
1848 | all_name = self.find_restrict_card(path, no_restrict=False) |
1849 | all_name += self.find_restrict_card(path, no_restrict=False, |
1850 | base_dir=pjoin(MG5DIR,'models')) |
1851 | |
1852 | - # select the possibility according to the current line |
1853 | - all_name = [name.split('-')[-1] for name in all_name ] |
1854 | + # select the possibility according to the current line |
1855 | all_name = [name+' ' for name in all_name if name.startswith(text) |
1856 | and name.strip() != text] |
1857 | - # adapt output for python2.7 (due to different splitting) |
1858 | - if not GNU_SPLITTING: |
1859 | - all_name = [prefix + name for name in all_name ] |
1860 | + |
1861 | |
1862 | if all_name: |
1863 | - return all_name |
1864 | + completion_categories['Restricted model'] = all_name |
1865 | |
1866 | # Path continuation |
1867 | - if os.path.sep in args[-1] + text and text: |
1868 | - if mode.startswith('model'): |
1869 | + if os.path.sep in args[-1]: |
1870 | + if mode.startswith('model') or mode == 'all': |
1871 | # Directory continuation |
1872 | - cur_path = pjoin('.',*[a for a in args \ |
1873 | + try: |
1874 | + cur_path = pjoin(*[a for a in args \ |
1875 | if a.endswith(os.path.sep)]) |
1876 | - all_dir = self.path_completion(text, cur_path, only_dirs = True) |
1877 | - if mode == 'model_v4': |
1878 | - return all_dir |
1879 | - new = [] |
1880 | - data = [new.__iadd__(self.find_restrict_card(name, base_dir=cur_path)) |
1881 | - for name in all_dir] |
1882 | - if data: |
1883 | - return data[0] |
1884 | + except: |
1885 | + pass |
1886 | else: |
1887 | - return [] |
1888 | - else: |
1889 | - cur_path = pjoin('.',*[a for a in args \ |
1890 | - if a.endswith(os.path.sep)]) |
1891 | - all_path = self.path_completion(text, cur_path) |
1892 | - if mode == 'all': |
1893 | - new = [] |
1894 | - data = [new.__iadd__(self.find_restrict_card(name, base_dir=cur_path)) |
1895 | - for name in all_path] |
1896 | + all_dir = self.path_completion(text, cur_path, only_dirs = True) |
1897 | + if mode in ['model_v4','all']: |
1898 | + completion_categories['Path Completion'] = all_dir |
1899 | + # Only UFO model here |
1900 | + new = [] |
1901 | + data = [new.__iadd__(self.find_restrict_card(name, base_dir=cur_path)) |
1902 | + for name in all_dir] |
1903 | if data: |
1904 | - return data[0] |
1905 | + completion_categories['Path Completion'] = all_dir + new |
1906 | + else: |
1907 | + try: |
1908 | + cur_path = pjoin(*[a for a in args \ |
1909 | + if a.endswith(os.path.sep)]) |
1910 | + except: |
1911 | + pass |
1912 | + else: |
1913 | + all_path = self.path_completion(text, cur_path) |
1914 | + if mode == 'all': |
1915 | + new = [] |
1916 | + data = [new.__iadd__(self.find_restrict_card(name, base_dir=cur_path)) |
1917 | + for name in all_path] |
1918 | + if data: |
1919 | + completion_categories['Path Completion'] = data[0] |
1920 | else: |
1921 | - return [] |
1922 | - else: |
1923 | - return all_path |
1924 | - |
1925 | - |
1926 | + completion_categories['Path Completion'] = all_path |
1927 | + |
1928 | # Model directory name if directory is not given |
1929 | - if (mode != 'all' and len(self.split_arg(line[0:begidx])) == 2) or \ |
1930 | - (mode =='all' and len(self.split_arg(line[0:begidx]))==1): |
1931 | + if (len(args) == 2): |
1932 | + is_model = True |
1933 | if mode == 'model': |
1934 | file_cond = lambda p : os.path.exists(pjoin(MG5DIR,'models',p,'particles.py')) |
1935 | mod_name = lambda name: name |
1936 | @@ -1412,46 +1435,42 @@ |
1937 | cur_path = pjoin('.',*[a for a in args \ |
1938 | if a.endswith(os.path.sep)]) |
1939 | all_path = self.path_completion(text, cur_path) |
1940 | - return all_path |
1941 | + completion_categories['model name'] = all_path |
1942 | + is_model = False |
1943 | + |
1944 | + if is_model: |
1945 | + model_list = [mod_name(name) for name in \ |
1946 | + self.path_completion(text, |
1947 | + pjoin(MG5DIR,'models'), |
1948 | + only_dirs = True) \ |
1949 | + if file_cond(name)] |
1950 | |
1951 | - model_list = [mod_name(name) for name in \ |
1952 | - self.path_completion(text, |
1953 | - pjoin(MG5DIR,'models'), |
1954 | - only_dirs = True) \ |
1955 | - if file_cond(name)] |
1956 | - |
1957 | - if mode == 'model_v4': |
1958 | - return model_list |
1959 | - else: |
1960 | - # need to update the list with the possible restriction |
1961 | - all_name = [] |
1962 | - for model_name in model_list: |
1963 | - all_name += self.find_restrict_card(model_name, |
1964 | - base_dir=pjoin(MG5DIR,'models')) |
1965 | - if mode == 'all': |
1966 | - cur_path = pjoin('.',*[a for a in args \ |
1967 | - if a.endswith(os.path.sep)]) |
1968 | - all_path = self.path_completion(text, cur_path) |
1969 | - return all_path + all_name |
1970 | - else: |
1971 | - return all_name |
1972 | + if mode == 'model_v4': |
1973 | + completion_categories['model name'] = model_list |
1974 | + else: |
1975 | + # need to update the list with the possible restriction |
1976 | + all_name = [] |
1977 | + for model_name in model_list: |
1978 | + all_name += self.find_restrict_card(model_name, |
1979 | + base_dir=pjoin(MG5DIR,'models')) |
1980 | + if mode == 'all': |
1981 | + cur_path = pjoin('.',*[a for a in args \ |
1982 | + if a.endswith(os.path.sep)]) |
1983 | + all_path = self.path_completion(text, cur_path) |
1984 | + completion_categories['model name'] = all_path + all_name |
1985 | + elif mode == 'model': |
1986 | + completion_categories['model name'] = all_name |
1987 | |
1988 | # Options |
1989 | if mode == 'all' and len(args)>1: |
1990 | - mode = self.find_import_type(args[1]) |
1991 | - opt_len = 1 |
1992 | - else: |
1993 | - opt_len =2 |
1994 | - |
1995 | - |
1996 | - if len(args) > opt_len and mode.startswith('model') and args[-1][0] != '-': |
1997 | - return ['-modelname'] |
1998 | - |
1999 | - if len(args) > (opt_len+1) and mode[1].startswith('model') and args[-1][0] == '-': |
2000 | - if GNU_SPLITTING: |
2001 | - return ['modelname'] |
2002 | - else: |
2003 | - return ['-modelname'] |
2004 | + mode = self.find_import_type(args[2]) |
2005 | + |
2006 | + if len(args) >= 3 and mode.startswith('model') and not '-modelname' in line: |
2007 | + if not text and not completion_categories: |
2008 | + return ['--modelname'] |
2009 | + elif not (os.path.sep in args[-1] and line[-1] != ' '): |
2010 | + completion_categories['options'] = self.list_completion(text, ['--modelname','-modelname']) |
2011 | + return self.deal_multiple_categories(completion_categories) |
2012 | |
2013 | |
2014 | |
2015 | @@ -1528,8 +1547,6 @@ |
2016 | _curr_exporter = None |
2017 | _done_export = False |
2018 | |
2019 | - # Configuration variable |
2020 | - pythia8_path = None |
2021 | |
2022 | def __init__(self, mgme_dir = '', *completekey, **stdin): |
2023 | """ add a tracker of the history """ |
2024 | @@ -1568,7 +1585,7 @@ |
2025 | self._cuttools_dir=str(os.path.join(self._mgme_dir,'loop_material','CutTools')) |
2026 | |
2027 | # Set defaults for options |
2028 | - self._options['group_subprocesses'] = True |
2029 | + self._options['group_subprocesses'] = 'Auto' |
2030 | self._options['ignore_six_quark_processes'] = False |
2031 | |
2032 | # Load the configuration file |
2033 | @@ -1628,7 +1645,10 @@ |
2034 | cpu_time1 = time.time() |
2035 | |
2036 | # Generate processes |
2037 | - collect_mirror_procs = self._options['group_subprocesses'] |
2038 | + if self._options['group_subprocesses'] == 'Auto': |
2039 | + collect_mirror_procs = True |
2040 | + else: |
2041 | + collect_mirror_procs = self._options['group_subprocesses'] |
2042 | ignore_six_quark_processes = \ |
2043 | self._options['ignore_six_quark_processes'] if \ |
2044 | "ignore_six_quark_processes" in self._options \ |
2045 | @@ -1695,7 +1715,7 @@ |
2046 | self.multiparticle_string(label)) |
2047 | |
2048 | # Display |
2049 | - def do_display(self, line): |
2050 | + def do_display(self, line, output=sys.stdout): |
2051 | """Display current internal status""" |
2052 | |
2053 | args = self.split_arg(line) |
2054 | @@ -1758,7 +1778,7 @@ |
2055 | text += '\n' |
2056 | pydoc.pager(text) |
2057 | |
2058 | - elif args[0] == 'interactions': |
2059 | + elif args[0] == 'interactions' and len(args)==2 and args[1].isdigit(): |
2060 | for arg in args[1:]: |
2061 | if int(arg) > len(self._curr_model['interactions']): |
2062 | raise self.InvalidCmd, 'no interaction %s in current model' % arg |
2063 | @@ -1768,6 +1788,43 @@ |
2064 | print "Interactions %s has the following property:" % arg |
2065 | print self._curr_model['interactions'][int(arg)-1] |
2066 | |
2067 | + elif args[0] == 'interactions': |
2068 | + request_part = args[1:] |
2069 | + text = '' |
2070 | + for i, inter in enumerate(self._curr_model['interactions']): |
2071 | + present_part = [part['is_part'] and part['name'] or part['antiname'] |
2072 | + for part in inter['particles'] |
2073 | + if (part['is_part'] and part['name'] in request_part) or |
2074 | + (not part['is_part'] and part['antiname'] in request_part)] |
2075 | + if len(present_part) < len(request_part): |
2076 | + continue |
2077 | + # check that all particles are selected at least once |
2078 | + if set(present_part) != set(request_part): |
2079 | + continue |
2080 | + # check if a particle is asked more than once |
2081 | + if len(request_part) > len(set(request_part)): |
2082 | + for p in request_part: |
2083 | + print p, request_part.count(p),present_part.count(p) |
2084 | + if request_part.count(p) > present_part.count(p): |
2085 | + continue |
2086 | + |
2087 | + name = str(i+1) + ' : ' |
2088 | + for part in inter['particles']: |
2089 | + if part['is_part']: |
2090 | + name += part['name'] |
2091 | + else: |
2092 | + name += part['antiname'] |
2093 | + name += " " |
2094 | + text += "\nInteractions %s has the following property:\n" % name |
2095 | + text += str(self._curr_model['interactions'][i]) |
2096 | + |
2097 | + text += '\n' |
2098 | + print name |
2099 | + if text =='': |
2100 | + text += 'No matching for any interactions' |
2101 | + pydoc.pager(text) |
2102 | + |
2103 | + |
2104 | elif args[0] == 'parameters' and len(args) == 1: |
2105 | text = "Current model contains %i parameters\n" % \ |
2106 | sum([len(part) for part in |
2107 | @@ -1802,7 +1859,8 @@ |
2108 | print self.multiparticle_string(key) |
2109 | |
2110 | elif args[0] == 'coupling_order': |
2111 | - hierarchy = self._curr_model.get_order_hierarchy().items() |
2112 | + hierarchy = self._curr_model['order_hierarchy'].items() |
2113 | + #self._curr_model.get_order_hierarchy().items() |
2114 | def order(first, second): |
2115 | if first[1] < second[1]: |
2116 | return -1 |
2117 | @@ -1881,7 +1939,7 @@ |
2118 | pydoc.pager(outstr) |
2119 | |
2120 | elif args[0] in ["options", "variable"]: |
2121 | - super(MadGraphCmd, self).do_display(line) |
2122 | + super(MadGraphCmd, self).do_display(line, output) |
2123 | |
2124 | |
2125 | def multiparticle_string(self, key): |
2126 | @@ -2701,7 +2759,7 @@ |
2127 | |
2128 | |
2129 | |
2130 | - def set_configuration(self, config_path=None): |
2131 | + def set_configuration(self, config_path=None, test=False): |
2132 | """ assign all configuration variable from file |
2133 | ./input/mg5_configuration.txt. assign to default if not define """ |
2134 | |
2135 | @@ -2709,7 +2767,8 @@ |
2136 | 'web_browser':None, |
2137 | 'eps_viewer':None, |
2138 | 'text_editor':None, |
2139 | - 'fortran_compiler':None} |
2140 | + 'fortran_compiler':None, |
2141 | + 'automatic_html_opening':True} |
2142 | |
2143 | if not config_path: |
2144 | try: |
2145 | @@ -2737,6 +2796,9 @@ |
2146 | if value.lower() == "none": |
2147 | self.configuration[name] = None |
2148 | |
2149 | + if test: |
2150 | + return self.configuration |
2151 | + |
2152 | # Treat each expected input |
2153 | # 1: Pythia8_path |
2154 | # try relative path |
2155 | @@ -2744,11 +2806,11 @@ |
2156 | if key == 'pythia8_path': |
2157 | pythia8_dir = pjoin(MG5DIR, self.configuration['pythia8_path']) |
2158 | if not os.path.isfile(pjoin(pythia8_dir, 'include', 'Pythia.h')): |
2159 | - if os.path.isfile(pjoin(self.configuration['pythia8_path'], 'include', 'Pythia.h')): |
2160 | - pythia8_dir = self.configuration['pythia8_path'] |
2161 | + if not os.path.isfile(pjoin(self.configuration['pythia8_path'], 'include', 'Pythia.h')): |
2162 | + self.configuration['pythia8_path'] = None |
2163 | else: |
2164 | - pythia8_dir = None |
2165 | - self.pythia8_path = pythia8_dir |
2166 | + continue |
2167 | + |
2168 | elif key.endswith('path'): |
2169 | pass |
2170 | elif key in ['cluster_type', 'automatic_html_opening']: |
2171 | @@ -2804,12 +2866,25 @@ |
2172 | if hasattr(self, 'do_shell'): |
2173 | ME = madevent_interface.MadEventCmdShell(me_dir=args[1]) |
2174 | else: |
2175 | - ME = madevent_interface.MadEventCmdShell(me_dir=args[1]) |
2176 | - stop = self.define_child_cmd_interface(ME) |
2177 | + ME = madevent_interface.MadEventCmd(me_dir=args[1]) |
2178 | + # transfer interactive configuration |
2179 | + config_line = [l for l in self.history if l.strip().startswith('set')] |
2180 | + for line in config_line: |
2181 | + ME.exec_cmd(line) |
2182 | + stop = self.define_child_cmd_interface(ME) |
2183 | return stop |
2184 | |
2185 | #check if this is a cross-section |
2186 | - if len(self._generate_info.split('>')[0].strip().split())>1: |
2187 | + if not self._generate_info: |
2188 | + # This relaunch an old run -> need to check if this is a |
2189 | + # cross-section or a width |
2190 | + info = open(pjoin(args[1],'SubProcesses','procdef_mg5.dat')).read() |
2191 | + generate_info = info.split('# Begin PROCESS',1)[1].split('\n')[1] |
2192 | + generate_info = generate_info.split('#')[0] |
2193 | + else: |
2194 | + generate_info = self._generate_info |
2195 | + |
2196 | + if len(generate_info.split('>')[0].strip().split())>1: |
2197 | ext_program = launch_ext.MELauncher(args[1], self.timeout, self, |
2198 | pythia=self.configuration['pythia-pgs_path'], |
2199 | delphes=self.configuration['delphes_path'], |
2200 | @@ -2823,8 +2898,9 @@ |
2201 | delphes=self.configuration['delphes_path'], |
2202 | shell = hasattr(self, 'do_shell'), |
2203 | **options) |
2204 | + |
2205 | elif args[0] == 'pythia8': |
2206 | - ext_program = launch_ext.Pythia8Launcher(self, args[1], self.timeout, |
2207 | + ext_program = launch_ext.Pythia8Launcher( args[1], self.timeout, self, |
2208 | **options) |
2209 | else: |
2210 | os.chdir(start_cwd) #ensure to go to the initial path |
2211 | @@ -2944,7 +3020,10 @@ |
2212 | for q in self._options[args[0]]])) |
2213 | |
2214 | elif args[0] == 'group_subprocesses': |
2215 | - self._options[args[0]] = eval(args[1]) |
2216 | + if args[1] != 'Auto': |
2217 | + self._options[args[0]] = eval(args[1]) |
2218 | + else: |
2219 | + self._options[args[0]] = 'Auto' |
2220 | logger.info('Set group_subprocesses to %s' % \ |
2221 | str(self._options[args[0]])) |
2222 | logger.info('Note that you need to regenerate all processes') |
2223 | @@ -3015,8 +3094,14 @@ |
2224 | if answer != 'y': |
2225 | raise self.InvalidCmd('Stopped by user request') |
2226 | |
2227 | - group_subprocesses = self._export_format == 'madevent' and \ |
2228 | - self._options['group_subprocesses'] |
2229 | + #check if we need to group processes |
2230 | + group_subprocesses = False |
2231 | + if self._export_format == 'madevent' and \ |
2232 | + self._options['group_subprocesses']: |
2233 | + if self._options['group_subprocesses'] is True: |
2234 | + group_subprocesses = True |
2235 | + elif self._curr_amps[0].get_ninitial() == 2: |
2236 | + group_subprocesses = True |
2237 | |
2238 | if self._curr_amps.has_any_loop_process() and \ |
2239 | self._export_format not in ['standalone','matrix']: |
2240 | @@ -3079,10 +3164,20 @@ |
2241 | self._curr_amps.sort(lambda a1, a2: a2.get_number_of_diagrams() - \ |
2242 | a1.get_number_of_diagrams()) |
2243 | |
2244 | + # Check if we need to group the SubProcesses or not |
2245 | + group = True |
2246 | + if self._options['group_subprocesses'] is False: |
2247 | + group = False |
2248 | + elif self._options['group_subprocesses'] == 'Auto' and \ |
2249 | + self._curr_amps[0].get_ninitial() == 1: |
2250 | + group = False |
2251 | + |
2252 | + |
2253 | + |
2254 | cpu_time1 = time.time() |
2255 | ndiags = 0 |
2256 | if not self._curr_matrix_elements.get_matrix_elements(): |
2257 | - if self._options['group_subprocesses']: |
2258 | + if group: |
2259 | cpu_time1 = time.time() |
2260 | dc_amps = [amp for amp in self._curr_amps if isinstance(amp, \ |
2261 | diagram_generation.DecayChainAmplitude)] |
2262 | |
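The two hunks above replace the old boolean `group_subprocesses` test with a three-valued option (`True`, `False`, `'Auto'`), where `'Auto'` falls back to the number of initial-state particles. A minimal sketch of that decision logic (hypothetical helper, not code from the diff):

```python
def should_group(option, ninitial):
    """Sketch of the grouping decision implied by the hunks above:
    group subprocesses unless the option is False, or the option is
    'Auto' and the process is a 1 -> N decay (ninitial == 1)."""
    if option is False:
        return False
    if option == 'Auto' and ninitial == 1:
        return False
    return True
```

The effect is that decay-width computations (single initial particle) are no longer grouped by default, while 2 -> N scattering processes keep the grouped output.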
2263 | === added file 'madgraph/interface/coloring_logging.py' |
2264 | --- madgraph/interface/coloring_logging.py 1970-01-01 00:00:00 +0000 |
2265 | +++ madgraph/interface/coloring_logging.py 2012-02-16 18:25:20 +0000 |
2266 | @@ -0,0 +1,52 @@ |
2267 | +import logging |
2268 | + |
2269 | +BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE = range(8) |
2270 | + |
2271 | +COLORS = { |
2272 | + 'WARNING' : RED, |
2273 | + 'INFO' : BLACK, |
2274 | + 'DEBUG' : BLUE, |
2275 | + 'CRITICAL' : RED, |
2276 | + 'ERROR' : RED, |
2277 | + 'RED' : RED, |
2278 | + 'GREEN' : GREEN, |
2279 | + 'YELLOW' : YELLOW, |
2280 | + 'BLUE' : BLUE, |
2281 | + 'MAGENTA' : MAGENTA, |
2282 | + 'CYAN' : CYAN, |
2283 | + 'WHITE' : WHITE, |
2284 | +} |
2285 | + |
2286 | +RESET_SEQ = "\033[0m" |
2287 | +COLOR_SEQ = "\033[1;%dm" |
2288 | +BOLD_SEQ = "\033[1m" |
2289 | + |
2290 | +class ColorFormatter(logging.Formatter): |
2291 | + |
2292 | + def __init__(self, *args, **kwargs): |
2293 | + # can't do super(...) here because Formatter is an old school class |
2294 | + logging.Formatter.__init__(self, *args, **kwargs) |
2295 | + |
2296 | + def format(self, record): |
2297 | + levelname = record.levelname |
2298 | + message = logging.Formatter.format(self, record) + RESET_SEQ |
2299 | + for k,v in COLORS.items(): |
2300 | + message = message.replace("$" + k, COLOR_SEQ % (v+30))\ |
2301 | + .replace("$BG" + k, COLOR_SEQ % (v+40))\ |
2302 | + .replace("$BG-" + k, COLOR_SEQ % (v+40)) |
2303 | + |
2304 | + |
2305 | + if levelname == 'INFO': |
2306 | + message = message.replace("$RESET", '')\ |
2307 | + .replace("$BOLD", '')\ |
2308 | + .replace("$COLOR", '') |
2309 | + return message |
2310 | + else: |
2311 | + color = COLOR_SEQ % (30 + COLORS[levelname]) |
2312 | + message = message.replace("$RESET", RESET_SEQ)\ |
2313 | + .replace("$BOLD", BOLD_SEQ)\ |
2314 | + .replace("$COLOR", color) |
2315 | + |
2316 | + return message |
2317 | + |
2318 | +logging.ColorFormatter = ColorFormatter |
2319 | \ No newline at end of file |
2320 | |
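The new `coloring_logging.py` file above works by post-processing the formatted message: `$COLOR`-style placeholders are substituted with ANSI escape sequences after `logging.Formatter.format` runs. A stripped-down, self-contained illustration of the same technique (the class and placeholder set here are simplified stand-ins, not the module's actual code):

```python
import logging

RESET_SEQ = "\033[0m"
COLOR_SEQ = "\033[1;%dm"  # %d is an ANSI color code, e.g. 31 = red

class DemoColorFormatter(logging.Formatter):
    """Minimal stand-in for the ColorFormatter added above: format the
    record normally, then replace $RED/$RESET placeholders with ANSI
    escape codes and force a reset at the end of the line."""
    def format(self, record):
        message = logging.Formatter.format(self, record) + RESET_SEQ
        return message.replace("$RED", COLOR_SEQ % 31).replace("$RESET", RESET_SEQ)

# Wire the formatter into a handler, as a logging config file could do
# via the `logging.ColorFormatter` alias registered at module end.
handler = logging.StreamHandler()
handler.setFormatter(DemoColorFormatter("%(levelname)s: %(message)s"))
log = logging.getLogger("demo")
log.addHandler(handler)
log.warning("$REDsomething went wrong$RESET")
```

The `logging.ColorFormatter = ColorFormatter` assignment at the end of the real module makes the class reachable by the dotted name used in `.mg5_logging.conf`.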
2321 | === modified file 'madgraph/interface/extended_cmd.py' |
2322 | --- madgraph/interface/extended_cmd.py 2011-12-07 16:17:13 +0000 |
2323 | +++ madgraph/interface/extended_cmd.py 2012-02-16 18:25:20 +0000 |
2324 | @@ -21,6 +21,7 @@ |
2325 | import pydoc |
2326 | import signal |
2327 | import subprocess |
2328 | +import sys |
2329 | import traceback |
2330 | try: |
2331 | import readline |
2332 | @@ -72,14 +73,23 @@ |
2333 | continue |
2334 | name = name.replace(' ', '_') |
2335 | valid += 1 |
2336 | - out.append(opt[0]+'@@'+name+'@@') |
2337 | + out.append(opt[0].rstrip()+'@@'+name+'@@') |
2338 | + # Remove duplicate |
2339 | + d = {} |
2340 | + for x in opt: |
2341 | + d[x] = 1 |
2342 | + opt = list(d.keys()) |
2343 | + opt.sort() |
2344 | out += opt |
2345 | + |
2346 | + |
2347 | if valid == 1: |
2348 | out = out[1:] |
2349 | return out |
2350 | |
2351 | def print_suggestions(self, substitution, matches, longest_match_length) : |
2352 | """print auto-completions by category""" |
2353 | + longest_match_length += len(self.completion_prefix) |
2354 | try: |
2355 | if len(matches) == 1: |
2356 | self.stdout.write(matches[0]+' ') |
2357 | @@ -95,10 +105,12 @@ |
2358 | category = category.replace('_',' ') |
2359 | self.stdout.write('\n %s:\n%s\n' % (category, '=' * (len(category)+2))) |
2360 | start = 0 |
2361 | + pos = 0 |
2362 | continue |
2363 | elif pos and pos % nb_column ==0: |
2364 | self.stdout.write('\n') |
2365 | - self.stdout.write(val + ' ' * (longest_match_length +1 -len(val))) |
2366 | + self.stdout.write(self.completion_prefix + val + \ |
2367 | + ' ' * (longest_match_length +1 -len(val))) |
2368 | pos +=1 |
2369 | self.stdout.write('\n') |
2370 | else: |
2371 | @@ -107,7 +119,8 @@ |
2372 | for i,val in enumerate(matches): |
2373 | if i and i%nb_column ==0: |
2374 | self.stdout.write('\n') |
2375 | - self.stdout.write(val + ' ' * (longest_match_length +1 -len(val))) |
2376 | + self.stdout.write(self.completion_prefix + val + \ |
2377 | + ' ' * (longest_match_length +1 -len(val))) |
2378 | self.stdout.write('\n') |
2379 | |
2380 | self.stdout.write(self.prompt+readline.get_line_buffer()) |
2381 | @@ -139,6 +152,66 @@ |
2382 | except: |
2383 | cr = (25, 80) |
2384 | return int(cr[1]) |
2385 | + |
2386 | + def complete(self, text, state): |
2387 | + """Return the next possible completion for 'text'. |
2388 | + If a command has not been entered, then complete against command list. |
2389 | + Otherwise try to call complete_<command> to get list of completions. |
2390 | + """ |
2391 | + if state == 0: |
2392 | + import readline |
2393 | + origline = readline.get_line_buffer() |
2394 | + line = origline.lstrip() |
2395 | + stripped = len(origline) - len(line) |
2396 | + begidx = readline.get_begidx() - stripped |
2397 | + endidx = readline.get_endidx() - stripped |
2398 | + |
2399 | + if ';' in line: |
2400 | + begin, line = line.rsplit(';',1) |
2401 | + begidx = begidx - len(begin) - 1 |
2402 | + endidx = endidx - len(begin) - 1 |
2403 | + if line[:begidx] == ' ' * begidx: |
2404 | + begidx=0 |
2405 | + |
2406 | + if begidx>0: |
2407 | + cmd, args, foo = self.parseline(line) |
2408 | + if cmd == '': |
2409 | + compfunc = self.completedefault |
2410 | + else: |
2411 | + try: |
2412 | + compfunc = getattr(self, 'complete_' + cmd) |
2413 | + except AttributeError: |
2414 | + compfunc = self.completedefault |
2415 | + else: |
2416 | + compfunc = self.completenames |
2417 | + # correct wrong splitting with '-' |
2418 | + if line and line[begidx-1] == '-': |
2419 | + try: |
2420 | + Ntext = line.split()[-1] |
2421 | + self.completion_prefix = Ntext.rsplit('-',1)[0] +'-' |
2422 | + to_rm = len(self.completion_prefix) |
2423 | + Nbegidx = len(line.rsplit(None, 1)[0]) |
2424 | + data = compfunc(Ntext, line, Nbegidx, endidx) |
2425 | + self.completion_matches = [p[to_rm:] for p in data |
2426 | + if len(p)>to_rm] |
2427 | + except Exception, error: |
2428 | + print error |
2429 | + else: |
2430 | + self.completion_prefix = '' |
2431 | + self.completion_matches = compfunc(text, line, begidx, endidx) |
2432 | + #print self.completion_matches |
2433 | + |
2434 | + self.completion_matches = [ (l[-1] in [' ','@','=',os.path.sep] |
2435 | + and l or (l+' ')) for l in self.completion_matches if l] |
2436 | + |
2437 | + try: |
2438 | + return self.completion_matches[state] |
2439 | + except IndexError, error: |
2440 | + #if __debug__: |
2441 | + # print '\n Completion ERROR:' |
2442 | + # print error |
2443 | + # print '\n' |
2444 | + return None |
2445 | |
2446 | class Cmd(BasicCmd): |
2447 | """Extension of the cmd.Cmd command line. |
2448 | @@ -196,10 +269,15 @@ |
2449 | |
2450 | # Update the history of this suite of command, |
2451 | # except for useless commands (empty history and help calls) |
2452 | - if line != "history" and \ |
2453 | - not line.startswith('help') and \ |
2454 | - not line.startswith('#*'): |
2455 | - self.history.append(line) |
2456 | + if ';' in line: |
2457 | + lines = line.split(';') |
2458 | + else: |
2459 | + lines = [line] |
2460 | + for l in lines: |
2461 | + l = l.strip() |
2462 | + if not (l.startswith("history") or l.startswith('help') or \ |
2463 | + l.startswith('#*')): |
2464 | + self.history.append(l) |
2465 | |
2466 | # Check if we are continuing a line: |
2467 | if self.save_line: |
2468 | @@ -234,7 +312,13 @@ |
2469 | os.chdir(self.__initpos) |
2470 | # Create the debug files |
2471 | self.log = False |
2472 | - cmd.Cmd.onecmd(self, 'history %s' % self.debug_output) |
2473 | + if os.path.exists(self.debug_output): |
2474 | + os.remove(self.debug_output) |
2475 | + try: |
2476 | + cmd.Cmd.onecmd(self, 'history %s' % self.debug_output) |
2477 | + except Exception, error: |
2478 | + print error |
2479 | + |
2480 | debug_file = open(self.debug_output, 'a') |
2481 | traceback.print_exc(file=debug_file) |
2482 | # Create a nice error output |
2483 | @@ -249,6 +333,13 @@ |
2484 | str(error).replace('\n','\n\t')) |
2485 | error_text += self.error_debug % {'debug': self.debug_output} |
2486 | logger_stderr.critical(error_text) |
2487 | + |
2488 | + # Add options status to the debug file |
2489 | + try: |
2490 | + self.do_display('options', debug_file) |
2491 | + except Exception, error: |
2492 | + debug_file.write('Fail to write options with error %s' % error) |
2493 | + |
2494 | #stop the execution if on a non interactive mode |
2495 | if self.use_rawinput == False: |
2496 | return True |
2497 | @@ -292,11 +383,18 @@ |
2498 | error_text += '%s : %s' % (error.__class__.__name__, |
2499 | str(error).replace('\n','\n\t')) |
2500 | logger_stderr.error(error_text) |
2501 | + |
2502 | + # Add options status to the debug file |
2503 | + try: |
2504 | + self.do_display('options', debug_file) |
2505 | + except Exception, error: |
2506 | + debug_file.write('Fail to write options with error %s' % error) |
2507 | #stop the execution if on a non interactive mode |
2508 | if self.use_rawinput == False: |
2509 | return True |
2510 | # Remove failed command from history |
2511 | - self.history.pop() |
2512 | + if self.history: |
2513 | + self.history.pop() |
2514 | return False |
2515 | |
2516 | |
2517 | @@ -381,6 +479,17 @@ |
2518 | out.append(data) |
2519 | return out |
2520 | |
2521 | + def correct_splitting(line): |
2522 | + """if the line finish with a '-' the code splits in a weird way |
2523 | + on GNU_SPLITTING""" |
2524 | + |
2525 | + line = line.lstrip() |
2526 | + if line[-1] in [' ','\t']: |
2527 | + return '', line, len(line),len(enidx) |
2528 | + return text, line, begidx, endidx |
2529 | + |
2530 | + |
2531 | + |
2532 | |
2533 | # Write the list of command line use in this session |
2534 | def do_history(self, line): |
2535 | @@ -780,7 +889,7 @@ |
2536 | text+='\t %s \n' % option |
2537 | print text |
2538 | |
2539 | - def do_display(self, line): |
2540 | + def do_display(self, line, output=sys.stdout): |
2541 | """Advanced commands: basic display""" |
2542 | |
2543 | args = self.split_arg(line) |
2544 | @@ -791,10 +900,10 @@ |
2545 | raise self.InvalidCmd, 'display require at least one argument' |
2546 | |
2547 | if args[0] == "options": |
2548 | - outstr = "Value of current MG5 Options:\n" |
2549 | + outstr = "Value of current Options:\n" |
2550 | for key, value in self.configuration.items() + self._options.items(): |
2551 | outstr += '%25s \t:\t%s\n' %(key,value) |
2552 | - print outstr |
2553 | + output.write(outstr) |
2554 | |
2555 | elif args[0] == "variable": |
2556 | outstr = "Value of Internal Variable:\n" |
2557 | @@ -836,18 +945,6 @@ |
2558 | def list_completion(text, list, line=''): |
2559 | """Propose completions of text in list""" |
2560 | |
2561 | - rm=0 |
2562 | - if text: |
2563 | - line = line[:-len(text)] |
2564 | - |
2565 | - if line.endswith('-'): |
2566 | - if line.endswith('--'): |
2567 | - rm += 2 |
2568 | - text = '--%s' % text |
2569 | - else: |
2570 | - rm += 1 |
2571 | - text = '-%s' % text |
2572 | - |
2573 | if not text: |
2574 | completions = list |
2575 | else: |
2576 | @@ -855,13 +952,8 @@ |
2577 | for f in list |
2578 | if f.startswith(text) |
2579 | ] |
2580 | - def put_space(name, rm): |
2581 | - if name.endswith(' '): |
2582 | - return name |
2583 | - else: |
2584 | - return '%s ' % name[rm:] |
2585 | |
2586 | - return [put_space(name, rm) for name in completions] |
2587 | + return completions |
2588 | |
2589 | |
2590 | @staticmethod |
2591 | |
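A recurring theme in the `extended_cmd.py` changes above is working around GNU readline's word splitting: readline treats `-` as a word delimiter, so for a token like `--no_default` the completer must match against the full token but return only the part after the shared dash prefix. A hedged sketch of that trick as a standalone function (hypothetical helper; the real logic lives inside `BasicCmd.complete`):

```python
def complete_with_dash(word, candidates):
    """Sketch of the '-' correction in the complete() method above:
    match candidates against the full word, then strip the prefix up
    to and including the last '-' so readline can splice in only the
    remainder of the completion."""
    if '-' not in word:
        return [c for c in candidates if c.startswith(word)]
    prefix = word.rsplit('-', 1)[0] + '-'
    to_rm = len(prefix)
    full = [c for c in candidates if c.startswith(word)]
    return [c[to_rm:] for c in full if len(c) > to_rm]
```

This mirrors why the diff stores `self.completion_prefix` and re-prepends it in `print_suggestions`, so the suggestions shown to the user still display the full option name.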
2592 | === modified file 'madgraph/interface/launch_ext_program.py' |
2593 | --- madgraph/interface/launch_ext_program.py 2012-01-05 12:58:34 +0000 |
2594 | +++ madgraph/interface/launch_ext_program.py 2012-02-16 18:25:20 +0000 |
2595 | @@ -40,14 +40,14 @@ |
2596 | |
2597 | force = False |
2598 | |
2599 | - def __init__(self, running_dir, card_dir='', timeout=None, |
2600 | + def __init__(self, cmd, running_dir, card_dir='', timeout=None, |
2601 | **options): |
2602 | """ initialize an object """ |
2603 | |
2604 | self.running_dir = running_dir |
2605 | self.card_dir = os.path.join(self.running_dir, card_dir) |
2606 | self.timeout = timeout |
2607 | - |
2608 | + self.cmd_int = cmd |
2609 | #include/overwrite options |
2610 | for key,value in options.items(): |
2611 | setattr(self, key, value) |
2612 | @@ -91,7 +91,7 @@ |
2613 | """nice handling of question""" |
2614 | |
2615 | if not self.force: |
2616 | - return cmd.Cmd().ask(question, default, choices=choices, path_msg=path_msg, |
2617 | + return self.cmd_int.ask(question, default, choices=choices, path_msg=path_msg, |
2618 | timeout=self.timeout, fct_timeout=self.timeout_fct) |
2619 | else: |
2620 | return str(default) |
2621 | @@ -132,10 +132,10 @@ |
2622 | class SALauncher(ExtLauncher): |
2623 | """ A class to launch a simple Standalone test """ |
2624 | |
2625 | - def __init__(self, running_dir, timeout, **options): |
2626 | + def __init__(self, cmd_int, running_dir, timeout, **options): |
2627 | """ initialize the StandAlone Version""" |
2628 | |
2629 | - ExtLauncher.__init__(self, running_dir, './Cards', timeout, **options) |
2630 | + ExtLauncher.__init__(self, cmd_int, running_dir, './Cards', timeout, **options) |
2631 | self.cards = ['param_card.dat'] |
2632 | |
2633 | |
2634 | @@ -158,8 +158,7 @@ |
2635 | def __init__(self, running_dir, timeout, cmd_int , unit='pb', **option): |
2636 | """ initialize the StandAlone Version""" |
2637 | |
2638 | - super(MELauncher,self).__init__(running_dir, './Cards', timeout, **option) |
2639 | - #self.executable = os.path.join('.', 'bin','generate_events') |
2640 | + ExtLauncher.__init__(self, cmd_int, running_dir, './Cards', timeout, **option) |
2641 | |
2642 | assert hasattr(self, 'cluster') |
2643 | assert hasattr(self, 'multicore') |
2644 | @@ -167,7 +166,6 @@ |
2645 | assert hasattr(self, 'shell') |
2646 | |
2647 | self.unit = unit |
2648 | - self.cmd_int = cmd_int |
2649 | |
2650 | if self.cluster: |
2651 | self.cluster = 1 |
2652 | @@ -204,13 +202,30 @@ |
2653 | import madgraph.interface.madevent_interface as ME |
2654 | |
2655 | if self.shell: |
2656 | - usecmd = ME.MadEventCmdShell |
2657 | + usecmd = ME.MadEventCmdShell(me_dir=self.running_dir) |
2658 | else: |
2659 | - usecmd = ME.MadEventCmd |
2660 | + usecmd = ME.MadEventCmd(me_dir=self.running_dir) |
2661 | + |
2662 | + #Check if some configuration were overwritten by a command. If so use it |
2663 | + set_cmd = [l for l in self.cmd_int.history if l.strip().startswith('set')] |
2664 | + for line in set_cmd: |
2665 | + try: |
2666 | + usecmd.exec_cmd(line) |
2667 | + except: |
2668 | + pass |
2669 | launch = self.cmd_int.define_child_cmd_interface( |
2670 | - usecmd(me_dir=self.running_dir), interface=False) |
2671 | + usecmd, interface=False) |
2672 | #launch.me_dir = self.running_dir |
2673 | - command = 'generate_events %s' % self.name |
2674 | + if self.unit == 'pb': |
2675 | + command = 'generate_events %s' % self.name |
2676 | + else: |
2677 | + warning_text = '''\ |
2678 | +This command will create a new param_card with the computed width. |
2679 | +This param_card makes sense only if you include all processes for |
2680 | +the computation of the width.''' |
2681 | + logger.warning(warning_text) |
2682 | + |
2683 | + command = 'calculate_decay_widths %s' % self.name |
2684 | if mode == "1": |
2685 | command += " --cluster" |
2686 | elif mode == "2": |
2687 | @@ -252,11 +267,11 @@ |
2688 | class Pythia8Launcher(ExtLauncher): |
2689 | """A class to launch Pythia8 run""" |
2690 | |
2691 | - def __init__(self, running_dir, timeout, **option): |
2692 | + def __init__(self, running_dir, timeout, cmd_int, **option): |
2693 | """ initialize launching Pythia 8""" |
2694 | |
2695 | running_dir = os.path.join(running_dir, 'examples') |
2696 | - ExtLauncher.__init__(self, running_dir, '.', timeout, **option) |
2697 | + ExtLauncher.__init__(self, cmd_int, running_dir, '.', timeout, **option) |
2698 | self.cards = [] |
2699 | |
2700 | def prepare_run(self): |
2701 | |
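The `madevent_interface.py` hunks that follow repeat one idiom in several `check_*` methods: scan the argument list for a `--tag=NAME` entry, remove it, and keep the bare tag for `set_run_name`. A small sketch of that pattern as a reusable function (hypothetical helper, not part of the diff):

```python
def pop_tag(args):
    """Sketch of the repeated '--tag=' extraction in the hunks below:
    remove the first '--tag=NAME' entry from the argument list in
    place and return (tag_or_None, remaining_args)."""
    tags = [a for a in args if a.startswith('--tag=')]
    if not tags:
        return None, args
    args.remove(tags[0])
    return tags[0][6:], args  # 6 == len('--tag=')
```

Factoring it out this way would avoid the slight inconsistencies between the copies (some hunks strip the tag before counting arguments, others after), which is exactly the kind of drift visible in `check_remove` versus `check_pythia` above.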
2702 | === modified file 'madgraph/interface/madevent_interface.py' |
2703 | --- madgraph/interface/madevent_interface.py 2011-12-07 16:17:13 +0000 |
2704 | +++ madgraph/interface/madevent_interface.py 2012-02-16 18:25:20 +0000 |
2705 | @@ -56,6 +56,7 @@ |
2706 | import madgraph.various.gen_crossxhtml as gen_crossxhtml |
2707 | import madgraph.various.cluster as cluster |
2708 | import madgraph.various.sum_html as sum_html |
2709 | + import madgraph.various.banner as banner_mod |
2710 | import models.check_param_card as check_param_card |
2711 | from madgraph import InvalidCmd, MadGraph5Error |
2712 | MADEVENT = False |
2713 | @@ -64,6 +65,7 @@ |
2714 | print error |
2715 | # import from madevent directory |
2716 | import internal.extended_cmd as cmd |
2717 | + import internal.banner as banner_mod |
2718 | import internal.misc as misc |
2719 | from internal import InvalidCmd, MadGraph5Error |
2720 | import internal.files as files |
2721 | @@ -228,11 +230,16 @@ |
2722 | self.results.current and 'run_name' in self.results.current and \ |
2723 | hasattr(self, 'me_dir'): |
2724 | name = self.results.current['run_name'] |
2725 | - self.debug_output = pjoin(self.me_dir, '%s_debug.log' % name) |
2726 | + tag = self.results.current['tag'] |
2727 | + self.debug_output = pjoin(self.me_dir, '%s_%s_debug.log' % (name,tag)) |
2728 | self.results.current.debug = self.debug_output |
2729 | else: |
2730 | #Force class default |
2731 | self.debug_output = MadEventCmd.debug_output |
2732 | + if os.path.exists('ME5_debug') and not 'ME5_debug' in self.debug_output: |
2733 | + os.remove('ME5_debug') |
2734 | + if not 'ME5_debug' in self.debug_output: |
2735 | + os.system('ln -s %s ME5_debug &> /dev/null' % self.debug_output) |
2736 | |
2737 | |
2738 | def nice_user_error(self, error, line): |
2739 | @@ -332,7 +339,7 @@ |
2740 | self._survey_options.items()]) |
2741 | |
2742 | def help_refine(self): |
2743 | - logger.info("syntax: refine require_precision [max_channel] [run_name] [--run_options]") |
2744 | + logger.info("syntax: refine require_precision [max_channel] [--run_options]") |
2745 | logger.info("-- refine the LAST run to achieve a given precision.") |
2746 | logger.info(" require_precision: can be either the targeted number of events") |
2747 | logger.info(' or the required relative error') |
2748 | @@ -341,15 +348,23 @@ |
2749 | |
2750 | def help_combine_events(self): |
2751 | """ """ |
2752 | - logger.info("syntax: combine_events [run_name] [--run_options]") |
2753 | + logger.info("syntax: combine_events [--run_options]") |
2754 | logger.info("-- Combine the last run in order to write the number of events") |
2755 | logger.info(" require in the run_card.") |
2756 | self.run_options_help([]) |
2757 | |
2758 | - def help_combine_events(self): |
2759 | + def help_store_events(self): |
2760 | """ """ |
2761 | - logger.info("syntax: store_events [run_name] [--run_options]") |
2762 | + logger.info("syntax: store_events [--run_options]") |
2763 | logger.info("-- Write physically the events in the files.") |
2764 | + logger.info(" should be launch after \'combine_events\'") |
2765 | + self.run_options_help([]) |
2766 | + |
2767 | + def help_create_gridpack(self): |
2768 | + """ """ |
2769 | + logger.info("syntax: create_gridpack [--run_options]") |
2770 | + logger.info("-- create the gridpack. ") |
2771 | + logger.info(" should be launch after \'store_events\'") |
2772 | self.run_options_help([]) |
2773 | |
2774 | def help_import(self): |
2775 | @@ -369,11 +384,11 @@ |
2776 | logger.info(" -f options: answer all question by default.") |
2777 | |
2778 | def help_remove(self): |
2779 | - logger.info("syntax: remove RUN [all|parton|pythia|pgs|delphes] [-f]") |
2780 | + logger.info("syntax: remove RUN [all|parton|pythia|pgs|delphes|banner] [-f] [--tag=]") |
2781 | logger.info("-- Remove all the files linked to previous run RUN") |
2782 | logger.info(" if RUN is 'all', then all run will be cleaned.") |
2783 | logger.info(" The optional argument precise which part should be cleaned.") |
2784 | - logger.info(" By default we clean all the related files but the banner.") |
2785 | + logger.info(" By default we clean all the related files but the banners.") |
2786 | logger.info(" the optional '-f' allows to by-pass all security question") |
2787 | logger.info(" The banner can be remove only if all files are removed first.") |
2788 | |
2789 | @@ -381,18 +396,21 @@ |
2790 | logger.info("syntax: pythia [RUN] [--run_options]") |
2791 | logger.info("-- run pythia on RUN (current one by default)") |
2792 | self.run_options_help([('-f','answer all question by default'), |
2793 | + ('-tag=', 'define the tag for the pythia run'), |
2794 | ('--no_default', 'not run if pythia_card not present')]) |
2795 | |
2796 | def help_pgs(self): |
2797 | logger.info("syntax: pgs [RUN] [--run_options]") |
2798 | logger.info("-- run pgs on RUN (current one by default)") |
2799 | self.run_options_help([('-f','answer all question by default'), |
2800 | + ('-tag=', 'define the tag for the pgs run'), |
2801 | ('--no_default', 'not run if pgs_card not present')]) |
2802 | |
2803 | def help_delphes(self): |
2804 | logger.info("syntax: delphes [RUN] [--run_options]") |
2805 | logger.info("-- run delphes on RUN (current one by default)") |
2806 | self.run_options_help([('-f','answer all question by default'), |
2807 | + ('-tag=', 'define the tag for the delphes run'), |
2808 | ('--no_default', 'not run if delphes_card not present')]) |
2809 | |
2810 | #=============================================================================== |
2811 | @@ -408,30 +426,45 @@ |
2812 | self.help_banner_run() |
2813 | raise self.InvalidCmd('banner_run reauires at least one argument.') |
2814 | |
2815 | + tag = [a[6:] for a in args if a.startswith('--tag=')] |
2816 | + |
2817 | + |
2818 | if os.path.exists(args[0]): |
2819 | type ='banner' |
2820 | format = self.detect_card_type(args[0]) |
2821 | if format != 'banner': |
2822 | raise self.InvalidCmd('The file is not a valid banner.') |
2823 | - elif os.path.exists(pjoin(self.me_dir,'Events','%s_banner.txt' % args[0])): |
2824 | + elif tag: |
2825 | + args[0] = pjoin(self.me_dir,'Events', args[0], '%s_%s_banner.txt' % \ |
2826 | + (args[0], tag)) |
2827 | + if not os.path.exists(args[0]): |
2828 | + raise self.InvalidCmd('No banner associates to this name and tag.') |
2829 | + else: |
2830 | name = args[0] |
2831 | - args[0] = pjoin(self.me_dir,'Events','%s_banner.txt') |
2832 | type = 'run' |
2833 | - else: |
2834 | - raise self.InvalidCmd('No banner associates to this name.') |
2835 | - |
2836 | - run_name = [arg[7:] for arg in args if arg.startswith('--name')] |
2837 | + banners = glob.glob(pjoin(self.me_dir,'Events', args[0], '*_banner.txt')) |
2838 | + if not banners: |
2839 | + raise self.InvalidCmd('No banner associates to this name.') |
2840 | + elif len(banners) == 1: |
2841 | + args[0] = banners[0] |
2842 | + else: |
2843 | + #list the tag and propose those to the user |
2844 | + tags = [os.path.basename(p)[len(args[0])+1:-11] for p in banners] |
2845 | + tag = self.ask('which tag do you want to use?', tags[0], tags) |
2846 | + args[0] = pjoin(self.me_dir,'Events', args[0], '%s_%s_banner.txt' % \ |
2847 | + (args[0], tag)) |
2848 | + |
2849 | + run_name = [arg[7:] for arg in args if arg.startswith('--name=')] |
2850 | if run_name: |
2851 | try: |
2852 | - os.exec_cmd('remove %s all' % run_name) |
2853 | + os.exec_cmd('remove %s all banner' % run_name) |
2854 | except: |
2855 | pass |
2856 | - self.set_run_name(args[0], True) |
2857 | + self.set_run_name(args[0], tag=None, level='parton', reload_card=True) |
2858 | elif type == 'banner': |
2859 | self.set_run_name(self.find_available_run_name(self.me_dir)) |
2860 | elif type == 'run': |
2861 | - data = glob.glob(pjoin(self.me_dir, 'Events', '%s_*' % name)) |
2862 | - if len(data) > 1: |
2863 | + if os.path.exists(pjoin(self.me_dir, 'Events', name)): |
2864 | run_name = self.find_available_run_name(self.me_dir) |
2865 | logger.info('Run %s is not empty so will use run_name: %s' % \ |
2866 | (name, run_name)) |
2867 | @@ -439,7 +472,6 @@ |
2868 | else: |
2869 | self.set_run_name(name) |
2870 | |
2871 | - |
2872 | def check_history(self, args): |
2873 | """check the validity of line""" |
2874 | |
2875 | @@ -537,7 +569,7 @@ |
2876 | # No run name assigned -> assigned one automaticaly |
2877 | self.set_run_name(self.find_available_run_name(self.me_dir)) |
2878 | else: |
2879 | - self.set_run_name(args[0], True) |
2880 | + self.set_run_name(args[0], None,'parton', True) |
2881 | args.pop(0) |
2882 | |
2883 | return True |
2884 | @@ -634,19 +666,22 @@ |
2885 | def check_refine(self, args): |
2886 | """check that the argument for survey are valid""" |
2887 | |
2888 | - # if last argument is not a number -> it's the run_name |
2889 | + # if last argument is not a number -> it's the run_name (Not allow anymore) |
2890 | try: |
2891 | float(args[-1]) |
2892 | except ValueError: |
2893 | - self.set_run_name(args[-1]) |
2894 | - del args[-1] |
2895 | + self.help_refine() |
2896 | + raise self.InvalidCmd('Not valid arguments') |
2897 | except IndexError: |
2898 | self.help_refine() |
2899 | raise self.InvalidCmd('require_precision argument is require for refine cmd') |
2900 | |
2901 | |
2902 | if not self.run_name: |
2903 | - raise self.InvalidCmd('No run_name currently define. Impossible to run refine') |
2904 | + if self.results.lastrun: |
2905 | + self.set_run_name(self.results.lastrun) |
2906 | + else: |
2907 | + raise self.InvalidCmd('No run_name currently define. Impossible to run refine') |
2908 | |
2909 | if len(args) > 2: |
2910 | self.help_refine() |
2911 | @@ -663,11 +698,15 @@ |
2912 | def check_combine_events(self, arg): |
2913 | """ Check the argument for the combine events command """ |
2914 | |
2915 | - if len(arg) >1: |
2916 | + if len(arg): |
2917 | self.help_combine_events() |
2918 | raise self.InvalidCmd('Too many argument for combine_events command') |
2919 | |
2920 | - if len(arg): |
2921 | + if not self.run_name: |
2922 | + if not self.results.lastrun: |
2923 | + raise self.InvalidCmd('No run_name currently define. Impossible to run combine') |
2924 | + else: |
2925 | + self.set_run_name(self.results.lastrun) |
2926 | self.set_run_name(arg[0]) |
2927 | |
2928 | return True |
2929 | @@ -679,7 +718,7 @@ |
2930 | """ |
2931 | |
2932 | mode = None |
2933 | - laststep = [arg for arg in args if arg.startswith('--lastep=')] |
2934 | + laststep = [arg for arg in args if arg.startswith('--laststep=')] |
2935 | if laststep and len(laststep)==1: |
2936 | mode = laststep[0].split('=')[-1] |
2937 | if mode not in ['auto', 'pythia', 'pgs', 'delphes']: |
2938 | @@ -688,15 +727,27 @@ |
2939 | elif laststep: |
2940 | raise self.InvalidCmd('only one laststep argument is allowed') |
2941 | |
2942 | + tag = [a for a in args if a.startswith('--tag=')] |
2943 | + if tag: |
2944 | + args.remove(tag[0]) |
2945 | + tag = tag[0][6:] |
2946 | + |
2947 | if len(args) == 0 and not self.run_name: |
2948 | - self.help_pythia() |
2949 | - raise self.InvalidCmd('No run name currently define. Please add this information.') |
2950 | + if self.results.lastrun: |
2951 | + args.insert(0, self.results.lastrun) |
2952 | + else: |
2953 | + raise self.InvalidCmd('No run name currently define. Please add this information.') |
2954 | |
2955 | if len(args) == 1: |
2956 | if args[0] != self.run_name and\ |
2957 | not os.path.exists(pjoin(self.me_dir,'Events',args[0], 'unweighted_events.lhe.gz')): |
2958 | raise self.InvalidCmd('No events file corresponding to %s run. '% args[0]) |
2959 | - self.set_run_name(args[0]) |
2960 | + self.set_run_name(args[0], tag, 'pythia') |
2961 | + else: |
2962 | + if tag: |
2963 | + self.run_card['run_tag'] = tag |
2964 | + self.set_run_name(self.run_name, tag, 'pythia') |
2965 | + |
2966 | |
2967 | if not os.path.exists(pjoin(self.me_dir,'Events',self.run_name,'unweighted_events.lhe.gz')): |
2968 | raise self.InvalidCmd('No events file corresponding to %s run. '% self.run_name) |
2969 | @@ -724,19 +775,34 @@ |
2970 | def check_remove(self, args): |
2971 | """Check that the remove command is valid""" |
2972 | |
2973 | - |
2974 | - if len(args) == 0: |
2975 | + tmp_args = args[:] |
2976 | + |
2977 | + tag = [a[6:] for a in tmp_args if a.startswith('--tag=')] |
2978 | + if tag: |
2979 | + tag = tag[0] |
2980 | + tmp_args.remove('--tag=%s' % tag) |
2981 | + |
2982 | + try: |
2983 | + tmp_args.remove('-f') |
2984 | + except: |
2985 | + pass |
2986 | + else: |
2987 | + if args[0] == '-f': |
2988 | + args.pop(0) |
2989 | + args.append('-f') |
2990 | + |
2991 | + if len(tmp_args) == 0: |
2992 | self.help_clean() |
2993 | raise self.InvalidCmd('clean command require the name of the run to clean') |
2994 | - elif len(args) == 1: |
2995 | - args.append('all') |
2996 | + elif len(tmp_args) == 1: |
2997 | + return tmp_args[0], tag, ['all'] |
2998 | else: |
2999 | - for arg in args[1:]: |
3000 | + for arg in tmp_args[1:]: |
3001 | if arg not in self._clean_mode and arg != '-f': |
3002 | self.help_clean() |
3003 | raise self.InvalidCmd('%s is not a valid options for clean command'\ |
3004 | % arg) |
3005 | - |
3006 | + return tmp_args[0], tag, tmp_args[1:] |
3007 | |
3008 | def check_plot(self, args): |
3009 | """Check the argument for the plot command |
3010 | @@ -809,11 +875,19 @@ |
3011 | error_msg = 'No pythia-pgs path correctly set.' |
3012 | error_msg += 'Please use the set command to define the path and retry.' |
3013 | error_msg += 'You can also define it in the configuration file.' |
3014 | - raise self.InvalidCmd(error_msg) |
3015 | - |
3016 | + raise self.InvalidCmd(error_msg) |
3017 | + |
3018 | + tag = [a for a in arg if a.startswith('--tag=')] |
3019 | + if tag: |
3020 | + arg.remove(tag[0]) |
3021 | + tag = tag[0][6:] |
3022 | + |
3023 | + |
3024 | if len(arg) == 0 and not self.run_name: |
3025 | - self.help_pgs() |
3026 | - raise self.InvalidCmd('No run name currently define. Please add this information.') |
3027 | + if self.results.lastrun: |
3028 | + args.insert(0, self.results.lastrun) |
3029 | + else: |
3030 | + raise self.InvalidCmd('No run name currently define. Please add this information.') |
3031 | |
3032 | if len(arg) == 1 and self.run_name == arg[0]: |
3033 | arg.pop(0) |
3034 | @@ -825,13 +899,19 @@ |
3035 | Please specify a valid run_name''') |
3036 | |
3037 | if len(arg) == 1: |
3038 | - self.set_run_name(arg[0]) |
3039 | - if not os.path.exists(pjoin(self.me_dir,'Events',self.run_name,'%s_pythia_events.hep.gz' % self.run_card['run_tag'])): |
3040 | - raise self.InvalidCmd('No events file corresponding to %s run. '% self.run_name) |
3041 | + prev_tag = self.set_run_name(arg[0], tag, 'pgs') |
3042 | + if not os.path.exists(pjoin(self.me_dir,'Events',self.run_name,'%s_pythia_events.hep.gz' % prev_tag)): |
3043 | + raise self.InvalidCmd('No events file corresponding to %s run with tag %s. '% (self.run_name, prev_tag)) |
3044 | else: |
3045 | - input_file = pjoin(self.me_dir,'Events', self.run_name, '%s_pythia_events.hep.gz' % self.run_card['run_tag']) |
3046 | + input_file = pjoin(self.me_dir,'Events', self.run_name, '%s_pythia_events.hep.gz' % prev_tag) |
3047 | output_file = pjoin(self.me_dir, 'Events', 'pythia_events.hep') |
3048 | - os.system('gunzip -c %s > %s' % (input_file, output_file)) |
3049 | + self.launch_job('gunzip',stdout=open(output_file,'w'), |
3050 | + argument=['-c', input_file], mode=2) |
3051 | + else: |
3052 | + if tag: |
3053 | + self.run_card['run_tag'] = tag |
3054 | + self.set_run_name(self.run_name, tag, 'pgs') |
3055 | + |
3056 | |
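The `--tag=` parsing added above is duplicated verbatim in `check_pgs` and `check_delphes`. A minimal standalone sketch of the same list-scan pattern (the helper name `extract_tag` is hypothetical, not part of the branch):

```python
def extract_tag(args):
    """Pop the first '--tag=NAME' option from args and return NAME (or None).

    Mirrors the scan used in check_pgs/check_delphes: find options starting
    with '--tag=', remove the first match in place, keep the text after '='.
    """
    matches = [a for a in args if a.startswith('--tag=')]
    if not matches:
        return None
    args.remove(matches[0])
    return matches[0][len('--tag='):]

args = ['run_01', '--tag=pilot', '-f']
tag = extract_tag(args)
```

Factoring this out would also keep the two command checkers from drifting apart.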
3057 | def check_delphes(self, arg): |
3058 | """Check the argument for the delphes command |
3059 | @@ -849,10 +929,18 @@ |
3060 | error_msg += 'Please use the set command to define the path and retry.' |
3061 | error_msg += 'You can also define it in the configuration file.' |
3062 | raise self.InvalidCmd(error_msg) |
3063 | + |
3064 | + tag = [a for a in arg if a.startswith('--tag=')] |
3065 | + if tag: |
3066 | + arg.remove(tag[0]) |
3067 | + tag = tag[0][6:] |
3068 | + |
3069 | |
3070 | if len(arg) == 0 and not self.run_name: |
3071 | - self.help_delphes() |
3072 | - raise self.InvalidCmd('No run name currently define. Please add this information.') |
3073 | + if self.results.lastrun: |
3074 | + arg.insert(0, self.results.lastrun) |
3075 | + else: |
3076 | + raise self.InvalidCmd('No run name currently defined. Please add this information.') |
3077 | |
3078 | if len(arg) == 1 and self.run_name == arg[0]: |
3079 | arg.pop(0) |
3080 | @@ -864,14 +952,20 @@ |
3081 | Please specify a valid run_name''') |
3082 | |
3083 | if len(arg) == 1: |
3084 | - self.set_run_name(arg[0]) |
3085 | - if not os.path.exists(pjoin(self.me_dir,'Events','%s_pythia_events.hep.gz' % self.run_name)): |
3086 | - raise self.InvalidCmd('No events file corresponding to %s run. '% self.run_name) |
3087 | + prev_tag = self.set_run_name(arg[0], tag, 'delphes') |
3088 | + if not os.path.exists(pjoin(self.me_dir,'Events',self.run_name, '%s_pythia_events.hep.gz' % prev_tag)): |
3089 | + raise self.InvalidCmd('No events file corresponding to %s run with tag %s: %s '\ |
3090 | + % (self.run_name, prev_tag, |
3091 | + pjoin(self.me_dir,'Events',self.run_name, '%s_pythia_events.hep.gz' % prev_tag))) |
3092 | else: |
3093 | - input_file = pjoin(self.me_dir,'Events', '%s_pythia_events.hep.gz' % self.run_name) |
3094 | + input_file = pjoin(self.me_dir,'Events', self.run_name, '%s_pythia_events.hep.gz' % prev_tag) |
3095 | output_file = pjoin(self.me_dir, 'Events', 'pythia_events.hep') |
3096 | - os.system('gunzip -c %s > %s' % (input_file, output_file)) |
3097 | - |
3098 | + self.launch_job('gunzip',stdout=open(output_file,'w'), |
3099 | + argument=['-c', input_file], mode=2) |
3100 | + else: |
3101 | + if tag: |
3102 | + self.run_card['run_tag'] = tag |
3103 | + self.set_run_name(self.run_name, tag, 'delphes') |
3104 | |
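Both checkers now replace `os.system('gunzip -c %s > %s')` with `launch_job('gunzip', ...)`, which avoids the shell but still forks an external process. For reference, the same decompression can be done entirely in the standard library; a sketch, not the branch's actual helper:

```python
import gzip
import os
import shutil
import tempfile

def gunzip_to(input_file, output_file):
    """Decompress a .gz archive into output_file: the shell-free
    equivalent of 'gunzip -c input_file > output_file'."""
    with gzip.open(input_file, 'rb') as src:
        with open(output_file, 'wb') as dst:
            shutil.copyfileobj(src, dst)

# round-trip a small payload through a temporary directory
tmpdir = tempfile.mkdtemp()
src_path = os.path.join(tmpdir, 'tag_1_pythia_events.hep.gz')
dst_path = os.path.join(tmpdir, 'pythia_events.hep')
with gzip.open(src_path, 'wb') as f:
    f.write(b'<event>...</event>\n')
gunzip_to(src_path, dst_path)
```

`shutil.copyfileobj` streams in chunks, so the `.hep.gz` file is never read fully into memory.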
3105 | def check_display(self, args): |
3106 | """check the validity of line |
3107 | @@ -913,21 +1007,44 @@ |
3108 | def complete_banner_run(self, text, line, begidx, endidx): |
3109 | "Complete the banner run command" |
3110 | try: |
3111 | + |
3112 | + |
3113 | args = self.split_arg(line[0:begidx], error=False) |
3114 | - if len(args)==1: |
3115 | - comp = self.path_completion(text, |
3116 | + |
3117 | + if args[-1].endswith(os.path.sep): |
3118 | + return self.path_completion(text, |
3119 | os.path.join('.',*[a for a in args \ |
3120 | + if a.endswith(os.path.sep)])) |
3121 | + |
3122 | + |
3123 | + if len(args) > 1: |
3124 | + # only options are possible |
3125 | + tags = glob.glob(pjoin(self.me_dir, 'Events' , args[1],'%s_*_banner.txt' % args[1])) |
3126 | + tags = ['%s' % os.path.basename(t)[len(args[1])+1:-11] for t in tags] |
3127 | + |
3128 | + if args[-1] != '--tag=': |
3129 | + tags = ['--tag=%s' % t for t in tags] |
3130 | + else: |
3131 | + return self.list_completion(text, tags) |
3132 | + return self.list_completion(text, tags +['--name=','-f'], line) |
3133 | + |
3134 | + # First argument |
3135 | + possibilities = {} |
3136 | + |
3137 | + comp = self.path_completion(text, os.path.join('.',*[a for a in args \ |
3138 | if a.endswith(os.path.sep)])) |
3139 | - run_list = glob.glob(pjoin(self.me_dir, 'Events', '*_banner.txt')) |
3140 | - run_list = [n.rsplit('/',1)[1][:-11] for n in run_list] |
3141 | - comp += self.list_completion(text, run_list) |
3142 | + if os.path.sep in line: |
3143 | return comp |
3144 | - if args[-1].endswith(os.path.sep): |
3145 | - return self.path_completion(text, |
3146 | - os.path.join('.',*[a for a in args \ |
3147 | - if a.endswith(os.path.sep)])) |
3148 | + else: |
3149 | + possibilities['Path from ./'] = comp |
3150 | + |
3151 | + run_list = glob.glob(pjoin(self.me_dir, 'Events', '*','*_banner.txt')) |
3152 | + run_list = [n.rsplit('/',2)[1] for n in run_list] |
3153 | + possibilities['RUN Name'] = self.list_completion(text, run_list) |
3154 | + |
3155 | + return self.deal_multiple_categories(possibilities) |
3156 | |
3157 | - return self.list_completion(text, ['--name=','-f'], line) |
3158 | + |
3159 | except Exception, error: |
3160 | print error |
3161 | def complete_history(self, text, line, begidx, endidx): |
3162 | @@ -1009,7 +1126,9 @@ |
3163 | |
3164 | complete_refine = complete_survey |
3165 | complete_combine_events = complete_survey |
3166 | + complete_store = complete_survey |
3167 | complete_generate_events = complete_survey |
3168 | + complete_create_gridpack = complete_survey |
3169 | |
3170 | def complete_generate_events(self, text, line, begidx, endidx): |
3171 | """ Complete the generate events""" |
3172 | @@ -1075,11 +1194,23 @@ |
3173 | """Complete the remove command """ |
3174 | |
3175 | args = self.split_arg(line[0:begidx], error=False) |
3176 | - if len(args) > 1: |
3177 | - return self.list_completion(text, self._clean_mode + ['-f']) |
3178 | + if len(args) > 1 and (text.startswith('--t')): |
3179 | + run = args[1] |
3180 | + tags = ['--tag=%s' % tag['tag'] for tag in self.results[run]] |
3181 | + return self.list_completion(text, tags) |
3182 | + elif len(args) > 1 and '--' == args[-1]: |
3183 | + run = args[1] |
3184 | + tags = ['tag=%s' % tag['tag'] for tag in self.results[run]] |
3185 | + return self.list_completion(text, tags) |
3186 | + elif len(args) > 1 and '--tag=' == args[-1]: |
3187 | + run = args[1] |
3188 | + tags = [tag['tag'] for tag in self.results[run]] |
3189 | + return self.list_completion(text, tags) |
3190 | + elif len(args) > 1: |
3191 | + return self.list_completion(text, self._clean_mode + ['-f','--tag=']) |
3192 | else: |
3193 | - data = glob.glob(pjoin(self.me_dir, 'Events', '*_banner.txt')) |
3194 | - data = [n.rsplit('/',1)[1].rsplit('_',1)[0] for n in data] |
3195 | + data = glob.glob(pjoin(self.me_dir, 'Events','*','*_banner.txt')) |
3196 | + data = [n.rsplit('/',2)[1] for n in data] |
3197 | return self.list_completion(text, ['all'] + data) |
3198 | |
3199 | |
3200 | @@ -1089,35 +1220,37 @@ |
3201 | args = self.split_arg(line[0:begidx], error=False) |
3202 | if len(args) == 1: |
3203 | #return valid run_name |
3204 | - data = glob.glob(pjoin(self.me_dir, 'Events', '*_unweighted_events.lhe.gz')) |
3205 | - data = [n.rsplit('/',1)[1][:-25] for n in data] |
3206 | + data = glob.glob(pjoin(self.me_dir, 'Events', '*','unweighted_events.lhe.gz')) |
3207 | + data = [n.rsplit('/',2)[1] for n in data] |
3208 | tmp1 = self.list_completion(text, data) |
3209 | if not self.run_name: |
3210 | return tmp1 |
3211 | else: |
3212 | - tmp2 = self.list_completion(text, self._run_options + ['-f', '--no_default'], line) |
3213 | + tmp2 = self.list_completion(text, self._run_options + ['-f', |
3214 | + '--no_default', '--tag='], line) |
3215 | return tmp1 + tmp2 |
3216 | else: |
3217 | - return self.list_completion(text, self._run_options + ['-f', '--no_default'], line) |
3218 | + return self.list_completion(text, self._run_options + ['-f', |
3219 | + '--no_default','--tag='], line) |
3220 | |
3221 | def complete_pgs(self,text, line, begidx, endidx): |
3222 | "Complete the pgs command" |
3223 | - |
3224 | args = self.split_arg(line[0:begidx], error=False) |
3225 | - |
3226 | if len(args) == 1: |
3227 | #return valid run_name |
3228 | - data = glob.glob(pjoin(self.me_dir, 'Events', '*_pythia_events.hep.gz')) |
3229 | - data = [n.rsplit('/',1)[1][:-21] for n in data] |
3230 | + data = glob.glob(pjoin(self.me_dir, 'Events', '*', '*_pythia_events.hep.gz')) |
3231 | + data = [n.rsplit('/',2)[1] for n in data] |
3232 | tmp1 = self.list_completion(text, data) |
3233 | if not self.run_name: |
3234 | return tmp1 |
3235 | else: |
3236 | - tmp2 = self.list_completion(text, self._run_options + ['-f', '--no_default'], line) |
3237 | + tmp2 = self.list_completion(text, self._run_options + ['-f', |
3238 | + '--tag=' ,'--no_default'], line) |
3239 | return tmp1 + tmp2 |
3240 | else: |
3241 | - return self.list_completion(text, self._run_options + ['-f', '--no_default'], line) |
3242 | - |
3243 | + return self.list_completion(text, self._run_options + ['-f', |
3244 | + '--tag=','--no_default'], line) |
3245 | + |
3246 | complete_delphes = complete_pgs |
3247 | |
3248 | |
3249 | @@ -1201,6 +1334,8 @@ |
3250 | |
3251 | self.to_store = [] |
3252 | self.run_name = None |
3253 | + self.run_tag = None |
3254 | + self.banner = None |
3255 | |
3256 | # Get number of initial states |
3257 | nexternal = open(pjoin(me_dir,'Source','nexternal.inc')).read() |
3258 | @@ -1224,6 +1359,7 @@ |
3259 | process = self.process # define in find_model_name |
3260 | self.results = gen_crossxhtml.AllResults(model, process, self.me_dir) |
3261 | self.results.def_web_mode(self.web) |
3262 | + |
3263 | |
3264 | self.configured = 0 # time for reading the card |
3265 | self._options = {} # for compatibility with extended_cmd |
3266 | @@ -1357,10 +1493,8 @@ |
3267 | elif key.startswith('cluster'): |
3268 | pass |
3269 | elif key == 'automatic_html_opening': |
3270 | - if self.configuration[key] == 'False': |
3271 | - self.open_crossx = False |
3272 | - else: |
3273 | - self.open_crossx = True |
3274 | + if self.configuration[key] in ['False', 'True']: |
3275 | + self.configuration[key] = eval(self.configuration[key]) |
3276 | elif key not in ['text_editor','eps_viewer','web_browser']: |
3277 | # Default: try to set parameter |
3278 | try: |
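The hunk above calls `eval()` on the configuration string when it is literally `'True'` or `'False'`. The membership test makes this safe, but an explicit lookup table expresses the same conversion without `eval` at all; a sketch (the function name is hypothetical):

```python
def parse_config_value(value):
    """Turn the literal strings 'True'/'False'/'None' into Python values
    with an explicit table instead of eval(); any other string (e.g. a
    path) is returned unchanged."""
    literals = {'True': True, 'False': False, 'None': None}
    return literals.get(value, value)
```

This also covers the `'None'` case handled elsewhere in `do_set` with the same `eval` idiom.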
3279 | @@ -1385,18 +1519,21 @@ |
3280 | force = True |
3281 | else: |
3282 | force = False |
3283 | - |
3284 | - # Split the banner |
3285 | - if MADEVENT: |
3286 | - import internal.splitbanner as splitbanner |
3287 | - else: |
3288 | - import madgraph.various.splitbanner as splitbanner |
3289 | - |
3290 | - splitbanner.split_banner(args[0], self.me_dir) |
3291 | + |
3292 | + banner_mod.split_banner(args[0], self.me_dir, proc_card=False) |
3293 | + |
3294 | + # Remove previous cards |
3295 | + for name in ['delphes_trigger.dat', 'delphes_card.dat', |
3296 | + 'pgs_card.dat', 'pythia_card.dat']: |
3297 | + try: |
3298 | + os.remove(pjoin(self.me_dir, 'Cards', name)) |
3299 | + except: |
3300 | + pass |
3301 | + |
3302 | |
3303 | # Check if we want to modify the run |
3304 | if not force: |
3305 | - ans = self.ask('Do you want to modify this run?', 'n', ['y','n'], |
3306 | + ans = self.ask('Do you want to modify the Cards?', 'n', ['y','n'], |
3307 | timeout=self.timeout) |
3308 | if ans == 'n': |
3309 | force = True |
3310 | @@ -1407,7 +1544,7 @@ |
3311 | |
3312 | |
3313 | ############################################################################ |
3314 | - def do_display(self, line): |
3315 | + def do_display(self, line, output=sys.stdout): |
3316 | """Display current internal status""" |
3317 | |
3318 | args = self.split_arg(line) |
3319 | @@ -1415,19 +1552,27 @@ |
3320 | self.check_display(args) |
3321 | |
3322 | if args[0] == 'run_name': |
3323 | - |
3324 | - |
3325 | #return valid run_name |
3326 | - data = glob.glob(pjoin(self.me_dir, 'Events', '*_banner.txt')) |
3327 | - data = [n.rsplit('/',1)[1][:-11] for n in data] |
3328 | + data = glob.glob(pjoin(self.me_dir, 'Events', '*','*_banner.txt')) |
3329 | + data = [n.rsplit('/',2)[1:] for n in data] |
3330 | + |
3331 | if data: |
3332 | + out = {} |
3333 | + for name, tag in data: |
3334 | + tag = tag[len(name)+1:-11] |
3335 | + if name in out: |
3336 | + out[name].append(tag) |
3337 | + else: |
3338 | + out[name] = [tag] |
3339 | print 'the runs available are:' |
3340 | - for run_name in data: |
3341 | - print ' %s' % run_name |
3342 | + for run_name, tags in out.items(): |
3343 | + print ' run: %s' % run_name |
3344 | + print ' tags: ', |
3345 | + print ', '.join(tags) |
3346 | else: |
3347 | print 'No run detected.' |
3348 | else: |
3349 | - super(MadEventCmd, self).do_display(line) |
3350 | + super(MadEventCmd, self).do_display(line, output) |
3351 | |
3352 | |
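The new `do_display run_name` loop parses banner files laid out as `Events/<run>/<run>_<tag>_banner.txt` into a run-to-tags mapping. The same parsing on plain path strings, as a self-contained sketch (`runs_and_tags` is a hypothetical name):

```python
def runs_and_tags(banner_paths):
    """Group banner files laid out as Events/<run>/<run>_<tag>_banner.txt
    into {run_name: [tag, ...]}, mirroring the loop in do_display."""
    out = {}
    for path in banner_paths:
        run, base = path.rsplit('/', 2)[1:]
        # drop the '<run>_' prefix and the 11-character '_banner.txt' suffix
        tag = base[len(run) + 1:-11]
        out.setdefault(run, []).append(tag)
    return out

paths = ['Events/run_01/run_01_tag_1_banner.txt',
         'Events/run_01/run_01_tag_2_banner.txt',
         'Events/run_02/run_02_pilot_banner.txt']
grouped = runs_and_tags(paths)
```

The `-11` slice is the same magic number used throughout the diff for stripping `_banner.txt`.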
3353 | ############################################################################ |
3354 | @@ -1487,6 +1632,13 @@ |
3355 | elif args[0] in self.configuration: |
3356 | if args[1] in ['None','True','False']: |
3357 | self.configuration[args[0]] = eval(args[1]) |
3358 | + elif args[0].endswith('path'): |
3359 | + if os.path.exists(args[1]): |
3360 | + self.configuration[args[0]] = args[1] |
3361 | + elif os.path.exists(pjoin(self.me_dir, args[1])): |
3362 | + self.configuration[args[0]] = pjoin(self.me_dir, args[1]) |
3363 | + else: |
3364 | + raise self.InvalidCmd('Not a valid path: keeping previous value: \'%s\'' % self.configuration[args[0]]) |
3365 | else: |
3366 | self.configuration[args[0]] = args[1] |
3367 | |
3368 | @@ -1522,9 +1674,9 @@ |
3369 | self.ask_run_configuration(mode, force) |
3370 | if not args: |
3371 | # No run name assigned -> assign one automatically |
3372 | - self.set_run_name(self.find_available_run_name(self.me_dir)) |
3373 | + self.set_run_name(self.find_available_run_name(self.me_dir), None, 'parton') |
3374 | else: |
3375 | - self.set_run_name(args[0], True) |
3376 | + self.set_run_name(args[0], None, 'parton', True) |
3377 | args.pop(0) |
3378 | |
3379 | if self.run_card['gridpack'] in self.true: |
3380 | @@ -1555,7 +1707,7 @@ |
3381 | 1) A massive s-channel particle has a width set to zero. |
3382 | 2) The pdf are zero for at least one of the initial state particles. |
3383 | 3) The cuts are too strong. |
3384 | - Please check/correct your param_card and your run_card.''') |
3385 | + Please check/correct your param_card and/or your run_card.''') |
3386 | |
3387 | nb_event = self.run_card['nevents'] |
3388 | self.exec_cmd('refine %s' % nb_event, postcmd=False) |
3389 | @@ -1580,15 +1732,16 @@ |
3390 | # No run name assigned -> assign one automatically |
3391 | self.set_run_name(self.find_available_run_name(self.me_dir)) |
3392 | else: |
3393 | - self.set_run_name(args[0], True) |
3394 | + self.set_run_name(args[0], reload_card=True) |
3395 | args.pop(0) |
3396 | - |
3397 | + |
3398 | # Running gridpack warmup |
3399 | opts=[('accuracy', accuracy), # default 0.01 |
3400 | ('points', 1000), |
3401 | ('iterations',9)] |
3402 | |
3403 | logger.info('Calculating decay widths with run name %s' % self.run_name) |
3404 | + |
3405 | self.exec_cmd('survey %s %s' % \ |
3406 | (self.run_name, |
3407 | " ".join(['--' + opt + '=' + str(val) for (opt,val) \ |
3408 | @@ -1596,16 +1749,115 @@ |
3409 | postcmd=False) |
3410 | self.exec_cmd('combine_events', postcmd=False) |
3411 | self.exec_cmd('store_events', postcmd=False) |
3412 | - # Combine decay widths into new file |
3413 | - proc = subprocess.Popen(['python', |
3414 | - pjoin(self.dirbin, 'collect_decay_widths.py'), |
3415 | - self.run_name], |
3416 | - cwd=self.me_dir, |
3417 | - stdout=subprocess.PIPE) |
3418 | |
3419 | - (stdout, stderr) = proc.communicate() |
3420 | + self.collect_decay_widths() |
3421 | + self.update_status('calculate_decay_widths done', |
3422 | + level='parton', makehtml=False) |
3423 | + |
3424 | |
3425 | ############################################################################ |
3426 | + def collect_decay_widths(self): |
3427 | + """ Collect the decay widths and calculate BRs for all particles, and put |
3428 | + in param_card form. |
3429 | + """ |
3430 | + |
3431 | + particle_dict = {} # store the results |
3432 | + run_name = self.run_name |
3433 | + |
3434 | + # Looping over the Subprocesses |
3435 | + for P_path in SubProcesses.get_subP(self.me_dir): |
3436 | + ids = SubProcesses.get_subP_ids(P_path) |
3437 | + # due to grouping we need to compute the ratio factor for the |
3438 | + # ungroup resutls (that we need here). Note that initial particles |
3439 | + # grouping are not at the same stage as final particle grouping |
3440 | + nb_output = len(ids) / (len(set([p[0] for p in ids]))) |
3441 | + results = open(pjoin(P_path, run_name + '_results.dat')).read().split('\n')[0] |
3442 | + result = float(results.strip().split(' ')[0]) |
3443 | + for particles in ids: |
3444 | + try: |
3445 | + particle_dict[particles[0]].append([particles[1:], result/nb_output]) |
3446 | + except KeyError: |
3447 | + particle_dict[particles[0]] = [[particles[1:], result/nb_output]] |
3448 | + |
3449 | + # Open the param_card.dat and insert the calculated decays and BRs |
3450 | + param_card_file = open(pjoin(self.me_dir, 'Cards', 'param_card.dat')) |
3451 | + param_card = param_card_file.read().split('\n') |
3452 | + param_card_file.close() |
3453 | + |
3454 | + decay_lines = [] |
3455 | + line_number = 0 |
3456 | + # Read and remove all decays from the param_card |
3457 | + while line_number < len(param_card): |
3458 | + line = param_card[line_number] |
3459 | + if line.lower().startswith('decay'): |
3460 | + # Read decay if particle in particle_dict |
3461 | + # DECAY 6 1.455100e+00 |
3462 | + line = param_card.pop(line_number) |
3463 | + line = line.split() |
3464 | + particle = 0 |
3465 | + if int(line[1]) not in particle_dict: |
3466 | + try: # If formatting is wrong, don't want this particle |
3467 | + particle = int(line[1]) |
3468 | + width = float(line[2]) |
3469 | + except: |
3470 | + particle = 0 |
3471 | + # Read BRs for this decay |
3472 | + line = param_card[line_number] |
3473 | + while line.startswith('#') or line.startswith(' '): |
3474 | + line = param_card.pop(line_number) |
3475 | + if not particle or line.startswith('#'): |
3476 | + line=param_card[line_number] |
3477 | + continue |
3478 | + # 6.668201e-01 3 5 2 -1 |
3479 | + line = line.split() |
3480 | + try: # Remove BR if formatting is wrong |
3481 | + partial_width = float(line[0])*width |
3482 | + decay_products = [int(p) for p in line[2:2+int(line[1])]] |
3483 | + except: |
3484 | + line=param_card[line_number] |
3485 | + continue |
3486 | + try: |
3487 | + particle_dict[particle].append([decay_products, partial_width]) |
3488 | + except KeyError: |
3489 | + particle_dict[particle] = [[decay_products, partial_width]] |
3490 | + line=param_card[line_number] |
3491 | + if particle and particle not in particle_dict: |
3492 | + # No decays given, only total width |
3493 | + particle_dict[particle] = [[[], width]] |
3494 | + else: # Not a decay line |
3495 | + line_number += 1 |
3496 | + # Clean out possible remaining comments at the end of the card |
3497 | + while not param_card[-1] or param_card[-1].startswith('#'): |
3498 | + param_card.pop(-1) |
3499 | + |
3500 | + # Append calculated and read decays to the param_card |
3501 | + param_card.append("#\n#*************************") |
3502 | + param_card.append("# Decay widths *") |
3503 | + param_card.append("#*************************") |
3504 | + for key in sorted(particle_dict.keys()): |
3505 | + width = sum([r for p,r in particle_dict[key]]) |
3506 | + param_card.append("#\n# PDG Width") |
3507 | + param_card.append("DECAY %i %e" % (key, width)) |
3508 | + if not width: |
3509 | + continue |
3510 | + if particle_dict[key][0][0]: |
3511 | + param_card.append("# BR NDA ID1 ID2 ...") |
3512 | + brs = [[val[1]/width, val[0]] for val in particle_dict[key] if val[1]] |
3513 | + for val in sorted(brs, reverse=True): |
3514 | + param_card.append(" %e %i %s" % (val[0], len(val[1]), |
3515 | + " ".join([str(v) for v in val[1]]))) |
3516 | + output_name = pjoin(self.me_dir, 'Events', run_name, "param_card.dat") |
3517 | + decay_table = open(output_name, 'w') |
3518 | + decay_table.write("\n".join(param_card) + "\n") |
3519 | + logger.info("Results written to %s" % output_name) |
3520 | + |
3521 | + |
3522 | + |
3523 | + |
3524 | + |
3525 | + |
3526 | + |
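The tail of `collect_decay_widths` turns accumulated partial widths into a param_card `DECAY` block: total width is the sum of partial widths, each BR is the partial width divided by the total, and channels are listed by decreasing BR. The arithmetic, isolated as a sketch with hypothetical numbers (not the method's exact formatting):

```python
def decay_block(pdg, channels):
    """Format one param_card DECAY block the way collect_decay_widths does:
    total width = sum of partial widths, BR = partial width / total width,
    channels listed by decreasing BR. channels is [(products, partial), ...]."""
    width = sum(partial for _, partial in channels)
    lines = ['DECAY %i %e' % (pdg, width)]
    if not width:
        return lines  # zero-width particle: no BR lines, as in the original
    for br, products in sorted(((partial / width, products)
                                for products, partial in channels if partial),
                               reverse=True):
        lines.append('   %e %i %s' % (br, len(products),
                                      ' '.join(str(p) for p in products)))
    return lines

# hypothetical partial widths for PDG id 6, not real top-quark numbers
block = decay_block(6, [([5, 24], 1.0), ([3, 24], 0.5)])
```

The `if partial` filter mirrors the original's skipping of zero-width channels before sorting.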
3527 | + ############################################################################ |
3528 | def do_multi_run(self, line): |
3529 | |
3530 | args = self.split_arg(line) |
3531 | @@ -1788,10 +2040,6 @@ |
3532 | cwd=pjoin(self.me_dir,'SubProcesses'), |
3533 | stdout=devnull) |
3534 | |
3535 | - #subprocess.call([pjoin(self.dirbin, 'sumall')], |
3536 | - # cwd=pjoin(self.me_dir,'SubProcesses'), |
3537 | - # stdout=devnull) |
3538 | - |
3539 | self.update_status('finish refine', 'parton', makehtml=False) |
3540 | |
3541 | ############################################################################ |
3542 | @@ -1823,15 +2071,26 @@ |
3543 | nb_event = pat.search(output).groups()[0] |
3544 | self.results.add_detail('nb_event', nb_event) |
3545 | |
3546 | + # Define The Banner |
3547 | + tag = self.run_card['run_tag'] |
3548 | + self.banner.load_basic(self.me_dir) |
3549 | + if not os.path.exists(pjoin(self.me_dir, 'Events', self.run_name)): |
3550 | + os.mkdir(pjoin(self.me_dir, 'Events', self.run_name)) |
3551 | + self.banner.write(pjoin(self.me_dir, 'Events', self.run_name, |
3552 | + '%s_%s_banner.txt' % (self.run_name, tag))) |
3553 | + self.banner.add(pjoin(self.me_dir, 'Cards', 'param_card.dat')) |
3554 | + self.banner.add(pjoin(self.me_dir, 'Cards', 'run_card.dat')) |
3555 | + |
3556 | + |
3557 | subprocess.call(['%s/put_banner' % self.dirbin, 'events.lhe'], |
3558 | cwd=pjoin(self.me_dir, 'Events')) |
3559 | subprocess.call(['%s/put_banner'% self.dirbin, 'unweighted_events.lhe'], |
3560 | cwd=pjoin(self.me_dir, 'Events')) |
3561 | |
3562 | - if os.path.exists(pjoin(self.me_dir, 'Events', 'unweighted_events.lhe')): |
3563 | - subprocess.call(['%s/extract_banner-pl' % self.dirbin, |
3564 | - 'unweighted_events.lhe', 'banner.txt'], |
3565 | - cwd=pjoin(self.me_dir, 'Events')) |
3566 | + #if os.path.exists(pjoin(self.me_dir, 'Events', 'unweighted_events.lhe')): |
3567 | + # subprocess.call(['%s/extract_banner-pl' % self.dirbin, |
3568 | + # 'unweighted_events.lhe', 'banner.txt'], |
3569 | + # cwd=pjoin(self.me_dir, 'Events')) |
3570 | |
3571 | eradir = self.configuration['exrootanalysis_path'] |
3572 | madir = self.configuration['madanalysis_path'] |
3573 | @@ -1860,24 +2119,8 @@ |
3574 | if not os.path.exists(pjoin(self.me_dir, 'HTML', run)): |
3575 | os.mkdir(pjoin(self.me_dir, 'HTML', run)) |
3576 | |
3577 | - # 1) TREAT FILE PRESENT IN SubProcesses |
3578 | - # results.html |
3579 | - #input = pjoin(self.me_dir, 'SubProcesses', 'results.html') |
3580 | - #output = pjoin(self.me_dir, 'HTML', run, 'results.html') |
3581 | - #files.mv(input, output) |
3582 | - # results.dat. The usefull information is store in the pickle -> remove |
3583 | - #os.remove(pjoin(self.me_dir, 'SubProcesses','results.dat')) |
3584 | - |
3585 | # 2) Treat the files present in the P directory |
3586 | - for Pdir in open(pjoin(self.me_dir, 'SubProcesses','subproc.mg')): |
3587 | - Pdir = Pdir.strip() |
3588 | - P_path = pjoin(self.me_dir, 'SubProcesses', Pdir) |
3589 | - # results.dat. |
3590 | - #os.remove(pjoin(P_path, 'results.dat')) |
3591 | - # results.html |
3592 | - #input = pjoin(P_path, 'results.html') |
3593 | - #output = pjoin(self.me_dir, 'HTML', run, '%s_results.html' % Pdir) |
3594 | - #files.mv(input, output) |
3595 | + for P_path in SubProcesses.get_subP(self.me_dir): |
3596 | G_dir = [G for G in os.listdir(P_path) if G.startswith('G') and |
3597 | os.path.isdir(pjoin(P_path,G))] |
3598 | for G in G_dir: |
3599 | @@ -1885,16 +2128,17 @@ |
3600 | # Remove events file (if present) |
3601 | if os.path.exists(pjoin(G_path, 'events.lhe')): |
3602 | os.remove(pjoin(G_path, 'events.lhe')) |
3603 | - # Remove results.dat |
3604 | - if os.path.exists(pjoin(G_path, 'results.dat')): |
3605 | - os.remove(pjoin(G_path, 'results.dat')) |
3606 | + # Remove results.dat (but not for gridpack) |
3607 | + if self.run_card['gridpack'] not in self.true: |
3608 | + if os.path.exists(pjoin(G_path, 'results.dat')): |
3609 | + os.remove(pjoin(G_path, 'results.dat')) |
3610 | # Store log |
3611 | if os.path.exists(pjoin(G_path, 'log.txt')): |
3612 | input = pjoin(G_path, 'log.txt') |
3613 | output = pjoin(G_path, '%s_log.txt' % run) |
3614 | files.mv(input, output) |
3615 | # Grid |
3616 | - for name in ['ftn25', 'ftn99']: |
3617 | + for name in ['ftn25', 'ftn99','ftn26']: |
3618 | if os.path.exists(pjoin(G_path, name)): |
3619 | if os.path.exists(pjoin(G_path, '%s_%s.gz'%(run,name))): |
3620 | os.remove(pjoin(G_path, '%s_%s.gz'%(run,name))) |
3621 | @@ -1915,18 +2159,13 @@ |
3622 | for name in ['events.lhe', 'unweighted_events.lhe']: |
3623 | if os.path.exists(pjoin(E_path, name)): |
3624 | if os.path.exists(pjoin(O_path, '%s.gz' % name)): |
3625 | - os.remove(pjoin(O_path, '%s.lhe.gz' % name)) |
3626 | + os.remove(pjoin(O_path, '%s.gz' % name)) |
3627 | input = pjoin(E_path, name) |
3628 | output = pjoin(O_path, name) |
3629 | files.mv(input, output) |
3630 | subprocess.call(['gzip', output], stdout=devnull, stderr=devnull, |
3631 | cwd=O_path) |
3632 | - # The banner -- This is tag dependent |
3633 | - if os.path.exists(pjoin(E_path, 'banner.txt')): |
3634 | - input = pjoin(E_path, 'banner.txt') |
3635 | - output = pjoin(O_path, '%s_%s_banner.txt' % (run, tag)) |
3636 | - files.mv(input, output) |
3637 | - |
3638 | + |
3639 | self.update_status('End Parton', level='parton', makehtml=False) |
3640 | |
3641 | ############################################################################ |
3642 | @@ -1934,23 +2173,23 @@ |
3643 | """Advanced commands: Create gridpack from present run""" |
3644 | |
3645 | self.update_status('Creating gridpack', level='parton') |
3646 | + args = self.split_arg(line) |
3647 | + self.check_combine_events(args) |
3648 | os.system("sed -i.bak \"s/\s*.false.*=.*GridRun/ .true. = GridRun/g\" %s/Cards/grid_card.dat" \ |
3649 | % self.me_dir) |
3650 | subprocess.call(['./bin/internal/restore_data', self.run_name], |
3651 | cwd=self.me_dir) |
3652 | - subprocess.call(['./bin/internal/store4grid', 'default'], |
3653 | - cwd=self.me_dir) |
3654 | + self.exec_cmd('store_events') |
3655 | subprocess.call(['./bin/internal/clean'], cwd=self.me_dir) |
3656 | misc.compile(['gridpack.tar.gz'], cwd=self.me_dir) |
3657 | files.mv(pjoin(self.me_dir, 'gridpack.tar.gz'), |
3658 | pjoin(self.me_dir, '%s_gridpack.tar.gz' % self.run_name)) |
3659 | self.update_status('gridpack created', level='gridpack') |
3660 | - |
3661 | + |
3662 | ############################################################################ |
3663 | def do_pythia(self, line): |
3664 | """launch pythia""" |
3665 | |
3666 | - |
3667 | # Check argument's validity |
3668 | args = self.split_arg(line) |
3669 | if '-f' in args: |
3670 | @@ -1966,24 +2205,27 @@ |
3671 | args.remove('--no_default') |
3672 | else: |
3673 | no_default = False |
3674 | + |
3675 | self.check_pythia(args) |
3676 | # the args are modify and the last arg is always the mode |
3677 | |
3678 | self.ask_pythia_run_configuration(args[-1], force) |
3679 | |
3680 | + # Update the banner with the pythia card |
3681 | + if not self.banner: |
3682 | + self.banner = banner_mod.recover_banner(self.results, 'pythia') |
3683 | |
3684 | # initialize / remove lhapdf mode |
3685 | self.configure_directory() |
3686 | - |
3687 | - tag = self.run_card['run_tag'] |
3688 | - if not force: |
3689 | - if os.path.exists(pjoin(self.me_dir, 'Events', self.run_name, '%s_pythia.log' % tag)): |
3690 | - question = 'Previous run of pythia detected. Do you want to remove it?' |
3691 | - ans = self.ask(question, 'y', choices=['y','n'], timeout = 20) |
3692 | - if ans == 'n': |
3693 | - return |
3694 | + |
3695 | + #if not force: |
3696 | + # if os.path.exists(pjoin(self.me_dir, 'Events', self.run_name, '%s_pythia.log' % tag)): |
3697 | + # question = 'Previous run of pythia detected. Do you want to remove it?' |
3698 | + # ans = self.ask(question, 'y', choices=['y','n'], timeout = 20) |
3699 | + # if ans == 'n': |
3700 | + # return |
3701 | |
3702 | - self.exec_cmd('remove %s pythia -f' % self.run_name) |
3703 | + #self.exec_cmd('remove %s pythia -f' % self.run_name) |
3704 | |
3705 | pythia_src = pjoin(self.configuration['pythia-pgs_path'],'src') |
3706 | |
3707 | @@ -1994,6 +2236,7 @@ |
3708 | pass |
3709 | |
3710 | ## LAUNCHING PYTHIA |
3711 | + tag = self.run_tag |
3712 | |
3713 | if self.cluster_mode == 1: |
3714 | pythia_log = pjoin(self.me_dir, 'Events', self.run_name , '%s_pythia.log' % tag) |
3715 | @@ -2009,7 +2252,7 @@ |
3716 | cwd=pjoin(self.me_dir,'Events')) |
3717 | |
3718 | if not os.path.exists(pjoin(self.me_dir,'Events','pythia.done')): |
3719 | - logger.warning('Fail to produce pythia output') |
3720 | + logger.warning('Failed to produce pythia output. More info in \n %s' % pythia_log.name) |
3721 | return |
3722 | else: |
3723 | os.remove(pjoin(self.me_dir,'Events','pythia.done')) |
3724 | @@ -2029,13 +2272,10 @@ |
3725 | madir = self.configuration['madanalysis_path'] |
3726 | td = self.configuration['td_path'] |
3727 | |
3728 | - # Update the banner with the pythia card |
3729 | - banner_path = pjoin(self.me_dir,'Events',self.run_name, '%s_%s_banner.txt' % (self.run_name, tag)) |
3730 | - banner = open(banner_path, 'a') |
3731 | - banner.writelines('\n<MGPythiaCard>\n') |
3732 | - banner.writelines(open(pjoin(self.me_dir, 'Cards','pythia_card.dat')).read()) |
3733 | - banner.writelines('\n</MGPythiaCard>\n') |
3734 | - banner.close() |
3735 | + |
3736 | + self.banner.add(pjoin(self.me_dir, 'Cards','pythia_card.dat')) |
3737 | + banner_path = pjoin(self.me_dir, 'Events', self.run_name, '%s_%s_banner.txt' % (self.run_name, tag)) |
3738 | + self.banner.write(banner_path) |
3739 | |
3740 | # Creating LHE file |
3741 | if misc.is_executable(pjoin(pydir, 'hep2lhe')): |
3742 | @@ -2069,20 +2309,20 @@ |
3743 | |
3744 | if int(self.run_card['ickkw']): |
3745 | self.update_status('Create matching plots for Pythia', level='pythia') |
3746 | - subprocess.call([self.dirbin+'/create_matching_plots.sh', self.run_name], |
3747 | + subprocess.call([self.dirbin+'/create_matching_plots.sh', self.run_name, tag], |
3748 | stdout = os.open(os.devnull, os.O_RDWR), |
3749 | cwd=pjoin(self.me_dir,'Events')) |
3750 | #Clean output |
3751 | subprocess.call(['gzip','-f','events.tree'], |
3752 | cwd=pjoin(self.me_dir,'Events')) |
3753 | files.mv(pjoin(self.me_dir,'Events','events.tree.gz'), |
3754 | - pjoin(self.me_dir,'Events',self.run_name +'_pythia_events.tree.gz')) |
3755 | + pjoin(self.me_dir,'Events',self.run_name, tag + '_pythia_events.tree.gz')) |
3756 | subprocess.call(['gzip','-f','beforeveto.tree'], |
3757 | cwd=pjoin(self.me_dir,'Events')) |
3758 | files.mv(pjoin(self.me_dir,'Events','beforeveto.tree.gz'), |
3759 | - pjoin(self.me_dir,'Events',self.run_name +'_pythia_beforeveto.tree.gz')) |
3760 | + pjoin(self.me_dir,'Events',self.run_name, tag+'_pythia_beforeveto.tree.gz')) |
3761 | files.mv(pjoin(self.me_dir,'Events','xsecs.tree'), |
3762 | - pjoin(self.me_dir,'Events',self.run_name +'_pythia_xsecs.tree')) |
3763 | + pjoin(self.me_dir,'Events',self.run_name, tag+'_pythia_xsecs.tree')) |
3764 | |
3765 | |
3766 | # Plot for pythia |
3767 | @@ -2097,30 +2337,41 @@ |
3768 | |
3769 | self.update_status('finish', level='pythia', makehtml=False) |
3770 | self.exec_cmd('pgs --no_default', postcmd=False, printcmd=False) |
3771 | - self.exec_cmd('delphes --no_default', postcmd=False, printcmd=False) |
3772 | + if self.configuration['delphes_path']: |
3773 | + self.exec_cmd('delphes --no_default', postcmd=False, printcmd=False) |
3774 | + |
3775 | + def get_available_tag(self): |
3776 | + """create automatically a tag""" |
3777 | |
3778 | + used_tags = [r['tag'] for r in self.results[self.run_name]] |
3779 | + i=0 |
3780 | + while 1: |
3781 | + i+=1 |
3782 | + if 'tag_%s' %i not in used_tags: |
3783 | + return 'tag_%s' % i |
3784 | + |
3785 | + |
3786 | + |
3787 | ################################################################################ |
3788 | def do_remove(self, line): |
3789 | """Remove one/all run or only part of it""" |
3790 | |
3791 | args = self.split_arg(line) |
3792 | - self.check_remove(args) |
3793 | - if args[0] == 'all': |
3794 | + run, tag, mode = self.check_remove(args) |
3795 | + if run == 'all': |
3796 | # Check first if they are not a run with a name run. |
3797 | - if os.path.exists(pjoin(self.me_dir, 'Events', 'all_banner.txt')): |
3798 | + if os.path.exists(pjoin(self.me_dir, 'Events', 'all')): |
3799 | logger.warning('A run with name all exists. So we will not supress all processes.') |
3800 | else: |
3801 | - for match in glob.glob(pjoin(self.me_dir, 'Events', '*_banner.txt')): |
3802 | - run = os.path.basename(match).rsplit('_',1)[0] |
3803 | + for match in glob.glob(pjoin(self.me_dir, 'Events','*','*_banner.txt')): |
3804 | + run = match.rsplit(os.path.sep,2)[1] |
3805 | try: |
3806 | self.exec_cmd('clean %s %s' % (run, ' '.join(args[1:]) ) ) |
3807 | except self.InvalidCmd, error: |
3808 | logger.info(error) |
3809 | pass # run already clear |
3810 | return |
3811 | - |
3812 | - run = args[0] |
3813 | - |
3814 | + |
3815 | # Check that run exists |
3816 | if not os.path.exists(pjoin(self.me_dir, 'Events', run)): |
3817 | raise self.InvalidCmd('No run \'%s\' detected' % run) |
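For orientation, the tag-allocation loop added in `get_available_tag` above can be sketched as a standalone function; the `used_tags` list argument is a stand-in for the `self.results[self.run_name]` records it actually reads:

```python
# Minimal sketch of the new get_available_tag logic: scan the tags already
# recorded for the current run and return the first unused 'tag_<i>'.
def get_available_tag(used_tags):
    i = 0
    while True:
        i += 1
        tag = 'tag_%s' % i
        if tag not in used_tags:
            return tag

print(get_available_tag(['tag_1', 'tag_3']))  # -> tag_2
```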
3818 | @@ -2131,20 +2382,26 @@ |
3819 | except: |
3820 | pass # Just ensure that html never makes crash this function |
3821 | |
3822 | - # Found the file to suppress: |
3823 | - to_suppress = glob.glob(pjoin(self.me_dir, 'Events', '%s_*' % run)) |
3824 | + |
3825 | + # Found the file to suppress |
3826 | + |
3827 | + to_suppress = glob.glob(pjoin(self.me_dir, 'Events', run, '*')) |
3828 | + to_suppress += glob.glob(pjoin(self.me_dir, 'HTML', run, '*')) |
3829 | + # prevent the banner from being removed |
3830 | to_suppress = [os.path.basename(f) for f in to_suppress if 'banner' not in f] |
3831 | + if tag: |
3832 | + to_suppress = [f for f in to_suppress if tag in f] |
3833 | |
3834 | - if 'all' in args: |
3835 | + if 'all' in mode: |
3836 | pass # suppress everything |
3837 | else: |
3838 | - if 'pythia' not in args: |
3839 | + if 'pythia' not in mode: |
3840 | to_suppress = [f for f in to_suppress if 'pythia' not in f] |
3841 | - if 'pgs' not in args: |
3842 | + if 'pgs' not in mode: |
3843 | to_suppress = [f for f in to_suppress if 'pgs' not in f] |
3844 | - if 'delphes' not in args: |
3845 | + if 'delphes' not in mode: |
3846 | to_suppress = [f for f in to_suppress if 'delphes' not in f] |
3847 | - if 'parton' not in args: |
3848 | + if 'parton' not in mode: |
3849 | to_suppress = [f for f in to_suppress if 'delphes' in f |
3850 | or 'pgs' in f |
3851 | or 'pythia' in f] |
3852 | @@ -2158,13 +2415,21 @@ |
3853 | |
3854 | if ans == 'y': |
3855 | for file2rm in to_suppress: |
3856 | - if os.path.isdir(pjoin(self.me_dir, 'Events', file2rm)): |
3857 | - shutil.rmtree(pjoin(self.me_dir, 'Events', file2rm)) |
3858 | + if os.path.exists(pjoin(self.me_dir, 'Events', run, file2rm)): |
3859 | + try: |
3860 | + os.remove(pjoin(self.me_dir, 'Events', run, file2rm)) |
3861 | + except: |
3862 | + shutil.rmtree(pjoin(self.me_dir, 'Events', run, file2rm)) |
3863 | else: |
3864 | - os.remove(pjoin(self.me_dir, 'Events', file2rm)) |
3865 | + try: |
3866 | + os.remove(pjoin(self.me_dir, 'HTML', run, file2rm)) |
3867 | + except: |
3868 | + shutil.rmtree(pjoin(self.me_dir, 'HTML', run, file2rm)) |
3869 | + |
3870 | + |
3871 | |
3872 | # Remove file in SubProcess directory |
3873 | - if 'all' in args or 'channel' in args: |
3874 | + if 'all' in mode or 'channel' in mode: |
3875 | to_suppress = glob.glob(pjoin(self.me_dir, 'SubProcesses', '%s*' % run)) |
3876 | to_suppress += glob.glob(pjoin(self.me_dir, 'SubProcesses', '*','%s*' % run)) |
3877 | to_suppress += glob.glob(pjoin(self.me_dir, 'SubProcesses', '*','*','%s*' % run)) |
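The `glob`/`rsplit` changes above follow the new per-run directory layout, where banners live under `Events/<run>/` instead of being prefixed by the run name. A minimal sketch of how the run name is now recovered from a matched path (the example path is hypothetical):

```python
import os

# Under the new layout a banner match looks like
# Events/<run>/<run>_<tag>_banner.txt, so the run name is the
# second-to-last path component rather than a prefix of the file name.
match = os.path.join('Events', 'run_01', 'run_01_tag_1_banner.txt')
run = match.rsplit(os.path.sep, 2)[1]
print(run)  # -> run_01
```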
3878 | @@ -2180,23 +2445,35 @@ |
3879 | for file2rm in to_suppress: |
3880 | os.remove(file2rm) |
3881 | |
3882 | - if 'banner' in args[1:]: |
3883 | - to_suppress = glob.glob(pjoin(self.me_dir, 'Events', '%s_*' % run)) |
3884 | - if len(to_suppress) > 1: |
3885 | - raise MadGraph5Error, '''Some output still exists for this run. |
3886 | + if 'banner' in mode: |
3887 | + to_suppress = glob.glob(pjoin(self.me_dir, 'Events', run, '*')) |
3888 | + if tag: |
3889 | + # remove banner |
3890 | + try: |
3891 | + os.remove(pjoin(self.me_dir, 'Events',run,'%s_%s_banner.txt' % (run,tag))) |
3892 | + except: |
3893 | + logger.warning('failed to remove the banner') |
3894 | + # remove the run from the html output |
3895 | + if run in self.results: |
3896 | + self.results.delete_run(run, tag) |
3897 | + return |
3898 | + elif any(['banner' not in os.path.basename(p) for p in to_suppress]): |
3899 | + if to_suppress: |
3900 | + raise MadGraph5Error, '''Some output still exists for this run. |
3901 | Please remove those output first. Do for example: |
3902 | remove %s all banner |
3903 | ''' % run |
3904 | - elif len(to_suppress): |
3905 | - # remove banner |
3906 | - os.remove(to_suppress[0]) |
3907 | - # remove the run from the html output |
3908 | + else: |
3909 | + shutil.rmtree(pjoin(self.me_dir, 'Events',run)) |
3910 | if run in self.results: |
3911 | self.results.delete_run(run) |
3912 | return |
3913 | + else: |
3914 | + logger.info('''The banner was not removed. To remove it, run: |
3915 | + remove %s all banner %s''' % (run, tag and '--tag=%s ' % tag or '')) |
3916 | |
3917 | # update database. |
3918 | - self.results.clean(args[1:]) |
3919 | + self.results.clean(mode, run, tag) |
3920 | self.update_status('', level='all') |
3921 | |
3922 | |
3923 | @@ -2227,7 +2504,8 @@ |
3924 | logger.info('No valid files for partonic plot') |
3925 | |
3926 | if any([arg in ['all','pythia'] for arg in args]): |
3927 | - filename = pjoin(self.me_dir, 'Events','%s_pythia_events.lhe' % self.run_name) |
3928 | + filename = pjoin(self.me_dir, 'Events' ,self.run_name, |
3929 | + '%s_pythia_events.lhe' % self.run_tag) |
3930 | if os.path.exists(filename+'.gz'): |
3931 | os.system('gunzip -f %s' % (filename+'.gz') ) |
3932 | if os.path.exists(filename): |
3933 | @@ -2239,19 +2517,19 @@ |
3934 | logger.info('No valid files for pythia plot') |
3935 | |
3936 | if any([arg in ['all','pgs'] for arg in args]): |
3937 | - filename = pjoin(self.me_dir, 'Events','%s_pgs_events.lhco' % self.run_name) |
3938 | + filename = pjoin(self.me_dir, 'Events', self.run_name, |
3939 | + '%s_pgs_events.lhco' % self.run_tag) |
3940 | if os.path.exists(filename+'.gz'): |
3941 | os.system('gunzip -f %s' % (filename+'.gz') ) |
3942 | if os.path.exists(filename): |
3943 | - #shutil.move(filename, pjoin(self.me_dir, 'Events','pgs_events.lhco')) |
3944 | self.create_plot('PGS') |
3945 | - #shutil.move(pjoin(self.me_dir, 'Events','pgs_events.lhco'), filename) |
3946 | os.system('gzip -f %s' % filename) |
3947 | else: |
3948 | logger.info('No valid files for pgs plot') |
3949 | |
3950 | if any([arg in ['all','delphes'] for arg in args]): |
3951 | - filename = pjoin(self.me_dir, 'Events','%s_delphes_events.lhco' % self.run_name) |
3952 | + filename = pjoin(self.me_dir, 'Events', self.run_name, |
3953 | + '%s_delphes_events.lhco' % self.run_tag) |
3954 | if os.path.exists(filename+'.gz'): |
3955 | os.system('gunzip -f %s' % (filename+'.gz') ) |
3956 | if os.path.exists(filename): |
3957 | @@ -2279,14 +2557,14 @@ |
3958 | |
3959 | tag = self.run_card['run_tag'] |
3960 | if 'pythia' in self.to_store: |
3961 | - self.update_status('Storing Pythia files of Previous run', level='pythia') |
3962 | + self.update_status('Storing Pythia files of Previous run', level='pythia', error=True) |
3963 | os.system('mv -f %(path)s/pythia_events.hep %(path)s/%(name)s/%(tag)s_pythia_events.hep' % |
3964 | {'name': self.run_name, 'path' : pjoin(self.me_dir,'Events'), |
3965 | 'tag':tag}) |
3966 | os.system('gzip -f %s/%s_pythia_events.hep' % ( |
3967 | pjoin(self.me_dir,'Events',self.run_name), tag)) |
3968 | self.to_store.remove('pythia') |
3969 | - self.update_status('Done', level='pythia') |
3970 | + self.update_status('Done', level='pythia',makehtml=False,error=True) |
3971 | |
3972 | self.to_store = [] |
3973 | |
3974 | @@ -2296,7 +2574,6 @@ |
3975 | """launch pgs""" |
3976 | |
3977 | args = self.split_arg(line) |
3978 | - self.edit_one_card('pgs_card.dat', args) |
3979 | # Check argument's validity |
3980 | if '-f' in args: |
3981 | force = True |
3982 | @@ -2308,9 +2585,31 @@ |
3983 | args.remove('--no_default') |
3984 | else: |
3985 | no_default = False |
3986 | - self.update_status('prepare PGS run', level=None) |
3987 | + |
3988 | + # Check all arguments |
3989 | + # This might launch a gunzip in another thread. After the question, |
3990 | + # that thread needs to be waited on for completion. (This allows the |
3991 | + # question to be asked right away while the computer keeps working.) |
3992 | self.check_pgs(args) |
3993 | + |
3994 | + # Check that the pgs_card exists. If not, copy the default. |
3995 | + if not os.path.exists(pjoin(self.me_dir, 'Cards', 'pgs_card.dat')): |
3996 | + if no_default: |
3997 | + logger.info('No pgs_card detected, so PGS will not be run') |
3998 | + return |
3999 | + |
4000 | + files.cp(pjoin(self.me_dir, 'Cards', 'pgs_card_default.dat'), |
4001 | + pjoin(self.me_dir, 'Cards', 'pgs_card.dat')) |
4002 | + logger.info('No pgs card found. Taking the default one.') |
4003 | |
4004 | + if not (no_default or force): |
4005 | + self.ask_edit_cards(['pgs'], args) |
4006 | + |
4007 | + self.update_status('prepare PGS run', level=None) |
4008 | + # Wait until the gunzip of the files has finished (if any) |
4009 | + if hasattr(self, 'control_thread') and self.control_thread[0]: |
4010 | + self.monitor(mode=2,html=False) |
4011 | + |
4012 | pgsdir = pjoin(self.configuration['pythia-pgs_path'], 'src') |
4013 | eradir = self.configuration['exrootanalysis_path'] |
4014 | madir = self.configuration['madanalysis_path'] |
4015 | @@ -2321,45 +2620,29 @@ |
4016 | logger.info('No PGS executable -- running make') |
4017 | misc.compile(cwd=pgsdir) |
4018 | |
4019 | - # Check that the pgs_card exists. If not copy the default and |
4020 | - # ask for edition of the card. |
4021 | - if not os.path.exists(pjoin(self.me_dir, 'Cards', 'pgs_card.dat')): |
4022 | - if no_default: |
4023 | - logger.info('No pgs_card detected, so not run pgs') |
4024 | - return |
4025 | - |
4026 | - files.cp(pjoin(self.me_dir, 'Cards', 'pgs_card_default.dat'), |
4027 | - pjoin(self.me_dir, 'Cards', 'pgs_card.dat')) |
4028 | - logger.info('No pgs card found. Take the default one.') |
4029 | - |
4030 | - if not force: |
4031 | - answer = self.ask('Do you want to edit this card?','n', ['y','n'], |
4032 | - timeout=20) |
4033 | - else: |
4034 | - answer = 'n' |
4035 | - |
4036 | - if answer == 'y': |
4037 | - misc.open_file(pjoin(self.me_dir, 'Cards', 'pgs_card.dat')) |
4038 | - |
4039 | - |
4040 | + |
4041 | + |
4042 | + |
4043 | + |
4044 | self.update_status('Running PGS', level='pgs') |
4045 | - tag = self.run_card['run_tag'] |
4046 | + |
4047 | + tag = self.run_tag |
4048 | + # Update the banner with the pgs card |
4049 | + self.banner.add(pjoin(self.me_dir, 'Cards','pgs_card.dat')) |
4050 | + banner_path = pjoin(self.me_dir, 'Events', self.run_name, '%s_%s_banner.txt' % (self.run_name, self.run_tag)) |
4051 | + self.banner.write(banner_path) |
4052 | + |
4053 | + ######################################################################## |
4054 | # now pass the event to a detector simulator and reconstruct objects |
4055 | - |
4056 | - # Update the banner with the pgs card |
4057 | - banner_path = pjoin(self.me_dir,'Events',self.run_name, '%s_%s_banner.txt' % (self.run_name,tag)) |
4058 | - print banner_path |
4059 | |
4060 | - banner = open(banner_path, 'a') |
4061 | - banner.writelines('\n<MGPGSCard>') |
4062 | - banner.writelines(open(pjoin(self.me_dir, 'Cards','pgs_card.dat')).read()) |
4063 | - banner.writelines('</MGPGSCard>\n') |
4064 | - banner.close() |
4065 | + ######################################################################## |
4066 | |
4067 | # Prepare the output file with the banner |
4068 | ff = open(pjoin(self.me_dir, 'Events', 'pgs_events.lhco'), 'w') |
4069 | text = open(banner_path).read() |
4070 | text = '#%s' % text.replace('\n','\n#') |
4071 | + dico = self.results[self.run_name].get_current_info() |
4072 | + text +='\n## Integrated weight (pb) : %.4g' % dico['cross'] |
4073 | + text +='\n## Number of Event : %s\n' % dico['nb_event'] |
4074 | ff.writelines(text) |
4075 | ff.close() |
4076 | |
4077 | @@ -2413,8 +2696,6 @@ |
4078 | """ run delphes and make associate root file/plot """ |
4079 | |
4080 | args = self.split_arg(line) |
4081 | - self.edit_one_card('delphes_card.dat', args) |
4082 | - self.edit_one_card('trigger_card.dat', args) |
4083 | # Check argument's validity |
4084 | if '-f' in args: |
4085 | force = True |
4086 | @@ -2426,10 +2707,9 @@ |
4087 | args.remove('--no_default') |
4088 | else: |
4089 | no_default = False |
4090 | - |
4091 | + self.check_delphes(args) |
4092 | self.update_status('prepare delphes run', level=None) |
4093 | - self.check_pgs(args) |
4094 | - |
4095 | + |
4096 | # Check that the delphes_card exists. If not copy the default and |
4097 | # ask for edition of the card. |
4098 | if not os.path.exists(pjoin(self.me_dir, 'Cards', 'delphes_card.dat')): |
4099 | @@ -2440,41 +2720,35 @@ |
4100 | files.cp(pjoin(self.me_dir, 'Cards', 'delphes_card_default.dat'), |
4101 | pjoin(self.me_dir, 'Cards', 'delphes_card.dat')) |
4102 | logger.info('No delphes card found. Take the default one.') |
4103 | - if not force: |
4104 | - answer = self.ask('Do you want to edit this card?','n', ['y','n'], |
4105 | - timeout=20) |
4106 | - else: |
4107 | - answer = 'n' |
4108 | - |
4109 | - if answer == 'y': |
4110 | - misc.open_file(pjoin(self.me_dir, 'Cards', 'delphes_card.dat')) |
4111 | + |
4112 | + if not (no_default or force): |
4113 | + self.ask_edit_cards(['delphes', 'trigger'], args) |
4114 | + |
4115 | + self.update_status('Running Delphes', level=None) |
4116 | + # Wait until the gunzip of the files has finished (if any) |
4117 | + if hasattr(self, 'control_thread') and self.control_thread[0]: |
4118 | + self.monitor(mode=2,html=False) |
4119 | + |
4120 | + |
4121 | |
4122 | delphes_dir = self.configuration['delphes_path'] |
4123 | - tag = self.run_card['run_tag'] |
4124 | - self.update_status('Running Delphes', level='delphes') |
4125 | - |
4126 | - # Update the banner with the pgs card |
4127 | - banner = open(pjoin(self.me_dir,'Events', self.run_name, |
4128 | - '%s_banner.txt' % tag), 'a') |
4129 | - banner.writelines('<MGDelphesCard>\n') |
4130 | - banner.writelines(open(pjoin(self.me_dir, 'Cards','delphes_card.dat')).read()) |
4131 | - banner.writelines('</MGDelphesCard>\n') |
4132 | - banner.writelines('<MGDelphesTrigger>\n') |
4133 | - banner.writelines(open(pjoin(self.me_dir, 'Cards','delphes_trigger.dat')).read()) |
4134 | - banner.writelines('</MGDelphesTrigger>') |
4135 | - banner.close() |
4136 | - |
4137 | - |
4138 | + tag = self.run_tag |
4139 | + self.banner.add(pjoin(self.me_dir, 'Cards','delphes_card.dat')) |
4140 | + self.banner.add(pjoin(self.me_dir, 'Cards','delphes_trigger.dat')) |
4141 | + self.banner.write(pjoin(self.me_dir, 'Events', self.run_name, '%s_%s_banner.txt' % (self.run_name, tag))) |
4142 | + |
4143 | + cross = self.results[self.run_name].get_current_info()['cross'] |
4144 | + |
4145 | if self.cluster_mode == 1: |
4146 | delphes_log = pjoin(self.me_dir, 'Events', self.run_name, "%s_delphes.log" % tag) |
4147 | self.cluster.launch_and_wait('../bin/internal/run_delphes', |
4148 | - argument= [delphes_dir, self.run_name, tag], |
4149 | + argument= [delphes_dir, self.run_name, tag, str(cross)], |
4150 | stdout=delphes_log, stderr=subprocess.STDOUT, |
4151 | cwd=pjoin(self.me_dir,'Events')) |
4152 | else: |
4153 | delphes_log = open(pjoin(self.me_dir, 'Events', self.run_name, "%s_delphes.log" % tag),'w') |
4154 | subprocess.call(['../bin/internal/run_delphes', delphes_dir, |
4155 | - self.run_name, tag], |
4156 | + self.run_name, tag, str(cross)], |
4157 | stdout= delphes_log, stderr=subprocess.STDOUT, |
4158 | cwd=pjoin(self.me_dir,'Events')) |
4159 | |
4160 | @@ -2503,17 +2777,21 @@ |
4161 | self.update_status('delphes done', level='delphes', makehtml=False) |
4162 | |
4163 | def launch_job(self,exe, cwd=None, stdout=None, argument = [], remaining=0, |
4164 | - run_type='', **opt): |
4165 | + run_type='', mode=None, **opt): |
4166 | """ """ |
4167 | argument = [str(arg) for arg in argument] |
4168 | + if mode is None: |
4169 | + mode = self.cluster_mode |
4170 | |
4171 | def launch_in_thread(exe, argument, cwd, stdout, control_thread): |
4172 | """ way to launch for multicore""" |
4173 | |
4174 | start = time.time() |
4175 | - subprocess.call(['./'+exe] + argument, cwd=cwd, stdout=stdout, |
4176 | + if (cwd and os.path.exists(pjoin(cwd, exe))) or os.path.exists(exe): |
4177 | + exe = './' + exe |
4178 | + subprocess.call([exe] + argument, cwd=cwd, stdout=stdout, |
4179 | stderr=subprocess.STDOUT, **opt) |
4180 | - logger.info('%s run in %f s' % (exe, time.time() -start)) |
4181 | + #logger.info('%s run in %f s' % (exe, time.time() -start)) |
4182 | |
4183 | # release the lock for allowing to launch the next job |
4184 | while not control_thread[1].locked(): |
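The guard added around the `'./'+exe` call above stops the code from prefixing `./` onto commands that are only reachable via `PATH`. A standalone sketch of that rule (the function name is hypothetical, extracted here for illustration):

```python
import os

# Prefix './' only when the executable actually exists in the working
# directory (or in cwd when one is given); otherwise leave the name
# untouched so it can still be resolved via PATH.
def resolve_exe(exe, cwd=None):
    if (cwd and os.path.exists(os.path.join(cwd, exe))) or os.path.exists(exe):
        return './' + exe
    return exe

print(resolve_exe('surely_not_a_local_file_xyz'))  # -> surely_not_a_local_file_xyz
```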
4185 | @@ -2528,7 +2806,7 @@ |
4186 | |
4187 | |
4188 | |
4189 | - if self.cluster_mode == 0: |
4190 | + if mode == 0: |
4191 | self.update_status((remaining, 1, |
4192 | self.total_jobs - remaining -1, run_type), level=None, force=False) |
4193 | start = time.time() |
4194 | @@ -2540,10 +2818,10 @@ |
4195 | raise MadGraph5Error, '%s didn\'t stop properly. Stop all computation' % exe |
4196 | |
4197 | |
4198 | - elif self.cluster_mode == 1: |
4199 | + elif mode == 1: |
4200 | self.cluster.submit(exe, stdout=stdout, cwd=cwd) |
4201 | |
4202 | - elif self.cluster_mode == 2: |
4203 | + elif mode == 2: |
4204 | import thread |
4205 | if not hasattr(self, 'control_thread'): |
4206 | self.control_thread = [0] # [used_thread] |
4207 | @@ -2582,22 +2860,28 @@ |
4208 | return 'v4' |
4209 | |
4210 | ############################################################################ |
4211 | - def monitor(self, run_type='monitor'): |
4212 | + def monitor(self, run_type='monitor', mode=None, html=True): |
4213 | """ monitor the progress of running job """ |
4214 | |
4215 | - if self.cluster_mode == 1: |
4216 | - def update_status(idle, run, finish): |
4217 | - self.update_status((idle, run, finish, run_type), level=None) |
4218 | + if mode is None: |
4219 | + mode = self.cluster_mode |
4220 | + |
4221 | + if mode == 1: |
4222 | + if html: |
4223 | + update_status = lambda idle, run, finish: \ |
4224 | + self.update_status((idle, run, finish, run_type), level=None) |
4225 | + else: |
4226 | + update_status = lambda idle, run, finish: None |
4227 | self.cluster.wait(self.me_dir, update_status) |
4228 | |
4229 | - if self.cluster_mode == 2: |
4230 | + if mode == 2: |
4231 | # Wait that all thread finish |
4232 | if not self.control_thread[2]: |
4233 | # time.sleep(1) |
4234 | nb = self.control_thread[0] |
4235 | while self.control_thread[0]: |
4236 | time.sleep(5) |
4237 | - if nb != self.control_thread[0]: |
4238 | + if nb != self.control_thread[0] and html: |
4239 | self.update_status((0, self.control_thread[0], |
4240 | self.total_jobs - self.control_thread[0], run_type), |
4241 | level=None, force=False) |
4242 | @@ -2608,26 +2892,21 @@ |
4243 | pass |
4244 | else: |
4245 | for i in range(0,self.nb_core): |
4246 | - self.update_status((0, self.control_thread[0], |
4247 | + if html: |
4248 | + self.update_status((0, self.control_thread[0], |
4249 | self.total_jobs - self.control_thread[0], run_type), |
4250 | level=None, force=False) |
4251 | self.control_thread[1].acquire() |
4252 | self.control_thread[2] = False |
4253 | self.control_thread[1].release() |
4254 | - del self.next_update |
4255 | + try: |
4256 | + del self.next_update |
4257 | + except: |
4258 | + pass |
4259 | + if not html: |
4260 | + return |
4261 | |
4262 | - #proc = subprocess.Popen([pjoin(self.dirbin, 'sumall')], |
4263 | - # cwd=pjoin(self.me_dir,'SubProcesses'), |
4264 | - # stdout=subprocess.PIPE) |
4265 | - |
4266 | - #(stdout, stderr) = proc.communicate() |
4267 | - #for line in stdout.split('\n'): |
4268 | - # if line.startswith(' Results'): |
4269 | - # data = line.split() |
4270 | - # self.results.add_detail('cross', float(data[1])) |
4271 | - # self.results.add_detail('error', float(data[2])) |
4272 | - |
4273 | - # TEST################################################################## |
4274 | + ####################################################################### |
4275 | cross, error = sum_html.make_all_html_results(self) |
4276 | self.results.add_detail('cross', cross) |
4277 | self.results.add_detail('error', error) |
4278 | @@ -2639,9 +2918,6 @@ |
4279 | def find_available_run_name(me_dir): |
4280 | """ find a valid run_name for the current job """ |
4281 | |
4282 | - |
4283 | - |
4284 | - |
4285 | name = 'run_%02d' |
4286 | data = [int(s[4:6]) for s in os.listdir(pjoin(me_dir,'Events')) if |
4287 | s.startswith('run_') and len(s)>5 and s[4:6].isdigit()] |
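For context, `find_available_run_name` (only blank lines are removed in the hunk above) picks the next free `run_%02d` name from the `Events/` listing. A standalone sketch over a plain list of names; the `max(data + [0]) + 1` step is an assumption based on the usual behaviour, since the return line falls outside this hunk:

```python
# Collect the numeric suffixes of existing 'run_XX' entries and return the
# next free 'run_%02d' name, starting from 'run_01'.
def find_available_run_name(listing):
    data = [int(s[4:6]) for s in listing
            if s.startswith('run_') and len(s) > 5 and s[4:6].isdigit()]
    return 'run_%02d' % (max(data + [0]) + 1)

print(find_available_run_name(['run_01', 'run_02', 'banner']))  # -> run_03
```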
4288 | @@ -2664,12 +2940,12 @@ |
4289 | else: |
4290 | self.configured = time.time() |
4291 | self.update_status('compile directory', level=None) |
4292 | - if self.open_crossx: |
4293 | + if self.configuration['automatic_html_opening']: |
4294 | misc.open_file(os.path.join(self.me_dir, 'crossx.html')) |
4295 | - self.open_crossx = False |
4296 | + self.configuration['automatic_html_opening'] = False |
4297 | + #open the web page only once |
4298 | # Change current working directory |
4299 | self.launching_dir = os.getcwd() |
4300 | - #os.chdir(self.me_dir) |
4301 | |
4302 | # Check if we need the MSSM special treatment |
4303 | model = self.find_model_name() |
4304 | @@ -2720,7 +2996,7 @@ |
4305 | line = line.split('=') |
4306 | if len(line) != 2: |
4307 | continue |
4308 | - output[line[1].strip()] = line[0].strip() |
4309 | + output[line[1].strip()] = line[0].replace('\'','').strip() |
4310 | return output |
4311 | |
4312 | ############################################################################ |
4313 | @@ -2735,14 +3011,36 @@ |
4314 | return default |
4315 | |
4316 | ############################################################################ |
4317 | - def set_run_name(self, name, reload_card=False): |
4318 | - """define the run name and update the results object""" |
4319 | - |
4320 | - if name == self.run_name: |
4321 | + def set_run_name(self, name, tag=None, level='parton', reload_card=False): |
4322 | + """define the run name, the run_tag, the banner and the results.""" |
4323 | + |
4324 | + # when we are forced to change the tag: new run level -> previous-run levels requiring changes |
4325 | + upgrade_tag = {'parton': ['parton','pythia','pgs','delphes'], |
4326 | + 'pythia': ['pythia','pgs','delphes'], |
4327 | + 'pgs': ['pgs'], |
4328 | + 'delphes':['delphes']} |
4329 | + |
4330 | + |
4331 | + |
4332 | + if name == self.run_name: |
4333 | if reload_card: |
4334 | run_card = pjoin(self.me_dir, 'Cards','run_card.dat') |
4335 | self.run_card = self.read_run_card(run_card) |
4336 | - return # Nothing to do |
4337 | + |
4338 | + #check if we need to change the tag |
4339 | + if tag: |
4340 | + self.run_card['run_tag'] = tag |
4341 | + self.run_tag = tag |
4342 | + self.results.add_run(self.run_name, self.run_card) |
4343 | + else: |
4344 | + for tag in upgrade_tag[level]: |
4345 | + if getattr(self.results[self.run_name][-1], tag): |
4346 | + tag = self.get_available_tag() |
4347 | + self.run_card['run_tag'] = tag |
4348 | + self.run_tag = tag |
4349 | + self.results.add_run(self.run_name, self.run_card) |
4350 | + break |
4351 | + return # Nothing to do anymore |
4352 | |
4353 | # save/clean previous run |
4354 | if self.run_name: |
4355 | @@ -2753,12 +3051,49 @@ |
4356 | # Read run_card |
4357 | run_card = pjoin(self.me_dir, 'Cards','run_card.dat') |
4358 | self.run_card = self.read_run_card(run_card) |
4359 | - |
4360 | - if name in self.results: |
4361 | + |
4362 | + new_tag = False |
4363 | + # First call for this run -> set the banner |
4364 | + self.banner = banner_mod.recover_banner(self.results, level) |
4365 | + if tag: |
4366 | + self.run_card['run_tag'] = tag |
4367 | + new_tag = True |
4368 | + elif self.run_name not in self.results and level == 'parton': |
4369 | + pass # No results yet, so current tag is fine |
4370 | + else: |
4371 | + for tag in upgrade_tag[level]: |
4372 | + if getattr(self.results[self.run_name][-1], tag): |
4373 | + # LEVEL is already defined in the last tag -> need to switch tag |
4374 | + tag = self.get_available_tag() |
4375 | + self.run_card['run_tag'] = tag |
4376 | + new_tag = True |
4377 | + break |
4378 | + if not new_tag: |
4379 | + # We can add the results to the current run |
4380 | + tag = self.results[self.run_name][-1]['tag'] |
4381 | + self.run_card['run_tag'] = tag # ensure that run_tag is correct |
4382 | + |
4383 | + |
4384 | + if name in self.results and not new_tag: |
4385 | self.results.def_current(self.run_name) |
4386 | else: |
4387 | self.results.add_run(self.run_name, self.run_card) |
4388 | - |
4389 | + |
4390 | + self.run_tag = self.run_card['run_tag'] |
4391 | + |
4392 | + # Return the tag of the previous run that has the data required for |
4393 | + # this tag/run to work well. |
4394 | + if level == 'parton': |
4395 | + return |
4396 | + elif level == 'pythia': |
4397 | + return self.results[self.run_name][0]['tag'] |
4398 | + else: |
4399 | + for i in range(-1,-len(self.results[self.run_name])-1,-1): |
4400 | + tagRun = self.results[self.run_name][i] |
4401 | + if tagRun.pythia: |
4402 | + return tagRun['tag'] |
4403 | + |
4404 | + |
4405 | |
4406 | |
4407 | |
4408 | @@ -3125,110 +3460,160 @@ |
4409 | name = {'0': 'auto', '1': 'pythia', '2':'pgs', '3':'delphes'} |
4410 | options = available_mode + [name[val] for val in available_mode] |
4411 | question = """Which programs do you want to run? |
4412 | - 0 / auto : running existing card |
4413 | - 1 / pythia : Pythia |
4414 | - 2 / pgs : Pythia + PGS\n""" |
4415 | + 0 / auto : running existing card |
4416 | + 1 / pythia : Pythia |
4417 | + 2 / pgs : Pythia + PGS\n""" |
4418 | if '3' in available_mode: |
4419 | question += """ 3 / delphes : Pythia + Delphes.\n""" |
4420 | |
4421 | - if not force: |
4422 | - if not mode: |
4423 | - mode = self.ask(question, '0', options, timeout=self.timeout) |
4424 | - elif not mode: |
4425 | - mode = 'auto' |
4426 | - |
4427 | - if mode.isdigit(): |
4428 | - mode = name[mode] |
4429 | - |
4430 | - if mode == 'auto': |
4431 | - if os.path.exists(pjoin(self.me_dir, 'Cards', 'pgs_card.dat')): |
4432 | - mode = 'pgs' |
4433 | - elif os.path.exists(pjoin(self.me_dir, 'Cards', 'delphes_card.dat')): |
4434 | - mode = 'delphes' |
4435 | - else: |
4436 | - mode = 'pythia' |
4437 | - logger.info('Will run in mode %s' % mode) |
4438 | - |
4439 | - # Clean the pointless card |
4440 | - if mode == 'pgs': |
4441 | - if os.path.exists(pjoin(self.me_dir,'Cards','delphes_card.dat')): |
4442 | - os.remove(pjoin(self.me_dir,'Cards','delphes_card.dat')) |
4443 | - if mode == 'delphes': |
4444 | - if os.path.exists(pjoin(self.me_dir,'Cards','pgs_card.dat')): |
4445 | - os.remove(pjoin(self.me_dir,'Cards','pgs_card.dat')) |
4446 | - |
4447 | - # Now that we know in which mode we are check that all the card |
4448 | - #exists (copy default if needed) |
4449 | - |
4450 | - cards = ['pythia_card.dat'] |
4451 | - self.add_card_to_run('pythia') |
4452 | - if mode == 'pgs': |
4453 | - self.add_card_to_run('pgs') |
4454 | - cards.append('pgs_card.dat') |
4455 | - if mode == 'delphes': |
4456 | - self.add_card_to_run('delphes') |
4457 | - cards.append('delphes_card.dat') |
4458 | - |
4459 | - if force: |
4460 | - return mode |
4461 | - |
4462 | - # Ask the user if he wants to edit any of the files |
4463 | - #First create the asking text |
4464 | - question = """Do you want to edit one cards (press enter to bypass editing)? |
4465 | - 1 / pythia : pythia_card.dat\n""" |
4466 | - possible_answer = ['0','done', '1', 'pythia'] |
4467 | - card = {0:'done', 1:'pythia', 9:'plot'} |
4468 | - if mode == 'pgs': |
4469 | - question += ' 2 / pgs : pgs_card.dat\n' |
4470 | - possible_answer.append(2) |
4471 | - possible_answer.append('pgs') |
4472 | - card[2] = 'pgs' |
4473 | - if mode == 'delphes': |
4474 | - question += ' 2 / delphes : delphes_card.dat\n' |
4475 | - question += ' 3 / trigger : delphes_trigger.dat\n' |
4476 | - possible_answer.append(2) |
4477 | - possible_answer.append('delphes') |
4478 | - possible_answer.append(3) |
4479 | - possible_answer.append('trigger') |
4480 | - card[2] = 'delphes' |
4481 | - card[3] = 'trigger' |
4482 | - if self.configuration['madanalysis_path']: |
4483 | - question += ' 9 / plot : plot_card.dat\n' |
4484 | - possible_answer.append(9) |
4485 | - possible_answer.append('plot') |
4486 | - |
4487 | - # Add the path options |
4488 | - question += ' Path to a valid card.\n' |
4489 | - |
4490 | - # Loop as long as the user is not done. |
4491 | - answer = 'no' |
4492 | - while answer != 'done': |
4493 | - answer = self.ask(question, '0', possible_answer, timeout=int(1.5*self.timeout), path_msg='enter path') |
4494 | - if answer.isdigit(): |
4495 | - answer = card[int(answer)] |
4496 | - if answer == 'done': |
4497 | - return |
4498 | - if os.path.exists(answer): |
4499 | - # detect which card is provide |
4500 | - card_name = self.detect_card_type(answer) |
4501 | - if card_name == 'unknown': |
4502 | - card_name = self.ask('Fail to determine the type of the file. Please specify the format', |
4503 | - ['pythia_card.dat','pgs_card.dat', |
4504 | - 'delphes_card.dat', 'delphes_trigger.dat','plot_card.dat']) |
4505 | - |
4506 | - logger.info('copy %s as %s' % (answer, card_name)) |
4507 | - files.cp(answer, pjoin(self.me_dir, 'Cards', card_name)) |
4508 | - continue |
4509 | - path = pjoin(self.me_dir,'Cards','%s_card.dat' % answer) |
4510 | - self.exec_cmd('open %s' % path) |
4511 | + if not force: |
4512 | + if not mode: |
4513 | + mode = self.ask(question, '0', options, timeout=self.timeout) |
4514 | + elif not mode: |
4515 | + mode = 'auto' |
4516 | + |
4517 | + if mode.isdigit(): |
4518 | + mode = name[mode] |
4519 | + |
4520 | + if mode == 'auto': |
4521 | + if os.path.exists(pjoin(self.me_dir, 'Cards', 'pgs_card.dat')): |
4522 | + mode = 'pgs' |
4523 | + elif os.path.exists(pjoin(self.me_dir, 'Cards', 'delphes_card.dat')): |
4524 | + mode = 'delphes' |
4525 | + else: |
4526 | + mode = 'pythia' |
4527 | + logger.info('Will run in mode %s' % mode) |
4528 | + |
4529 | + # Clean the pointless card |
4530 | + if mode == 'pgs': |
4531 | + if os.path.exists(pjoin(self.me_dir,'Cards','delphes_card.dat')): |
4532 | + os.remove(pjoin(self.me_dir,'Cards','delphes_card.dat')) |
4533 | + if mode == 'delphes': |
4534 | + if os.path.exists(pjoin(self.me_dir,'Cards','pgs_card.dat')): |
4535 | + os.remove(pjoin(self.me_dir,'Cards','pgs_card.dat')) |
4536 | + |
4537 | + # Now that we know in which mode we are check that all the card |
4538 | + #exists (copy default if needed) |
4539 | + |
4540 | + cards = ['pythia_card.dat'] |
4541 | + self.add_card_to_run('pythia') |
4542 | + if mode == 'pgs': |
4543 | + self.add_card_to_run('pgs') |
4544 | + cards.append('pgs_card.dat') |
4545 | + if mode == 'delphes': |
4546 | + self.add_card_to_run('delphes') |
4547 | + cards.append('delphes_card.dat') |
4548 | + |
4549 | + if force: |
4550 | + return mode |
4551 | + |
4552 | + # Ask the user if he wants to edit any of the files |
4553 | + #First create the asking text |
4554 | + question = """Do you want to edit one cards (press enter to bypass editing)?\n""" |
4555 | + question += """ 1 / pythia : pythia_card.dat\n""" |
4556 | + possible_answer = ['0','done', '1', 'pythia'] |
4557 | + card = {0:'done', 1:'pythia', 9:'plot'} |
4558 | + if mode == 'pgs': |
4559 | + question += ' 2 / pgs : pgs_card.dat\n' |
4560 | + possible_answer.append(2) |
4561 | + possible_answer.append('pgs') |
4562 | + card[2] = 'pgs' |
4563 | + if mode == 'delphes': |
4564 | + question += ' 2 / delphes : delphes_card.dat\n' |
4565 | + question += ' 3 / trigger : delphes_trigger.dat\n' |
4566 | + possible_answer.append(2) |
4567 | + possible_answer.append('delphes') |
4568 | + possible_answer.append(3) |
4569 | + possible_answer.append('trigger') |
4570 | + card[2] = 'delphes' |
4571 | + card[3] = 'trigger' |
4572 | + if self.configuration['madanalysis_path']: |
4573 | + question += ' 9 / plot : plot_card.dat\n' |
4574 | + possible_answer.append(9) |
4575 | + possible_answer.append('plot') |
4576 | + |
4577 | + # Add the path options |
4578 | + question += ' Path to a valid card.\n' |
4579 | + |
4580 | + # Loop as long as the user is not done. |
4581 | + answer = 'no' |
4582 | + while answer != 'done': |
4583 | + answer = self.ask(question, '0', possible_answer, timeout=int(1.5*self.timeout), path_msg='enter path') |
4584 | + if answer.isdigit(): |
4585 | + answer = card[int(answer)] |
4586 | + if answer == 'done': |
4587 | + return |
4588 | + if os.path.exists(answer): |
4589 | + # detect which card is provided |
4590 | + card_name = self.detect_card_type(answer) |
4591 | + if card_name == 'unknown': |
4592 | + card_name = self.ask('Failed to determine the type of the file. Please specify the format', |
4593 | + ['pythia_card.dat','pgs_card.dat', |
4594 | + 'delphes_card.dat', 'delphes_trigger.dat','plot_card.dat']) |
4595 | + |
4596 | + logger.info('copy %s as %s' % (answer, card_name)) |
4597 | + files.cp(answer, pjoin(self.me_dir, 'Cards', card_name)) |
4598 | + continue |
4599 | + path = pjoin(self.me_dir,'Cards','%s_card.dat' % answer) |
4600 | + self.exec_cmd('open %s' % path) |
4601 | |
4602 | return mode |
4603 | |
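The while-loop above implements a small answer-resolution pattern: a digit is mapped through the `card` dict, 'done' ends the loop, and an answer that is an existing filesystem path is imported as a card. A stand-alone sketch of that dispatch (the helper name `resolve_answer` is hypothetical; in the source the logic is inline in the command class):

```python
import os

def resolve_answer(answer, card):
    """Classify a raw user answer: a digit maps through `card`,
    'done' stops the loop, an existing path is a card file to copy,
    anything else names a card to open for editing."""
    if answer.isdigit():
        answer = card[int(answer)]
    if answer == 'done':
        return ('done', None)
    if os.path.exists(answer):
        return ('path', answer)   # caller copies this file into Cards/
    return ('card', answer)       # caller opens Cards/<answer>_card.dat
```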
4604 | - def edit_one_card(self, card, fct_args): |
4605 | - """ """ |
4606 | + def ask_edit_cards(self, cards, fct_args): |
4607 | + """Ask the user which cards to edit (used for pgs/delphes)""" |
4608 | |
4609 | if '-f' in fct_args or '--no_default' in fct_args: |
4610 | return |
4611 | + |
4612 | + card_name = {'pgs': 'pgs_card.dat', |
4613 | + 'delphes': 'delphes_card.dat', |
4614 | + 'trigger': 'delphes_trigger.dat' |
4615 | + } |
4616 | + |
4617 | + # Ask the user whether they want to edit any of the files |
4618 | + # First build the question text |
4619 | + question = """Do you want to edit one of the cards (press enter to bypass editing)?\n""" |
4620 | + possible_answer = ['0', 'done'] |
4621 | + card = {0:'done'} |
4622 | + |
4623 | + for i, mode in enumerate(cards): |
4624 | + possible_answer.append(i+1) |
4625 | + possible_answer.append(mode) |
4626 | + question += ' %s / %-9s : %s\n' % (i+1, mode, card_name[mode]) |
4627 | + card[i+1] = mode |
4628 | + |
4629 | + if self.configuration['madanalysis_path']: |
4630 | + question += ' 9 / %-9s : plot_card.dat\n' % 'plot' |
4631 | + possible_answer.append(9) |
4632 | + possible_answer.append('plot') |
4633 | + card[9] = 'plot' |
4634 | + |
4635 | + # Add the path options |
4636 | + question += ' Path to a valid card.\n' |
4637 | + |
4638 | + # Loop as long as the user is not done. |
4639 | + answer = 'no' |
4640 | + while answer != 'done': |
4641 | + answer = self.ask(question, '0', possible_answer, timeout=int(1.5*self.timeout), path_msg='enter path') |
4642 | + if answer.isdigit(): |
4643 | + answer = card[int(answer)] |
4644 | + if answer == 'done': |
4645 | + return |
4646 | + if os.path.exists(answer): |
4647 | + # detect which card is provided |
4648 | + card_name = self.detect_card_type(answer) |
4649 | + if card_name == 'unknown': |
4650 | + card_name = self.ask('Failed to determine the type of the file. Please specify the format', |
4651 | + ['pgs_card.dat', 'delphes_card.dat', 'delphes_trigger.dat']) |
4652 | + |
4653 | + logger.info('copy %s as %s' % (answer, card_name)) |
4654 | + files.cp(answer, pjoin(self.me_dir, 'Cards', card_name)) |
4655 | + continue |
4656 | + path = pjoin(self.me_dir,'Cards','%s_card.dat' % answer) |
4657 | + self.exec_cmd('open %s' % path) |
4658 | + |
4659 | + return |
4660 | + |
4661 | |
4662 | question = """Do you want to edit the %s?""" % card |
4663 | answer = self.ask(question, 'n', ['y','n'], timeout=self.timeout,path_msg='enter path') |
4664 | @@ -3297,49 +3682,196 @@ |
4665 | """The command line processor of MadGraph""" |
4666 | |
4667 | |
4668 | -#=============================================================================== |
4669 | -# GridPack |
4670 | -#=============================================================================== |
4671 | + |
4672 | +#=============================================================================== |
4673 | +# HELPING FUNCTION For Subprocesses |
4674 | +#=============================================================================== |
4675 | +class SubProcesses(object): |
4676 | + |
4677 | + name_to_pdg = {} |
4678 | + |
4679 | + @classmethod |
4680 | + def clean(cls): |
4681 | + cls.name_to_pdg = {} |
4682 | + |
4683 | + @staticmethod |
4684 | + def get_subP(me_dir): |
4685 | + """return the list of Subprocesses""" |
4686 | + |
4687 | + out = [] |
4688 | + for line in open(pjoin(me_dir,'SubProcesses', 'subproc.mg')): |
4689 | + name = line.strip() |
4690 | + if not name: |
4691 | + continue |
4692 | + if os.path.exists(pjoin(me_dir, 'SubProcesses', name)): |
4693 | + out.append(pjoin(me_dir, 'SubProcesses', name)) |
4694 | + |
4695 | + return out |
4696 | + |
4697 | + |
4698 | + |
4699 | + @staticmethod |
4700 | + def get_subP_info(path): |
4701 | + """Return the list of processes with their names.""" |
4702 | + |
4703 | + nb_sub = 0 |
4704 | + names = {} |
4705 | + old_main = '' |
4706 | + |
4707 | + if not os.path.exists(os.path.join(path,'processes.dat')): |
4708 | + return make_info_html.get_subprocess_info_v4(path) |
4709 | + |
4710 | + for line in open(os.path.join(path,'processes.dat')): |
4711 | + main = line[:8].strip() |
4712 | + if main == 'mirror': |
4713 | + main = old_main |
4714 | + if line[8:].strip() == 'none': |
4715 | + continue |
4716 | + else: |
4717 | + main = int(main) |
4718 | + old_main = main |
4719 | + |
4720 | + sub_process = line[8:] |
4721 | + nb_sub += sub_process.count(',') + 1 |
4722 | + if main in names: |
4723 | + names[main] += [sub_process.split(',')] |
4724 | + else: |
4725 | + names[main] = [sub_process.split(',')] |
4726 | + |
4727 | + return names |
4728 | + |
4729 | + @staticmethod |
4730 | + def get_subP_info_v4(path): |
4731 | + """Return the list of processes with their names when no grouping is used.""" |
4732 | + |
4733 | + nb_sub = 0 |
4734 | + names = {'':[[]]} |
4735 | + path = os.path.join(path, 'auto_dsig.f') |
4736 | + found = 0 |
4737 | + for line in open(path): |
4738 | + if line.startswith('C Process:'): |
4739 | + found += 1 |
4740 | + names[''][0].append(line[15:]) |
4741 | + elif found >1: |
4742 | + break |
4743 | + return names |
4744 | + |
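`get_subP_info_v4` above recovers the process names by scanning `auto_dsig.f` for the `C Process:` banner comments that the matrix-element writer puts at the top of the file, stopping once past the first block. The same scan run on an in-memory sample (the lines below are invented for illustration, and the helper extracts the name after the colon rather than at a fixed column):

```python
def collect_process_names(lines):
    """Gather the consecutive 'C Process:' banner comments.

    Mirrors get_subP_info_v4: keep counting matches, and stop as soon
    as a non-matching line follows more than one banner line."""
    names, found = [], 0
    for line in lines:
        if line.startswith('C Process:'):
            found += 1
            names.append(line.split(':', 1)[1].strip())
        elif found > 1:
            break
    return names
```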
4745 | + |
4746 | + @staticmethod |
4747 | + def get_subP_ids(path): |
4748 | + """return the pdg codes of the particles present in the Subprocesses""" |
4749 | + |
4750 | + all_ids = [] |
4751 | + for line in open(pjoin(path, 'leshouche.inc')): |
4752 | + if not 'IDUP' in line: |
4753 | + continue |
4754 | + particles = re.search("/([\d,-]+)/", line) |
4755 | + all_ids.append([int(p) for p in particles.group(1).split(',')]) |
4756 | + return all_ids |
4757 | + |
4758 | + |
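`SubProcesses.get_subP_ids` above relies on a regex to pull the PDG codes out of `leshouche.inc`, where each `IDUP` statement stores them between slashes. A small stand-alone check of that parsing (the sample Fortran line is invented, not taken from a real run directory):

```python
import re

def parse_idup_line(line):
    """Extract the PDG codes from one IDUP line of leshouche.inc,
    e.g. '      DATA (IDUP(I,1),I=1,4)/21,21,6,-6/' -> [21, 21, 6, -6]."""
    if 'IDUP' not in line:
        return None
    particles = re.search(r"/([\d,-]+)/", line)
    return [int(p) for p in particles.group(1).split(',')]
```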
4759 | +#=============================================================================== |
4760 | class GridPackCmd(MadEventCmd): |
4761 | """The command for the gridpack --These are not supposed to be used interactively--""" |
4762 | - |
4763 | - |
4764 | + |
4765 | def __init__(self, me_dir = None, nb_event=0, seed=0, *completekey, **stdin): |
4766 | """Initialize the command and directly run""" |
4767 | - |
4768 | - # Initialize properly |
4769 | + |
4770 | + # Initialize properly |
4771 | MadEventCmd.__init__(self, me_dir, *completekey, **stdin) |
4772 | - |
4773 | - # Now it's time to run! |
4774 | + self.run_mode = 0 |
4775 | + self.configuration['automatic_html_opening'] = False |
4776 | + # Now it's time to run! |
4777 | if me_dir and nb_event and seed: |
4778 | self.launch(nb_event, seed) |
4779 | - |
4780 | + |
4781 | + |
4782 | def launch(self, nb_event, seed): |
4783 | """ launch the generation for the grid """ |
4784 | - |
4785 | - # 1) Restore the default data |
4786 | + |
4787 | + # 1) Restore the default data |
4788 | logger.info('generate %s events' % nb_event) |
4789 | self.set_run_name('GridRun_%s' % seed) |
4790 | self.update_status('restoring default data', level=None) |
4791 | - subprocess.call([pjoin(self.me_dir,'bin','internal','restore_data'), 'default'], |
4792 | - cwd=self.me_dir) |
4793 | - |
4794 | - # 2) Run the refine for the grid |
4795 | - self.update_status('Generating Events', level=None) |
4796 | - ### TO EDIT #### |
4797 | - subprocess.call([pjoin(self.me_dir,'bin','internal','refine4grid'), |
4798 | - str(nb_event), '0', 'Madevent','1','GridRun_%s' % seed], |
4799 | - cwd=self.me_dir) |
4800 | - |
4801 | - # 3) Combine the events/pythia/... |
4802 | + subprocess.call([pjoin(self.me_dir,'bin','internal','restore_data'), self.run_name], |
4803 | + cwd=self.me_dir) |
4804 | + |
4805 | + # 2) Run the refine for the grid |
4806 | + self.update_status('Generating Events', level=None) |
4807 | + #subprocess.call([pjoin(self.me_dir,'bin','refine4grid'), |
4808 | + # str(nb_event), '0', 'Madevent','1','GridRun_%s' % seed], |
4809 | + # cwd=self.me_dir) |
4810 | + self.refine4grid(nb_event) |
4811 | + # 3) Combine the events/pythia/... |
4812 | self.exec_cmd('combine_events') |
4813 | - self.create_plot('parton') |
4814 | - self.exec_cmd('pythia --nodefault') |
4815 | - self.exec_cmd('pgs --nodefault') |
4816 | - |
4817 | - |
4818 | - |
4819 | - |
4820 | - |
4821 | - |
4822 | + self.exec_cmd('store_events') |
4823 | + self.exec_cmd('pythia --no_default -f') |
4824 | + |
4825 | + |
4826 | + |
4827 | + |
4828 | + def refine4grid(self, nb_event): |
4829 | + """Advanced command: refine the grid to produce the requested number of events.""" |
4830 | + self.nb_refine += 1 |
4831 | + |
4832 | + precision = nb_event |
4833 | + |
4834 | + # initialize / remove lhapdf mode |
4835 | + self.configure_directory() |
4836 | + self.cluster_mode = 0 # force single machine |
4837 | + |
4838 | + self.update_status('Refine results to %s' % precision, level=None) |
4839 | + logger.info("Using random number seed offset = %s" % self.random) |
4840 | + |
4841 | + self.total_jobs = 0 |
4842 | + subproc = [P for P in os.listdir(pjoin(self.me_dir,'SubProcesses')) if |
4843 | + P.startswith('P') and os.path.isdir(pjoin(self.me_dir,'SubProcesses', P))] |
4844 | + for nb_proc,subdir in enumerate(subproc): |
4845 | + subdir = subdir.strip() |
4846 | + Pdir = pjoin(self.me_dir, 'SubProcesses',subdir) |
4847 | + bindir = pjoin(os.path.relpath(self.dirbin, Pdir)) |
4848 | + |
4849 | + logger.info(' %s ' % subdir) |
4850 | + # clean previous run |
4851 | + for match in glob.glob(pjoin(Pdir, '*ajob*')): |
4852 | + if os.path.basename(match)[:4] in ['ajob', 'wait', 'run.', 'done']: |
4853 | + os.remove(pjoin(Pdir, match)) |
4854 | + |
4855 | + devnull = os.open(os.devnull, os.O_RDWR) |
4856 | + proc = subprocess.Popen([pjoin(bindir, 'gen_ximprove')], |
4857 | + stdin=subprocess.PIPE, |
4858 | + cwd=Pdir) |
4859 | + proc.communicate('%s 1 F\n' % (precision)) |
4860 | + |
4861 | + if os.path.exists(pjoin(Pdir, 'ajob1')): |
4862 | + misc.compile(['madevent'], cwd=Pdir) |
4863 | + # |
4864 | + os.system("chmod +x %s/ajob*" % Pdir) |
4865 | + alljobs = glob.glob(pjoin(Pdir,'ajob*')) |
4866 | + nb_tot = len(alljobs) |
4867 | + self.total_jobs += nb_tot |
4868 | + for i, job in enumerate(alljobs): |
4869 | + job = os.path.basename(job) |
4870 | + self.launch_job('./%s' % job, cwd=Pdir, remaining=(nb_tot-i-1), |
4871 | + run_type='Refine number %s on %s (%s/%s)' % (self.nb_refine, subdir, nb_proc+1, len(subproc))) |
4872 | + self.monitor(run_type='All job submitted for refine number %s' % self.nb_refine, |
4873 | + html=True) |
4874 | + |
4875 | + self.update_status("Combining runs", level='parton') |
4876 | + try: |
4877 | + os.remove(pjoin(Pdir, 'combine_runs.log')) |
4878 | + except OSError: |
4879 | + pass |
4880 | + |
4881 | + bindir = pjoin(os.path.relpath(self.dirbin, pjoin(self.me_dir,'SubProcesses'))) |
4882 | + subprocess.call([pjoin(bindir, 'combine_runs')], |
4883 | + cwd=pjoin(self.me_dir,'SubProcesses'), |
4884 | + stdout=devnull) |
4885 | + |
4886 | + #subprocess.call([pjoin(self.dirbin, 'sumall')], |
4887 | + # cwd=pjoin(self.me_dir,'SubProcesses'), |
4888 | + # stdout=devnull) |
4889 | + |
4890 | + self.update_status('finish refine', 'parton', makehtml=False) |
4891 | + |
4892 | |
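`refine4grid` above drives the interactive `gen_ximprove` helper by writing its answers to stdin through `Popen.communicate`. The same pattern in isolation, with a harmless stand-in command (`cat`) instead of the Fortran binary (the helper name is hypothetical):

```python
import subprocess

def run_with_answers(cmd, answers, cwd=None):
    """Start cmd and feed it the given text on stdin; return its stdout."""
    proc = subprocess.Popen(cmd, stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE,
                            universal_newlines=True, cwd=cwd)
    out, _ = proc.communicate(answers)
    return out
```

In the diff this is `proc.communicate('%s 1 F\n' % (precision))`, i.e. the number of events followed by the two fixed answers the helper expects.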
4893 | |
4894 | === modified file 'madgraph/iolibs/export_v4.py' |
4895 | --- madgraph/iolibs/export_v4.py 2011-12-09 16:10:49 +0000 |
4896 | +++ madgraph/iolibs/export_v4.py 2012-02-16 18:25:20 +0000 |
4897 | @@ -1181,15 +1181,16 @@ |
4898 | cp(_file_path+'/__init__.py', self.dir_path+'/bin/internal/__init__.py') |
4899 | cp(_file_path+'/various/gen_crossxhtml.py', |
4900 | self.dir_path+'/bin/internal/gen_crossxhtml.py') |
4901 | - cp(_file_path+'/various/splitbanner.py', |
4902 | - self.dir_path+'/bin/internal/splitbanner.py') |
4903 | + cp(_file_path+'/various/banner.py', |
4904 | + self.dir_path+'/bin/internal/banner.py') |
4905 | cp(_file_path+'/various/cluster.py', |
4906 | self.dir_path+'/bin/internal/cluster.py') |
4907 | cp(_file_path+'/various/sum_html.py', |
4908 | self.dir_path+'/bin/internal/sum_html.py') |
4909 | cp(_file_path+'/interface/.mg5_logging.conf', |
4910 | self.dir_path+'/bin/internal/me5_logging.conf') |
4911 | - |
4912 | + cp(_file_path+'/interface/coloring_logging.py', |
4913 | + self.dir_path+'/bin/internal/coloring_logging.py') |
4914 | |
4915 | #=========================================================================== |
4916 | # export model files |
4917 | @@ -1240,13 +1241,13 @@ |
4918 | cwd = os.getcwd() |
4919 | path = os.path.join(self.dir_path, 'SubProcesses') |
4920 | |
4921 | + |
4922 | if not self.model: |
4923 | self.model = matrix_element.get('processes')[0].get('model') |
4924 | |
4925 | |
4926 | |
4927 | os.chdir(path) |
4928 | - |
4929 | # Create the directory PN_xx_xxxxx in the specified path |
4930 | subprocdir = "P%s" % matrix_element.get('processes')[0].shell_string() |
4931 | try: |
4932 | @@ -1471,13 +1472,14 @@ |
4933 | |
4934 | os.chdir(os.path.pardir) |
4935 | |
4936 | - gen_infohtml.make_info_html(self.dir_path) |
4937 | - #subprocess.call([os.path.join(old_pos, self.dir_path, 'bin', 'internal', 'gen_crossxhtml.py')], |
4938 | - # stdout = devnull) |
4939 | + obj = gen_infohtml.make_info_html(self.dir_path) |
4940 | [mv(name, './HTML/') for name in os.listdir('.') if \ |
4941 | (name.endswith('.html') or name.endswith('.jpg')) and \ |
4942 | name != 'index.html'] |
4943 | - |
4944 | + if online: |
4945 | + nb_channel = obj.rep_rule['nb_gen_diag'] |
4946 | + open(pjoin('./Online'),'w').write(str(nb_channel)) |
4947 | + |
4948 | # Write command history as proc_card_mg5 |
4949 | if os.path.isdir('Cards'): |
4950 | output_file = os.path.join('Cards', 'proc_card_mg5.dat') |
4951 | @@ -1817,7 +1819,7 @@ |
4952 | if os.environ.has_key('HOME'): |
4953 | conf = os.path.join(os.environ['HOME'], '.mg5','mg5_configuration.txt') |
4954 | if os.path.exists(conf): |
4955 | - # just need to copy seed the path are absolute |
4956 | + # just need to copy since the paths are absolute |
4957 | path = writer.name |
4958 | writer.close() |
4959 | cp(conf, path) |
4960 | |
4961 | === modified file 'madgraph/iolibs/files.py' |
4962 | --- madgraph/iolibs/files.py 2011-08-22 14:27:14 +0000 |
4963 | +++ madgraph/iolibs/files.py 2012-02-16 18:25:20 +0000 |
4964 | @@ -111,8 +111,11 @@ |
4965 | return False |
4966 | |
4967 | for path in path_list: |
4968 | - if os.path.getmtime(path) > pickle_date: |
4969 | - return False |
4970 | + try: |
4971 | + if os.path.getmtime(path) > pickle_date: |
4972 | + return False |
4973 | + except Exception: |
4974 | + continue |
4975 | #all pass |
4976 | return True |
4977 | |
4978 | |
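The `files.py` hunk above wraps each `os.path.getmtime` call so that a dependency file which has disappeared is skipped instead of aborting the freshness check. A stand-alone sketch of the hardened check (the function name is illustrative; in the source this is the tail of a larger routine):

```python
import os

def is_uptodate(pickle_date, path_list):
    """True if no dependency in path_list is newer than the pickle."""
    for path in path_list:
        try:
            if os.path.getmtime(path) > pickle_date:
                return False
        except Exception:
            continue  # missing file: skip it rather than crash
    # all pass
    return True
```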
4979 | === modified file 'madgraph/iolibs/helas_call_writers.py' |
4980 | --- madgraph/iolibs/helas_call_writers.py 2011-12-16 15:23:56 +0000 |
4981 | +++ madgraph/iolibs/helas_call_writers.py 2012-02-16 18:25:20 +0000 |
4982 | @@ -1152,18 +1152,16 @@ |
4983 | |
4984 | # Check if we need to append a charge conjugation flag |
4985 | l = [str(l) for l in argument.get('lorentz')] |
4986 | - c_flag = '' |
4987 | - l_flag = '' |
4988 | + flag = [] |
4989 | if argument.needs_hermitian_conjugate(): |
4990 | - c_flag = "".join(['C%d' % i for i in \ |
4991 | - argument.get_conjugate_index()]) |
4992 | + flag = ['C%d' % i for i in argument.get_conjugate_index()] |
4993 | if (isinstance(argument, helas_objects.HelasWavefunction) and \ |
4994 | argument.get('is_loop') or (isinstance(argument, helas_objects.HelasAmplitude) and \ |
4995 | argument.get('type')=='loop')): |
4996 | - l_flag = "L" |
4997 | - routine_name = aloha_writers.combine_name( |
4998 | - '%s%s%s' % (l[0], c_flag, l_flag),\ |
4999 | - l[1:], outgoing) |
5000 | + flag.append("L") |
Hi Valentin, the three remaining test failures are clearly linked to NLO stuff:

======================================================================
ERROR: test_madevent_subproc_group_symmetry (tests.acceptance_tests.test_cmd.TestCmdShell2)
Check that symmetry.f gives right output
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/Users/omatt/Documents/Eclipse/NLO1.4/tests/acceptance_tests/test_cmd.py", line 729, in test_madevent_subproc_group_symmetry
    self.do('output %s ' % self.out_dir)
  File "/Users/omatt/Documents/Eclipse/NLO1.4/tests/acceptance_tests/test_cmd.py", line 168, in do
    self.cmd.exec_cmd(line)
  File "/Users/omatt/Documents/Eclipse/NLO1.4/madgraph/interface/extended_cmd.py", line 443, in exec_cmd
    stop = cmd.Cmd.onecmd(current_interface, line)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/cmd.py", line 219, in onecmd
    return func(arg)
  File "/Users/omatt/Documents/Eclipse/NLO1.4/madgraph/interface/cmd_interface.py", line 3106, in do_output
    if self._curr_amps.has_any_loop_process() and \
  File "/Users/omatt/Documents/Eclipse/NLO1.4/madgraph/core/diagram_generation.py", line 1005, in has_any_loop_process
    if amp.get('process').get('perturbation_couplings'):
  File "/Users/omatt/Documents/Eclipse/NLO1.4/madgraph/core/diagram_generation.py", line 314, in get
    return Amplitude.__bases__[0].get(self, name) # return the mother routine
  File "/Users/omatt/Documents/Eclipse/NLO1.4/madgraph/core/base_objects.py", line 91, in get
    return self[name]
  File "/Users/omatt/Documents/Eclipse/NLO1.4/madgraph/core/base_objects.py", line 67, in __getitem__
    self.is_valid_prop(name) # raise the correct error
  File "/Users/omatt/Documents/Eclipse/NLO1.4/madgraph/core/base_objects.py", line 84, in is_valid_prop
    (name, self.__class__.__name__)
PhysicsObjectError: process is not a valid property for this object: DecayChainAmplitude

Could you try to fix those? Call me if you need help, of course.