NNPDF / pinefarm

Generate PineAPPL grids from PineCards
https://pinefarm.readthedocs.io
GNU General Public License v3.0

FileNotFoundError when running Madgraph through pinefarm #47

comane closed this issue 10 months ago

comane commented 10 months ago

Hi @cschwan, I am trying to reproduce one of the pinecards using pinefarm. For ATLAS_TTB_13TEV_TOT (as well as for others) I get this error, which seems related to pineappl:

INFO:  
INFO: Checking test output: 
INFO: P0_gg_ttx 
INFO:  Result for test_ME: 
Command "launch auto " interrupted with error:
FileNotFoundError : [Errno 2] No such file or directory: '/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/results/200-ATLAS_TTB_13TEV_TOT--20231113103915/ATLAS_TTB_13TEV_TOT/SubProcesses/P0_gg_ttx/test_ME.log'
Please report this bug on https://bugs.launchpad.net/mg5amcnlo
More information is found in '/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/results/200-ATLAS_TTB_13TEV_TOT--20231113103915/ATLAS_TTB_13TEV_TOT/run_01_tag_1_debug.log'.
Please attach this file to your report.
INFO:  
quit
INFO:  
quit
quit
Error calling StartServiceByName for org.freedesktop.Notifications: Timeout was reached
Traceback (most recent call last):
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/bin/pinefarm", line 8, in <module>
    sys.exit(command())
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/pinefarm/cli/run.py", line 31, in subcommand
    main(dataset, theory_card, pdf)
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/pinefarm/cli/run.py", line 67, in main
    run_dataset(runner)
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/pinefarm/cli/run.py", line 122, in run_dataset
    runner.generate_pineappl()
  File "/store/DAMTP/mnc33/miniconda3/envs/pinefarm/lib/python3.9/site-packages/pinefarm/external/mg5/__init__.py", line 184, in generate_pineappl
    grid = pineappl.grid.Grid.read(mg5_grids[0])
IndexError: list index out of range
Thanks for using LHAPDF 6.4.0. Please make sure to cite the paper:
  Eur.Phys.J. C75 (2015) 3, 132  (http://arxiv.org/abs/1412.7420)
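For context, the IndexError at the bottom of the traceback comes from pinefarm indexing the first element of a list of grid files that turned out to be empty: mg5 had already crashed, so no grids were ever written. A defensive version of that lookup, as an illustrative sketch (the glob pattern and function name are assumptions, not pinefarm's actual code, which lives in pinefarm/external/mg5/__init__.py), might look like:

```python
from pathlib import Path


def find_mg5_grids(output_dir, pattern="*.pineappl.lz4"):
    """Collect grid files from an mg5 run directory, failing with a clear
    message instead of an IndexError when the run produced nothing.

    NOTE: the glob pattern and helper name are illustrative assumptions.
    """
    grids = sorted(Path(output_dir).glob(pattern))
    if not grids:
        # This is the situation in the traceback above: mg5 aborted
        # earlier, so mg5_grids[0] indexed an empty list.
        raise RuntimeError(
            f"no grids matching {pattern!r} in {output_dir}; "
            "check the mg5 log for an earlier failure"
        )
    return grids
```

With a check like this, the underlying mg5 FileNotFoundError would surface as the real problem instead of the secondary IndexError.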
felixhekhorn commented 10 months ago

Can you check this directory https://github.com/NNPDF/pinefarm/blob/c007afc3f6f51b5bb5d3f93ccdc6caa77e192f05/src/pinefarm/external/mg5/__init__.py#L181 ? Did mg5 actually run?

comane commented 10 months ago

Hi @felixhekhorn, I am not sure I have that directory, as I installed pinefarm with pip.

However, if I go into the .prefix/mg5amc folder, Madgraph is there and runs.

felixhekhorn commented 10 months ago

However, if I go into the .prefix/mg5amc folder, Madgraph is there and runs.

That is good: at least mg5 was installed correctly. But the question is rather whether it ran for the grid you requested.

Hi @felixhekhorn, I am not sure I have that directory, as I installed pinefarm with pip.

I.e. whenever you request a new calculation, pinefarm should create a new folder (the name of the dataset plus a timestamp), and that should be the mg5_dir in the snippet above ...

Actually, I just had a closer look at your snippet (which I should have done in the first place :see_no_evil:), and the directory we're talking about is /store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/results/200-ATLAS_TTB_13TEV_TOT--20231113103915/

However, I was trying to understand why there are no grids (which is the Python error you encounter), but the true problem is the mg5 FileNotFoundError, just as you said in the first place ... sorry for the noise. On mg5 I have no idea; maybe @cschwan knows more. Does the debug file mentioned in the error reveal more information? Otherwise this might be better discussed on the mg5 bug tracker.

comane commented 10 months ago

The mg5 log doesn't say much more:

launch auto
Traceback (most recent call last):
  File "/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/madgraph/interface/extended_cmd.py", line 1544, in onecmd
    return self.onecmd_orig(line, opt)
  File "/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/madgraph/interface/extended_cmd.py", line 1493, in onecmd_orig
    return func(arg, opt)
  File "/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/madgraph/interface/amcatnlo_run_interface.py", line 1783, in do_launch
    self.compile(mode, options)
  File "/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/madgraph/interface/amcatnlo_run_interface.py", line 5407, in compile
    self.check_tests(test, this_dir)
  File "/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/madgraph/interface/amcatnlo_run_interface.py", line 5418, in check_tests
    return self.parse_test_mx_log(pjoin(dir, '%s.log' % test))
  File "/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/madgraph/interface/amcatnlo_run_interface.py", line 5425, in parse_test_mx_log
    content = open(log).read()
FileNotFoundError: [Errno 2] No such file or directory: '/store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/results/200-ATLAS_TTB_13TEV_TOT--20231113103915/ATLAS_TTB_13TEV_TOT/SubProcesses/P0_gg_ttx/test_ME.log'
Related File: /store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/results/200-ATLAS_TTB_13TEV_TOT--20231113103915/ATLAS_TTB_13TEV_TOT/SubProcesses/P0_gg_ttx/test_ME.log
Value of current Options:
  pythia8_path : None
  hwpp_path : None
  thepeg_path : None
  hepmc_path : None
  madanalysis_path : None
  madanalysis5_path : None
  pythia-pgs_path : None
  rivet_path : None
  yoda_path : None
  contur_path : None
  td_path : None
  delphes_path : None
  exrootanalysis_path : None
  syscalc_path : None
  timeout : 60
  web_browser : None
  eps_viewer : None
  text_editor : None
  fortran_compiler : None
  f2py_compiler : None
  f2py_compiler_py2 : None
  f2py_compiler_py3 : None
  cpp_compiler : None
  cluster_type : condor
  cluster_queue : None
  cluster_status_update : (600, 30)
  fastjet : None
  golem : None
  samurai : None
  ninja : /store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/HEPTools/lib
  collier : /store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc/HEPTools/lib
  lhapdf : lhapdf-config
  pineappl : pineappl
  lhapdf_py2 : None
  lhapdf_py3 : None
  cluster_temp_path : None
  mg5amc_py8_interface_path : None
  cluster_local_path : None
  OLP : MadLoop
  cluster_nb_retry : 1
  cluster_retry_wait : 300
  cluster_size : 100
  output_dependencies : external
  crash_on_error : False
  auto_convert_model : True
  acknowledged_v3.1_syntax : False
  group_subprocesses : Auto
  ignore_six_quark_processes : False
  low_mem_multicore_nlo_generation : False
  complex_mass_scheme : False
  include_lepton_initiated_processes : False
  gauge : unitary
  stdout_level : 20
  loop_optimized_output : True
  loop_color_flows : False
  max_npoint_for_channel : 0
  default_unset_couplings : 99
  max_t_for_channel : 99
  zerowidth_tchannel : True
  nlo_mixed_expansion : True
  auto_update : 7
  automatic_html_opening : False
  run_mode : 2
  nb_core : 112
  notification_center : True
  mg5_path : /store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc
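What the log shows is that mg5's parse_test_mx_log opens test_ME.log unconditionally; when the test_ME step failed to compile or run, the log was never written, and the bare open() raises the FileNotFoundError. A friendlier variant of that read, as an illustrative sketch only (not mg5's actual code, which simply does open(log).read()), could be:

```python
from pathlib import Path


def read_test_log(log_path):
    """Read an mg5-style test log, turning a missing file into a message
    that points at the real failure: the test executable never ran.

    NOTE: illustrative sketch; the real logic is in mg5's
    amcatnlo_run_interface.py (parse_test_mx_log).
    """
    log = Path(log_path)
    if not log.is_file():
        raise RuntimeError(
            f"{log} is missing: the corresponding test executable likely "
            "failed to compile or run, so no log was produced"
        )
    return log.read_text()
```

This would not fix the underlying failure, but it would make the error message point at the test step rather than at a missing file.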

felixhekhorn commented 10 months ago

Okay, digging around a bit more ... I think we're looking at the processed version of this file https://github.com/NNPDF/pinecards/blob/master/ATLAS_TTB_13TEV_TOT/launch.txt, right? This should configure mg5 with the correct settings, I think.

Can you check that the generated version /store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/results/200-ATLAS_TTB_13TEV_TOT--20231113103915/launch.txt (?) makes sense? If that is fine, pinefarm has done its job and it is an mg5 bug ...

(Just for the sake of it:

comane commented 10 months ago

So I think it has to be a Madgraph bug given that the launch card seems to have been generated correctly (see below):

$ sdiff launch.txt ../../runcards/ATLAS_TTB_13TEV_TOT/launch.txt 
launch ATLAS_TTB_13TEV_TOT                    | launch @OUTPUT@
fixed_order = ON                        fixed_order = ON
set maxjetflavor 5                      set maxjetflavor 5
set gf 1.1663787e-5                       | set gf @GF@
set mh 125.0                              | set mh @MH@
set mt 172.5                              | set mt @MT@
set mw 80.352                             | set mw @MW@
set mz 91.1535                            | set mz @MZ@
set wh 4.07468e-3                         | set wh @WH@
set wt 0.0                          set wt 0.0
set ww 2.084                              | set ww @WW@
set wz 2.4943                             | set wz @WZ@
set ebeam1 6500                         set ebeam1 6500
set ebeam2 6500                         set ebeam2 6500
set pdlabel lhapdf                      set pdlabel lhapdf
set lhaid 324900                          | set lhaid @LHAPDF_ID@
set dynamical_scale_choice 10                   set dynamical_scale_choice 10
set reweight_scale True                     set reweight_scale True
set req_acc_FO 0.001                        set req_acc_FO 0.001
set pineappl True                       set pineappl True
done                                done
quit                                quit
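The diff above shows pinefarm's placeholder substitution working as intended: every @...@ token in the template launch.txt was replaced with a concrete value. The mechanism can be sketched as a plain string substitution; the mapping below just restates the values visible in the diff, while the substitution code itself is an illustrative guess, not pinefarm's actual implementation:

```python
# Placeholder -> value pairs read off the sdiff output above.
SUBSTITUTIONS = {
    "@OUTPUT@": "ATLAS_TTB_13TEV_TOT",
    "@GF@": "1.1663787e-5",
    "@MH@": "125.0",
    "@MT@": "172.5",
    "@MW@": "80.352",
    "@MZ@": "91.1535",
    "@WH@": "4.07468e-3",
    "@WW@": "2.084",
    "@WZ@": "2.4943",
    "@LHAPDF_ID@": "324900",
}


def fill_launch_card(template: str) -> str:
    """Replace every placeholder in a launch.txt template with its value.

    NOTE: a sketch of the visible behaviour; pinefarm's real templating
    code may differ.
    """
    for placeholder, value in SUBSTITUTIONS.items():
        template = template.replace(placeholder, value)
    return template
```

Lines without placeholders (e.g. "set wt 0.0") pass through unchanged, which matches the unmodified lines in the diff.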
felixhekhorn commented 10 months ago

So I think it has to be a Madgraph bug given that the launch card seems to have been generated correctly (see below):

Good (i.e. someone else needs to worry :innocent:). Just to exclude pinefarm 100%: can you run mg5 independently (i.e. not through the CLI)? Something like /store/DAMTP/mnc33/Projects_store/PhD/nnpdf40_pheno/pinefarm_runs/.prefix/mg5amc launch.txt (it should hopefully crash with the very same error).

comane commented 10 months ago

Indeed, by running directly through Madgraph I get the same error 👍

One weird thing might be the following: although fixed_order = ON in launch.txt, it seems (see below) that it is set to OFF?

113142105/ATLAS_TTB_13TEV_TOT/Cards/amcatnlo_configuration.txt
launch auto
WARNING: NLO+PS mode is not allowed for processes including electroweak corrections
The following switches determine which programs are run:
/================== Description ==================|=========== values ===========|================ other options ================\
| 1. Type of perturbative computation             | order = NLO                  | LO                                            |
| 2. No MC@[N]LO matching / event generation      | fixed_order = OFF            | No NLO+PS available for EW correction         |
\================================================================================================================================/

comane commented 10 months ago

Btw @felixhekhorn, thanks for the answers! Feel free to close the issue if you think it has nothing to do with pinefarm and is an mg5 issue only!

felixhekhorn commented 10 months ago

One weird thing might be the following: although fixed_order = ON in launch.txt, it seems (see below) that it is set to OFF?

As said, I have no idea about mg5. But, guessing wildly, I think mg5 somehow makes a test run before doing the actual calculation (see Checking test output: or Result for test_ME:), and that is the one that fails ... maybe the settings are weird there ...

Btw @felixhekhorn, thanks for the answers! Feel free to close the issue if you think it has nothing to do with pinefarm and is an mg5 issue only!

Let's do this.