cta-observatory / lstmcpipe

Scripts to analyse MC files on LST cluster at La Palma
https://cta-observatory.github.io/lstmcpipe/

Index error (r0 to dl1 process) on the base production (dec_931) #371

Open SeiyaNozaki opened 1 year ago

SeiyaNozaki commented 1 year ago

I found an error in the r0 to dl1 process for a single file, and the merge job consequently failed.

/fefs/aswg/data/mc/DL1/AllSky/20221027_v0.9.9_base_prod/TrainingDataset/dec_931/Protons/node_corsika_theta_31.589_az_122.714_/job_logs_r0dl1/job_20543571_6.e

Traceback (most recent call last):
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/bin/lstchain_mc_r0_to_dl1", line 8, in <module>
    sys.exit(main())
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/lstchain/scripts/lstchain_mc_r0_to_dl1.py", line 74, in main
    r0_to_dl1.r0_to_dl1(
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/lstchain/reco/r0_to_dl1.py", line 459, in r0_to_dl1
    for i, event in enumerate(source):
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/ctapipe/io/eventsource.py", line 278, in __iter__
    for event in self._generator():
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/ctapipe/io/simteleventsource.py", line 376, in _generator
    yield from self._generate_events()
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/ctapipe/io/simteleventsource.py", line 393, in _generate_events
    for counter, array_event in enumerate(self.file_):
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/eventio/simtel/simtelfile.py", line 291, in iter_array_events
    self.next_low_level()
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/eventio/simtel/simtelfile.py", line 155, in next_low_level
    self.current_mc_event = o.parse()
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/eventio/simtel/objects.py", line 1413, in parse
    d = MCEvent.parse_mc_event(self.read(), self.header.version)
  File "src/eventio/simtel/parsing.pyx", line 53, in eventio.simtel.parsing.parse_mc_event
  File "src/eventio/simtel/parsing.pyx", line 66, in eventio.simtel.parsing.parse_mc_event
IndexError: Out of bounds on buffer access (axis 0)
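
A minimal sketch for checking whether the input simtel file itself can be read end to end with eventio (the path below is a hypothetical local copy of the corsika file for this node; a truncated or corrupted file typically fails with the same IndexError as above):

```python
from eventio import SimTelFile

# hypothetical local copy of the corsika simtel file for this node/run
path = "corsika_theta_31.589_az_122.714_run192.simtel.gz"

try:
    with SimTelFile(path) as f:
        # iterate over all array events; a truncated or corrupted file
        # typically raises IndexError partway through, as in the traceback
        n_events = sum(1 for _ in f)
    print(f"read {n_events} array events, file looks complete")
except IndexError as err:
    print(f"file appears corrupted or truncated: {err}")
```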

/fefs/aswg/data/mc/DL1/AllSky/20221027_v0.9.9_base_prod/TrainingDataset/dec_931/Protons/merging-output.e

 31%|███       | 2618/8391 [13:48<35:41,  2.70it/s]Can't append node /dl1/event/telescope/parameters/LST_LSTCam from file /fefs/aswg/data/mc/DL1/AllSky/20221027_v0.9.9_base_prod/TrainingDataset/dec_931/Protons/node_corsika_theta_31.589_az_122.714_/dl1_simtel_corsika_theta_31.589_az_122.714_run192.h5
Traceback (most recent call last):
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/bin/lstchain_merge_hdf5_files", line 8, in <module>
    sys.exit(main())
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/lstchain/scripts/lstchain_merge_hdf5_files.py", line 88, in main
    auto_merge_h5files(
  File "/fefs/aswg/software/conda/envs/lstchain-v0.9.9/lib/python3.8/site-packages/lstchain/io/io.py", line 341, in auto_merge_h5files
    out_node.append(in_node.read().astype(out_node.dtype))
ValueError: structures must have the same size
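
A minimal sketch for spotting which DL1 file in the node directory has a parameters table whose structure differs from the others, which is what makes the append fail (assuming PyTables is available; the node path and table key are taken from the logs above):

```python
from pathlib import Path
import tables

node_dir = Path(
    "/fefs/aswg/data/mc/DL1/AllSky/20221027_v0.9.9_base_prod/"
    "TrainingDataset/dec_931/Protons/node_corsika_theta_31.589_az_122.714_"
)
key = "/dl1/event/telescope/parameters/LST_LSTCam"

dtypes = {}
for h5 in sorted(node_dir.glob("dl1_*.h5")):
    with tables.open_file(str(h5)) as f:
        # read zero rows just to get the structured dtype of the table
        dtypes[h5.name] = f.get_node(key)[:0].dtype

# files whose table structure differs from the most common one are the
# candidates that break the merge
all_dtypes = list(dtypes.values())
reference = max(set(all_dtypes), key=all_dtypes.count)
for name, dt in dtypes.items():
    if dt != reference:
        print(name, "has a different table structure:", dt)
```
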
vuillaut commented 1 year ago

Thank you for reporting, @SeiyaNozaki! I will look into it.

vuillaut commented 1 year ago

Hi @SeiyaNozaki, is there another analysis/production affected by this error?

SeiyaNozaki commented 1 year ago

No, I found this error by chance :)

vuillaut commented 1 year ago

I have restarted the DL1 production for node node_corsika_theta_31.589_az_122.714_ and the merging step.

vuillaut commented 1 year ago

Ok this seems fixed.

SeiyaNozaki commented 1 year ago

@vuillaut It seems other base productions have the same issue: 20221215_v0.9.12_base_prod and 20230127_v0.9.12_base_prod_az_tel
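
A minimal sketch for listing the affected nodes of a production by scanning the r0 to dl1 job error logs for the eventio IndexError (the production path is one of those named above; the directory layout is assumed to mirror the dec_931 production from the top of this issue):

```python
from pathlib import Path

# assumed to mirror the layout of the dec_931 production shown above
prod = Path(
    "/fefs/aswg/data/mc/DL1/AllSky/20230127_v0.9.12_base_prod_az_tel/"
    "TrainingDataset/dec_931"
)

for err_log in prod.glob("*/node_*/job_logs_r0dl1/*.e"):
    if "IndexError: Out of bounds on buffer access" in err_log.read_text(errors="ignore"):
        # print particle type, node directory, and the failing job log
        print(err_log.parts[-4], err_log.parts[-3], err_log.name)
```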

SeiyaNozaki commented 1 year ago

@vuillaut Have you already reprocessed those productions? I heard the gammaness distribution is strange when Estelle used this production (dec_931 in 20230127_v0.9.12_base_prod_az_tel), so I suspect this issue affects her analysis.