SmileiPIC / Smilei

Particle-in-cell code for plasma simulation
https://smileipic.github.io/Smilei

Opening Smilei Diagnostics while a simulation is running and also afterwards #231

Closed GeorgeHicks1 closed 4 years ago

GeorgeHicks1 commented 4 years ago

Description

I opened a simulation while it was still running to check everything was going OK, and it opened fine. After the simulation had finished, I tried to open it again and got the error below.

Have I somehow corrupted the HDF5 files by opening them while they were being written to?

I believe it is similar to this h5py issue: https://github.com/h5py/h5py/issues/736

Loaded simulation '/Volumes/gsh11/ephemeral/ELI_a03_01HAr_lin_r2_cx2_copy'
Scanning for Scalar diagnostics
WARNING: you do not have the *pint* package, so you cannot modify units.
       : The results will stay in code units.
Scanning for Field diagnostics
Scanning for Probe diagnostics
Scanning for ParticleBinning diagnostics
Scanning for Screen diagnostics
Scanning for Tracked particle diagnostics
Traceback (most recent call last):

  File "<ipython-input-71-8d93004e4adf>", line 1, in <module>
    runfile('/Users/gsh11/OneDrive - Imperial College London/Experiments/2020ELI/ELI_sim_v2.py', wdir='/Users/gsh11/OneDrive - Imperial College London/Experiments/2020ELI')

  File "//anaconda3/lib/python3.7/site-packages/spyder_kernels/customize/spydercustomize.py", line 827, in runfile
    execfile(filename, namespace)

  File "//anaconda3/lib/python3.7/site-packages/spyder_kernels/customize/spydercustomize.py", line 110, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)

  File "/Users/gsh11/OneDrive - Imperial College London/Experiments/2020ELI/ELI_sim_v2.py", line 63, in <module>
    Electron_xy=S.ParticleBinning(0)

  File "/Users/gsh11/OneDrive - Imperial College London/Simulations/Smieli/Smilei20191129/happi/_core.py", line 306, in __call__
    return ParticleBinning.ParticleBinning(self._simulation, *(self._additionalArgs+args), **kwargs)

  File "/Users/gsh11/OneDrive - Imperial College London/Simulations/Smieli/Smilei20191129/happi/_Diagnostics/Diagnostic.py", line 78, in __init__
    remaining_kwargs = self._init(*args, **kwargs)

  File "/Users/gsh11/OneDrive - Imperial College London/Simulations/Smieli/Smilei20191129/happi/_Diagnostics/ParticleBinning.py", line 99, in _init
    items.update( dict(f) )

  File "//anaconda3/lib/python3.7/_collections_abc.py", line 720, in __iter__
    yield from self._mapping

  File "//anaconda3/lib/python3.7/site-packages/h5py/_hl/group.py", line 407, in __iter__
    for x in self.id.__iter__():

  File "h5py/h5g.pyx", line 472, in h5py.h5g.GroupID.__iter__

  File "h5py/h5g.pyx", line 473, in h5py.h5g.GroupID.__iter__

  File "h5py/h5g.pyx", line 100, in h5py.h5g.GroupIter.__init__

  File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper

  File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper

  File "h5py/h5g.pyx", line 331, in h5py.h5g.GroupID.get_num_objs

RuntimeError: Can't determine (addr overflow, addr = 1194401192, size = 544, eoa = 1140110760)

Steps to reproduce the problem

S = happi.Open(sim_directory)
Electron_xy = S.ParticleBinning(0)


mccoys commented 4 years ago

I have no problem opening hdf5 files while the simulation is running, even on supercomputers. I think we had similar reports in the past, but that was due to the filesystem, or to the way hdf5 was compiled. Could you contact your sysadmin on these points?
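Before contacting a sysadmin, it can help to check how h5py and its HDF5 library were built. A diagnostic sketch (assuming h5py is installed in the same Python environment happi uses):

```python
import h5py

# h5py's version summary reports the HDF5 library version it was built
# against and whether MPI support was compiled in; a reader built against
# a different HDF5 than the writer (Smilei) can fail on files that are
# still being written.
print(h5py.version.info)
```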

GeorgeHicks1 commented 4 years ago

Ok, thank you for your response. I will contact them.

mccoys commented 4 years ago

If you find a solution, please share. It could be useful to other users with similar systems.

GeorgeHicks1 commented 3 years ago

I've found a solution that works for me. I now add os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE" before opening a simulation.
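For context: HDF5 1.10+ locks files it has open for writing, and the environment variable is read when the HDF5 library initializes, so it must be set before h5py (which happi imports) is first loaded. A minimal sketch, where sim_directory is a placeholder for your simulation path:

```python
import os

# Disable HDF5 file locking. This must run before any import that pulls in
# h5py (happi does), because HDF5 reads the variable at initialization.
os.environ["HDF5_USE_FILE_LOCKING"] = "FALSE"

# Then open the simulation as usual:
# import happi
# S = happi.Open(sim_directory)          # sim_directory: path to your run
# Electron_xy = S.ParticleBinning(0)
```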