IGNF / myria3d

Myria3D: Aerial Lidar HD Semantic Segmentation with Deep Learning
https://ignf.github.io/myria3d/
BSD 3-Clause "New" or "Revised" License

Failure creating a larger hdf5 dataset #131

Open Vynikal opened 3 weeks ago

Vynikal commented 3 weeks ago

Whenever I create a larger HDF5 dataset (roughly more than 4 Lidar HD tiles in the training set), the resulting HDF5 file collapses to a few kilobytes without any explanation. RAM size might play a role: I have 32 GB, and the process has to use the swap partition to build a larger dataset. Even when creation completes without any apparent error, the HDF5 file is tiny, and when I then run the RandLa-Net experiment it tries to create the dataset again, after which an error follows. What is happening? If it is caused by RAM size, is there any way to work around it?
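For what it's worth, here is a minimal stdlib sketch (not part of myria3d; the function name and size threshold are my own choices) that I use to check whether the produced file is structurally an HDF5 file at all, and large enough to plausibly hold the tiles. A file that collapsed to a few kilobytes fails the size check even when its HDF5 signature is intact:

```python
import os

# Standard 8-byte HDF5 superblock signature (from the HDF5 file format spec).
HDF5_MAGIC = b"\x89HDF\r\n\x1a\n"

def looks_like_valid_hdf5(path, min_size_bytes=1_000_000):
    """Heuristic sanity check for a generated dataset file.

    Returns (ok, message). `min_size_bytes` is an arbitrary threshold;
    a collapsed file is only a few KB, so 1 MB catches that case.
    """
    if not os.path.isfile(path):
        return False, "file missing"
    size = os.path.getsize(path)
    with open(path, "rb") as f:
        if f.read(8) != HDF5_MAGIC:
            return False, "not an HDF5 file (bad signature)"
    if size < min_size_bytes:
        return False, f"suspiciously small ({size} bytes)"
    return True, f"ok ({size} bytes)"
```

Running this on the output file right after dataset creation tells me whether the file was truncated (bad signature) or merely collapsed (valid signature but only a few KB), which might help narrow down where the process fails.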