One experimental session is typically ~12 GB (after Step1 analysis and saving the .h5 file). This is quite a lot, especially if we consider that in the future there will be neuropixels data as well.
The largest directories are:
OnixAnalogData: ~4.7 GB
OnixAnalogClock: ~1.6 GB
VideoData: ~1.5 GB each (x2)
I suggest that after Step1 (data extraction, Sleap processing and .h5 file creation), at least these directories be compressed with lrzip (the best compression ratio I could find):
lrztar -z <folder_name>
OnixAnalogData 4.7 GB -> 350 MB (!) in 10 minutes
OnixAnalogClock 1.6 GB -> 440 MB in 2 minutes
VideoData1 1.5 GB -> 940 MB in 1 minute
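The per-session workflow above could be scripted roughly as follows. This is only a sketch: it assumes the lrzip package (which provides lrztar/lrzuntar) is installed, and the directory names match the sizes quoted above; adjust them to the actual session layout.

```shell
# Compress the heaviest raw-data directories of one session with lrztar.
# -z selects lrzip's ZPAQ backend (slowest, but the best ratio observed above).
compress_heavy_dirs() {
  session_dir="$1"
  if ! command -v lrztar >/dev/null 2>&1; then
    echo "lrztar not found; install the lrzip package first" >&2
    return 1
  fi
  for d in OnixAnalogData OnixAnalogClock VideoData1 VideoData2; do
    if [ -d "$session_dir/$d" ]; then
      # Produces $d.tar.lrz next to the directory; restore later with lrzuntar.
      (cd "$session_dir" && lrztar -z "$d")
    fi
  done
}
```

After verifying the archives, the original directories can be deleted to reclaim the space.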
For the video data, we can test a dedicated codec instead, e.g. lossless H.265 via ffmpeg.
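A minimal sketch of such a re-encode, assuming an ffmpeg build with libx265; the input/output file names are placeholders, not real session paths:

```shell
# Re-encode one camera file to mathematically lossless H.265 in an MKV container.
to_lossless_h265() {
  in="$1"
  out="${in%.*}_x265.mkv"
  # lossless=1 switches libx265 to true lossless mode; the preset only trades
  # encoding time against file size, never fidelity.
  ffmpeg -y -i "$in" -c:v libx265 -x265-params lossless=1 -preset medium "$out"
}
```

Whether this beats lrzip on our camera files would need to be measured on a real VideoData sample.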