Closed: rajatsaxena closed this issue 3 years ago
Thank you so much for the comment. We have run EXTRACT on files as large as 120 GB, so in theory file size should not be a problem. Could you describe more specifically what you mean by crashing? If RAM is not sufficient for a large movie, increasing the partition number is the way to go. Could you provide the error message?
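For example, a minimal sketch of increasing the partition count (assuming the `get_defaults`/`extractor` entry points and `num_partitions_x`/`num_partitions_y` fields from the EXTRACT-public tutorial; names may differ in your version):

```matlab
% Minimal sketch, assuming the EXTRACT-public tutorial interface;
% check get_defaults.m in your version for the exact field names.
config = [];
config = get_defaults(config);    % fill in default parameters
config.avg_cell_radius = 7;       % example value in pixels, set for your data
config.num_partitions_x = 2;      % split the field of view into 2 x 2 tiles
config.num_partitions_y = 2;      % so each partition fits comfortably in RAM
output = extractor(M, config);    % M: registered movie matrix or movie file
```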
The RAM size was 64 GB and it was getting full. Using partitions made it work, thank you! I am now getting new errors in cell_check.m,
but I will test it further and create a new issue. Thank you.
Thanks! Currently, our cell_check module is experimental and works only if there are no partitions. We are working on it, but it will not be fully ready for some time. In the meantime, we have provided a link in the Readme file to another sorter, also from Schnitzerlab. I would suggest checking the Readme file regularly for updates related to cell_check.m.
Thank you for the quick response. I will test the other cell_check method.
Thanks for the program; it works really well on the small datasets I have tried so far :)
I am currently using the UCLA miniscope, which saves multiple .avi files for a given session. I use NormCorre to register the .avi files and save them as either multiple h5 files or one large h5 file. If I try to run the EXTRACT algorithm on the large h5 file (typical size ~9 GB), it crashes. Is there a way to run EXTRACT on multiple h5 files at the same time?
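For reference, combining the per-chunk h5 files into one movie can be sketched roughly as below, using only standard MATLAB HDF5 functions; the dataset name '/Data', the [height width time] layout, and single precision are assumptions to be matched to the actual NormCorre output:

```matlab
% Rough sketch: append several registered movie chunks into one HDF5
% dataset along the time axis, so EXTRACT (with partitions) can read a
% single file. Dataset name, layout, and data type are placeholders.
files   = {'part1.h5', 'part2.h5', 'part3.h5'};   % placeholder file names
dataset = '/Data';
outfile = 'combined.h5';

info      = h5info(files{1}, dataset);
frameSize = info.Dataspace.Size(1:2);              % [height width]
h5create(outfile, dataset, [frameSize Inf], ...
         'ChunkSize', [frameSize 100], 'Datatype', 'single');

t = 0;
for k = 1:numel(files)
    M = single(h5read(files{k}, dataset));         % one chunk at a time in RAM
    h5write(outfile, dataset, M, [1 1 t + 1], size(M));
    t = t + size(M, 3);
end
```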