python -m methylprep -v process -d ./ --all --no_sample_sheet
and
python -m methylprep -v process -d ./ --all --no_sample_sheet --batch-size X
behave no differently in terms of memory use. For 107 samples, both consume more than 60 GB of memory.
dev note: `--all` overrides the other kwargs, so `--batch-size` is ignored.
Between batches, memory is not released; it just keeps growing. Shouldn't memory be freed after the `.pkl` objects are written? Also, the overall memory requirement seems very high; is there a memory leak?
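For context, one way batch processing can avoid cumulative growth is to drop references to each batch's results and trigger a garbage collection after the pickle is written. The sketch below is hypothetical (the `process_batches` function and its loop are illustrative, not methylprep's actual code) and only shows the general pattern:

```python
import gc
import pickle

def process_batches(batches, out_template="batch_{}.pkl"):
    """Hypothetical per-batch loop: write each batch's results to a
    pickle, then release the in-memory objects before the next batch."""
    for i, batch in enumerate(batches):
        # Stand-in for per-sample processing work.
        result = [sample * 2 for sample in batch]
        with open(out_template.format(i), "wb") as f:
            pickle.dump(result, f)
        # Drop the only reference and force a collection so the memory
        # can be reused by the next batch instead of accumulating.
        del result
        gc.collect()
```

Note that even with explicit cleanup, CPython's allocator may not return freed memory to the OS immediately, so resident memory as reported by `top` can stay high; the important part is that it stops growing batch over batch.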
via @moranr7
Thanks, Ray