Hi,
Thank you for developing such a convenient program, CITE-seq-Count; it has helped our lab a lot so far.
Recently, however, when processing FASTQ data (516,131,880 reads in total across R1 and R2), I keep getting a "MemoryError" in each ForkPoolWorker, as shown below:
Process ForkPoolWorker-7:
Traceback (most recent call last):
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/pool.py", line 131, in worker
put((job, i, result))
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/queues.py", line 375, in put
obj = _ForkingPickler.dumps(obj)
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/reduction.py", line 54, in dumps
cls(buf, protocol, *args, **kwds).dump(obj)
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/dill/_dill.py", line 498, in dump
StockPickler.dump(self, obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 487, in dump
self.save(obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/dill/_dill.py", line 990, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 971, in save_dict
self._batch_setitems(obj.items())
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 997, in _batch_setitems
save(v)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 603, in save
self.save_reduce(obj=obj, *rv)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 713, in save_reduce
self._batch_setitems(dictitems)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 997, in _batch_setitems
save(v)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 603, in save
self.save_reduce(obj=obj, *rv)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 692, in save_reduce
save(args)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/dill/_dill.py", line 990, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 971, in save_dict
self._batch_setitems(obj.items())
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 996, in _batch_setitems
save(k)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 536, in save
self.framer.commit_frame()
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 233, in commit_frame
write(data)
MemoryError
The program does not seem to stop after this; it continues processing in the next ForkPoolWorker:
MemoryError
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/process.py", line 315, in _bootstrap
self.run()
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/pool.py", line 133, in worker
wrapped = MaybeEncodingError(e, result[1])
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/pool.py", line 86, in init
self.value = repr(value)
MemoryError
Processed 1,000,000 reads in 7.0 hours, 54.0 minutes, 14.38 seconds. Total reads: 13,000,000 in child 718
Processed 1,000,000 reads in 15.0 hours, 51.0 minutes, 26.51 seconds. Total reads: 13,000,000 in child 727
Processed 1,000,000 reads in 1.0 day, 3.0 hours, 54.0 minutes, 43.42 seconds. Total reads: 11,000,000 in child 723
Processed 1,000,000 reads in 1.0 day, 4.0 hours, 24.0 minutes, 0.1068 seconds. Total reads: 11,000,000 in child 725
Processed 1,000,000 reads in 1.0 day, 6.0 hours, 19.0 minutes, 51.67 seconds. Total reads: 12,000,000 in child 720
Process ForkPoolWorker-24:
Traceback (most recent call last):
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/pool.py", line 131, in worker
put((job, i, result))
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/queues.py", line 375, in put
obj = _ForkingPickler.dumps(obj)
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/multiprocess/reduction.py", line 54, in dumps
cls(buf, protocol, *args, **kwds).dump(obj)
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/dill/_dill.py", line 498, in dump
StockPickler.dump(self, obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 487, in dump
self.save(obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 603, in save
self.save_reduce(obj=obj, *rv)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 692, in save_reduce
save(args)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 886, in save_tuple
save(element)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/site-packages/dill/_dill.py", line 990, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 971, in save_dict
self._batch_setitems(obj.items())
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 996, in _batch_setitems
save(k)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 560, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 870, in save_str
self.memoize(obj)
File "/home/amber4mint/miniconda3/lib/python3.9/pickle.py", line 511, in memoize
self.memo[id(obj)] = idx, obj
MemoryError
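From the traceback, it looks like the error happens when a worker's result is pickled onto the result queue (the put((job, i, result)) -> dumps frames). To show what I mean, here is a minimal sketch I wrote of that general pattern using the standard multiprocessing module (CITE-seq-Count itself uses the multiprocess package, and this is not its actual code): each worker returns a large dict, and the pickling at put() is where memory can run out.

from multiprocessing import Pool

def count_chunk(chunk_id):
    # Stand-in for per-chunk barcode/UMI counting: build a large dict that
    # must be pickled back to the parent process when the worker returns.
    # The size here is only illustrative.
    return {f"read_{chunk_id}_{i}": i for i in range(2_000_000)}

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # Each returned dict is pickled onto the result queue inside the
        # worker (the put()/dumps() frames in the traceback above); with
        # many workers and big results, that step can raise MemoryError.
        results = pool.map(count_chunk, range(4))
    print("total entries:", sum(len(r) for r in results))

So my guess is that, with many workers each returning its own large result dict, the pickling plus the copies held in the parent can exceed the memory of our node, but please correct me if I am reading the traceback wrong.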
The command that I ran:
CITE-seq-Count -R1 /home/amber4mint/NGS1100905/KO_10X_210104_FB_S1_L002_R1_001.fastq.gz -R2 /home/amber4mint/NGS1100905/KO_10X_210104_FB_S1_L002_R2_001.fastq.gz -t /home/amber4mint/human_tag.csv -cbf 1 -cbl 16 -umif 17 -umil 26 -cells 10000 -o /home/amber4mint/cite-seq_210104/ --sliding-window -T 35
I set the number of threads to 35 because, before getting the 'MemoryError', I ran into the same 'struct.error' as in issues #75 and #93 and followed the solutions suggested there.
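For reference, this is the rough check I plan to run on our node to see how much memory each of the 35 workers can use at once (just a sketch, assuming the psutil package is installed; the even split is only a crude upper bound, since the parent also has to hold the pickled results):

import psutil

n_workers = 35  # matches the -T value in the command above
avail_gb = psutil.virtual_memory().available / 1024**3
print(f"available memory: {avail_gb:.1f} GB")
print(f"per worker, if split evenly: {avail_gb / n_workers:.2f} GB")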
Many thanks for any advice!
Sincerely, Amber