kusterlab / prosit

Prosit offers high-quality predicted MS2 spectra for any organism and protease, as well as iRT prediction. If Prosit is helpful for your research, please cite "Gessulat, Schmidt et al. 2019", DOI 10.1038/s41592-019-0426-7
https://www.proteomicsdb.org/prosit/
Apache License 2.0

An error occurred. Status code 2 #76

Closed: fchu763 closed this issue 2 years ago

fchu763 commented 2 years ago

Hello,

I have encountered the same error that I reported last week, this time with a smaller file (~36 MB).

The task ID is 6C071F4C874D0FB5BE0D184D52E91925 and here is the error message:

```
WARNING: Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
(the warning above was printed 4 times)
/root/.pyenv/versions/3.6.2/lib/python3.6/site-packages/distributed/config.py:20: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
  defaults = yaml.load(f)
Using TensorFlow backend.
Process Process-2:
Traceback (most recent call last):
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/root/.pyenv/versions/3.6.2/src/converter/converter/spectronaut_conv/converter.py", line 150, in to_csv
    for spectrum in converted:
  File "/root/.pyenv/versions/3.6.2/src/converter/converter/spectronaut_conv/converter.py", line 138, in get_converted
    x = self.queue.get()
  File "<string>", line 2, in get
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/managers.py", line 757, in _callmethod
    kind, result = conn.recv()
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/connection.py", line 250, in recv
    buf = self._recv_bytes()
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/connection.py", line 407, in _recv_bytes
    buf = self._recv(4)
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/connection.py", line 383, in _recv
    raise EOFError
EOFError
Traceback (most recent call last):
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/root/oktoberfest/library/convert.py", line 24, in <module>
    converter.iter_data()
  File "/root/.pyenv/versions/3.6.2/src/converter/converter/spectronaut_conv/converter.py", line 34, in iter_data
    conv.convert(pool)
  File "/root/.pyenv/versions/3.6.2/src/converter/converter/spectronaut_conv/converter.py", line 131, in convert
    self.queue.put(s)
  File "<string>", line 2, in put
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/managers.py", line 756, in _callmethod
    conn.send((self._id, methodname, args, kwds))
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/connection.py", line 206, in send
    self._send_bytes(_ForkingPickler.dumps(obj))
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/connection.py", line 404, in _send_bytes
    self._send(header + buf)
  File "/root/.pyenv/versions/3.6.2/lib/python3.6/multiprocessing/connection.py", line 368, in _send
    n = write(self._handle, buf)
BrokenPipeError: [Errno 32] Broken pipe
make: *** [create_output] Error 9
```
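For context, this EOFError/BrokenPipeError pattern is what `multiprocessing` typically raises when a worker process dies abruptly (for example, OOM-killed) while the other side is still reading from or writing to its connection. A minimal sketch of that failure mode, not Prosit code (the `demo_dead_worker` helper is hypothetical, POSIX-only):

```python
import multiprocessing as mp
import os
import signal


def _worker(conn):
    # Simulate a worker that is killed (e.g. by the OOM killer)
    # before it can send anything back over the pipe.
    os.kill(os.getpid(), signal.SIGKILL)


def demo_dead_worker():
    """Return the name of the exception raised when reading from a dead worker."""
    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=_worker, args=(child_conn,))
    p.start()
    p.join()
    child_conn.close()  # drop our copy of the write end so recv() sees EOF
    try:
        parent_conn.recv()  # the worker died without sending anything
    except EOFError as e:
        return type(e).__name__


if __name__ == "__main__":
    print(demo_dead_worker())  # EOFError
```

The symmetric case, writing into a pipe whose reader is gone, raises `BrokenPipeError`, which matches the second traceback above.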

Help with resolving this issue would be greatly appreciated.

Thank you!

WassimG commented 2 years ago

Hi,

The issue still persists: the generated library is huge, and the job failed because it ran out of memory. The output file is already 12 GB without everything having been processed. Split your file further, e.g. split this 36 MB file into two, and it should work fine.
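The suggested workaround can be scripted. A minimal sketch for splitting an input file into smaller chunks while repeating the header line in each part (the `split_csv` helper and the example column names are illustrative, not part of Prosit; this assumes the input is a CSV whose first line is a header):

```python
def split_csv(path, n_parts=2):
    """Split a CSV into n_parts smaller files, repeating the header in each.

    Returns the list of paths written, named <path>.part1.csv, <path>.part2.csv, ...
    """
    with open(path) as f:
        header = f.readline()
        rows = f.readlines()
    chunk = -(-len(rows) // n_parts)  # ceiling division
    out_paths = []
    for i in range(n_parts):
        part = rows[i * chunk:(i + 1) * chunk]
        if not part:
            break  # fewer rows than requested parts
        out = f"{path}.part{i + 1}.csv"
        with open(out, "w") as g:
            g.write(header)
            g.writelines(part)
        out_paths.append(out)
    return out_paths
```

Each part can then be uploaded as a separate Prosit task and the resulting libraries concatenated afterwards.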