icbi-lab / nextNEOpi

nextNEOpi: a comprehensive pipeline for computational neoantigen prediction

NeoFuse: long running time #55

Closed fredsamhaak closed 1 year ago

fredsamhaak commented 1 year ago

Hi @riederd, I'm opening a new issue here for the NeoFuse step. It has been running for more than 8 days and is still running. I tried different ways to figure it out but had no luck. Could you please help me with this? Thank you very much! Here is the log file you might need: nextflow.log

Really looking forward to hearing from you.

fredsamhaak commented 1 year ago

Hi, I copied the whole Singularity directory from where I ran the test dataset into the work directory where I run the larger dataset (before, I had soft-linked the Singularity directory instead of copying it) and reran the pipeline. This time the NeoFuse step took about 1 h 16 min and completed successfully. I don't know if the problem was caused by the soft link.

riederd commented 1 year ago

Hi, I'm not sure I understand what exactly you did to get it running, sorry for that.

Anyway, no idea why it got stuck. Could you see any process (e.g. STAR) running with top? Did the work dir of NeoFuse contain any output? There should be a LOGS dir with the logs from NeoFuse.

fredsamhaak commented 1 year ago

Sorry, I didn't explain clearly what I had done.

Actually, I ran the whole pipeline with the test data you provided, and it completed successfully. Then I created another directory (to run the whole pipeline on a larger dataset) and soft-linked (ln -s) the necessary 'resources' and 'tools' (e.g. the Singularity images) from the 'test data' directory, so that the pipeline wouldn't have to download them again. But as you know, it didn't work until I copied these 'necessities' instead of soft-linking them. I hope I have made it clear this time.
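To make the setup concrete, here is a sketch of the two approaches; the directory names are made up, standing in for the real directories that hold the nextNEOpi resources and Singularity images:

```shell
# Made-up paths; "test_run" stands for the directory of the successful test run.
mkdir -p test_run/resources larger_run
echo "reference data" > test_run/resources/db.txt

# First attempt: soft-link the resources into the new run directory.
ln -s "$(pwd)/test_run/resources" larger_run/resources
ls -l larger_run/resources        # at this point it is a symlink, not a real directory

# What eventually worked: replace the link with a real copy.
rm larger_run/resources
cp -r test_run/resources larger_run/resources
```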

From the LOGS, steps like STAR completed successfully while MHCFlurry didn't. *_10_MHCFlurry.log showed:

```
./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: 1868770931: ./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: Can't open /dev/null
./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: 22063: ./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: Can't open /dev/null
./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: 1129073746: ./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: Can't open /dev/null
./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: 828597096: ./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: Can't open /dev/null
./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: 0: ./hcc1395/NeoFuse/tmp/MHC_I/hcc1395_10_TEST_OUT.sh: Can't open /dev/null
```

(the "Can't open /dev/null" message is repeated many more times, with varying line numbers, and the messages are partly run together in the raw log)

And it is a bit strange that _MHCFlurry.log says: `/usr/local/bin/NeoFuse_single: line 559: /bin/sh: No such file or directory`

Here is the LOGS: LOGS.tar.gz

Is it possible that the soft link (ln -s) caused this problem?

riederd commented 1 year ago

Hi,

it may well be that this is caused by your symlinking.

If you want to reuse the resources (databases and so on), you just need to modify the resourcesBaseDir parameter in conf/params.config at https://github.com/icbi-lab/nextNEOpi/blob/fe7b21cdc0b97aae38195e5f5ac1b9851674f6b1/conf/params.config#L15 to e.g.:

```
resourcesBaseDir = "/data/my_first/nextNEOpi/resources"
```
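In context, the relevant part of conf/params.config would then look something like this (the path is just an example, and I'm assuming the file keeps a `params { }` scope as in standard Nextflow config files):

```groovy
// conf/params.config (excerpt) -- example path, adjust to your setup
params {
    resourcesBaseDir = "/data/my_first/nextNEOpi/resources"
}
```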

To reuse the Singularity containers, you simply need to activate caching for Singularity by setting the Nextflow NXF_SINGULARITY_CACHEDIR environment variable, e.g.:

```
export NXF_SINGULARITY_CACHEDIR=/data/scratch/singularity_cache
```

see also https://www.nextflow.io/docs/latest/singularity.html
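Put together, a subsequent run could look like the following sketch (the cache path is just an illustration; any persistent directory works, and the profile name is an assumption about your setup):

```shell
# Choose a persistent directory for the Singularity image cache (example path).
export NXF_SINGULARITY_CACHEDIR=/tmp/singularity_cache
mkdir -p "$NXF_SINGULARITY_CACHEDIR"

# With the cache dir set, Nextflow reuses images pulled on earlier runs
# instead of downloading them again, e.g.:
#   nextflow run nextNEOpi.nf -profile singularity -resume
echo "cache dir: $NXF_SINGULARITY_CACHEDIR"
```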