nf-core / proteomicslfq

Proteomics label-free quantification (LFQ) analysis pipeline
https://nf-co.re/proteomicslfq
MIT License

Error executing process > 'search_engine_comet (1)' #193

Open BenSamy2020 opened 1 year ago

BenSamy2020 commented 1 year ago

Greetings,

I am new to nf-core and am currently trying to set up the nf-core/proteomicslfq pipeline on my local PC. Unfortunately, I am running into an error with the Comet search engine. I have attached the error log below for your reference. Please let me know how to troubleshoot this error.

Thank you in advance for your assistance.

Regards, Ben

nextflow.log

jpfeuffer commented 1 year ago

Hi!

Can you also upload the log of the job? You will find it in the corresponding hashed work folder: /mnt/d/nf-core/proteomicslfq/work/a9/b7aee795829399904f455ea0ba106a
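In case it helps, here is a rough sketch of how to pull the relevant files out of that task folder (the .command.* files are the standard hidden files Nextflow writes for every task; the path is just the one from above):

```bash
# Inspect the hidden per-task files Nextflow leaves in the hashed work folder
cd /mnt/d/nf-core/proteomicslfq/work/a9/b7aee795829399904f455ea0ba106a

ls -la            # shows the .command.* files and .exitcode
cat .command.sh   # the exact command that was executed
cat .command.log  # combined stdout/stderr of the task
cat .command.err  # stderr only
```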

BenSamy2020 commented 1 year ago

Greetings @jpfeuffer,

Thank you for your assistance. Please refer to the attachments for the required files.

Regards, Ben

20221102_PTB_DMSO_B1_comet.log

jpfeuffer commented 1 year ago

That looks bad. Unfortunately, it is the underlying tool that crashes. You might need to file a report, including your inputs, with the developers of that tool.

jpfeuffer commented 1 year ago

Does the test data work for you?
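If you have not run it yet, the pipeline ships a small test profile that can be run with something like this (assuming Docker is available):

```bash
# Run the pipeline's bundled test dataset with the Docker profile
nextflow run nf-core/proteomicslfq -r 1.0.0 -profile test,docker
```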

jpfeuffer commented 1 year ago

By the way, why do you add two file endings to your database? That is a common source of errors.

BenSamy2020 commented 1 year ago

Greetings @jpfeuffer,

The test data analysis completed successfully.

I changed my protein database extension to .fasta and searched it with Comet, but the search still failed. However, when I switched to MSGF+, the search completed successfully. I suspect it might be a Comet-specific problem.

However, after searching with MSGF+, I ran into an insufficient-memory error during the proteomicslfq processing step. Do you by any chance know how to increase the memory limit from 64 GB to 100 GB? (My machine has 128 GB of RAM.)

Regards, Ben

BenSamy2020 commented 1 year ago

[screenshot of the error]

BenSamy2020 commented 1 year ago

Greetings @jpfeuffer,

I tried adding --max_memory 128.GB, but I am still facing the above issue. Example command: nextflow run nf-core/proteomicslfq -r 1.0.0 -name DMSO_Drug -profile docker -work-dir /mnt/d/nf-core/proteomicslfq -params-file /mnt/d/nf-core/proteomicslfq/nf-params.json --max_memory 128.GB

I would really like to establish this proteomics pipeline at my new institute; please advise me on how to troubleshoot this.

Regards, Ben

jpfeuffer commented 1 year ago

You actually have to decrease the maximum memory that the jobs are allowed to request. Apparently, Nextflow thinks you only have 62.7 GB of RAM available. You might also be able to get more help with configuring for different hardware on the Nextflow Slack.
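For illustration, something along these lines could work (untested sketch; the process name 'proteomicslfq' and the 60.GB cap are assumptions you would need to adapt to whichever step fails in your log):

```bash
# Hypothetical custom config capping the memory a single process may request;
# the process name below is a placeholder for the step that fails in your run.
cat > lowmem.config <<'EOF'
process {
    withName: 'proteomicslfq' {
        memory = 60.GB
    }
}
EOF

# Pass it with -c and keep --max_memory below what Nextflow detects (~62 GB here)
nextflow run nf-core/proteomicslfq -r 1.0.0 -profile docker \
    -params-file /mnt/d/nf-core/proteomicslfq/nf-params.json \
    --max_memory 60.GB \
    -c lowmem.config
```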

By the way, the pipeline is usually run with at least two files. There might be something wrong with the settings/input if you actually have more files; your log only shows one job for each step.