jtpoirier / proteomegenerator

Proteome Generator
MIT License

unrecognized parameter name "twopassMode" #9

Open ldudley7 opened 4 years ago

ldudley7 commented 4 years ago

Hi, I am getting this error even though I checked the manual and the twopassMode setup looks correct. Do I need a specific version of STAR installed?

Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 1
Rules claiming more threads will be scaled down.
Job counts:
    count	jobs
    1	BuildBamIndex
    1	LongOrfs
    1	Predict
    1	STAR_denovo
    1	StringTie_denovo
    1	UCSC_denovo
    1	all
    1	blastp
    1	cdna_alignment_orf_to_genome_orf
    1	filter
    1	gff3_file_to_proteins
    1	gtf_file_to_cDNA_seqs
    1	gtf_to_alignment_gff3
    1	makeblastdb
    1	maxQuant
    1	merge
    1	mqpar_conversion
    1	reorderFASTA
    18

[Mon Jul 20 18:50:28 2020]
rule STAR_denovo:
    input: /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_1.fastq.gz, /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_2.fastq.gz, /Users/lindseydudley/Desktop/Wiita_Lab/PG
    output: out/SRR6425178.Aligned.sortedByCoord.out.bam
    log: out/logs/SRR6425178.align.txt
    jobid: 17
    benchmark: out/benchmarks/SRR6425178.align.json
    wildcards: sample=SRR6425178

EXITING: FATAL INPUT ERROR: unrecoginzed parameter name "twopassMode" in input "Command-Line-Initial"
SOLUTION: use correct parameter name (check the manual)

Jul 20 18:50:28 ...... FATAL ERROR, exiting
[Mon Jul 20 18:50:28 2020]
Error in rule STAR_denovo:
    jobid: 17
    output: out/SRR6425178.Aligned.sortedByCoord.out.bam
    log: out/logs/SRR6425178.align.txt

RuleException:
CalledProcessError in line 99 of /Users/lindseydudley/proteomegenerator/Snakefile-test:
Command ' set -euo pipefail; STAR --genomeDir /Users/lindseydudley/Desktop/Wiita_Lab/PG --readFilesIn /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_1.fastq.gz,/Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_2.fastq.gz --outFileNamePrefix out/SRR6425178. --outSAMattributes NH HI XS --outSAMattrRGline ID:SRR6425178 LB:1 PL:illumina PU:1 SM:SRR6425178 --runThreadN 12 --outSAMtype BAM SortedByCoordinate --clip3pAdapterSeq AGATCGGAAGAG --readFilesCommand zcat --twopassMode Basic --outSAMstrandField intronMotif --outFilterIntronMotifs None --outReadsUnmapped None --chimSegmentMin 15 --chimJunctionOverhangMin 15 --alignMatesGapMax 1000000 --alignIntronMax 1000000 --outFilterType Normal --alignSJDBoverhangMin 1 --alignSJoverhangMin 8 --outFilterMismatchNmax 1 --outSJfilterReads Unique --outFilterMultimapNmax 10 --sjdbOverhang 100 > out/logs/SRR6425178.align.txt ' returned non-zero exit status 102.
  File "/Users/lindseydudley/proteomegenerator/Snakefile-test", line 99, in __rule_STAR_denovo
  File "/opt/anaconda3/envs/snakemake/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /Users/lindseydudley/proteomegenerator/.snakemake/log/2020-07-20T185028.637764.snakemake.log

jtpoirier commented 4 years ago

It looks like the two-pass alignment parameters for STAR have changed between versions. In Cifani et al., we used version 2.5.2a.
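For anyone hitting the same error, the sketch below shows one way to check which STAR build the pipeline is picking up and to pin the release named above. It assumes you manage STAR through conda and that the bioconda channel carries a 2.5.2a build; adjust to your own setup.

    # print the STAR version currently on PATH
    STAR --version
    # install the release used in Cifani et al. (channel and version string are assumptions)
    conda install -c bioconda star=2.5.2a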

ldudley7 commented 4 years ago

Thank you so much! Downloading that STAR version got me through the STAR step. I am now running into trouble with the next step, which uses Picard. I am running with the --use-conda flag and a conda environment seems to be created, but I'm not sure whether all the dependencies were installed into it. Can you please advise?

(snakemake) Lindseys-MBP:proteomegenerator lindseydudley$ snakemake --snakefile Snakefile-test --use-conda
Building DAG of jobs...
Using shell: /bin/bash
Provided cores: 1
Rules claiming more threads will be scaled down.
Job counts:
    count	jobs
    1	BuildBamIndex
    1	LongOrfs
    1	Predict
    1	STAR_denovo
    1	StringTie_denovo
    1	UCSC_denovo
    1	all
    1	blastp
    1	cdna_alignment_orf_to_genome_orf
    1	filter
    1	gff3_file_to_proteins
    1	gtf_file_to_cDNA_seqs
    1	gtf_to_alignment_gff3
    1	maxQuant
    1	merge
    1	mqpar_conversion
    1	reorderFASTA
    17

[Tue Jul 21 11:32:50 2020]
rule STAR_denovo:
    input: /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_1.fastq.gz, /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_2.fastq.gz, /Users/lindseydudley/Desktop/Wiita_Lab/PG
    output: out/SRR6425178.Aligned.sortedByCoord.out.bam
    log: out/logs/SRR6425178.align.txt
    jobid: 17
    benchmark: out/benchmarks/SRR6425178.align.json
    wildcards: sample=SRR6425178

Activating conda environment: /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082
zcat: can't stat: /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_1.fastq.gz (/Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_1.fastq.gz.Z): No such file or directory
zcat: can't stat: /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_2.fastq.gz (/Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_2.fastq.gz.Z): No such file or directory
zcat: can't stat: /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_1.fastq.gz (/Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_1.fastq.gz.Z): No such file or directory
zcat: can't stat: /Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_2.fastq.gz (/Users/lindseydudley/Desktop/Wiita_Lab/PG/SRR6425178_2.fastq.gz.Z): No such file or directory
[Tue Jul 21 11:33:11 2020] Finished job 17.
1 of 17 steps (6%) done

[Tue Jul 21 11:33:11 2020]
rule filter:
    input: out/SRR6425178.Aligned.sortedByCoord.out.bam
    output: out/SRR6425178.Aligned.trimmed.out.bam
    log: out/logs/SRR6425178.filter.txt
    jobid: 15
    benchmark: out/benchmarks/SRR6425178.filter.txt
    wildcards: sample=SRR6425178

Activating conda environment: /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082
Removing temporary output file out/SRR6425178.Aligned.sortedByCoord.out.bam.
[Tue Jul 21 11:33:11 2020] Finished job 15.
2 of 17 steps (12%) done

[Tue Jul 21 11:33:11 2020]
rule BuildBamIndex:
    input: out/SRR6425178.Aligned.trimmed.out.bam
    output: out/SRR6425178.Aligned.trimmed.out.bai
    log: out/logs/SRR6425178.BuildBamIndex.txt
    jobid: 16
    benchmark: out/benchmarks/SRR6425178.BuildBamIndex.txt
    wildcards: sample=SRR6425178

Activating conda environment: /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082
[Tue Jul 21 11:33:13 2020]
Error in rule BuildBamIndex:
    jobid: 16
    output: out/SRR6425178.Aligned.trimmed.out.bai
    log: out/logs/SRR6425178.BuildBamIndex.txt
    conda-env: /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082

RuleException:
CalledProcessError in line 33 of /Users/lindseydudley/proteomegenerator/pgm:
Command 'source activate /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082; set -euo pipefail; picard BuildBamIndex INPUT=out/SRR6425178.Aligned.trimmed.out.bam 2> out/logs/SRR6425178.BuildBamIndex.txt ' returned non-zero exit status 1.
  File "/Users/lindseydudley/proteomegenerator/pgm", line 33, in __rule_BuildBamIndex
  File "/opt/anaconda3/envs/snakemake/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /Users/lindseydudley/proteomegenerator/.snakemake/log/2020-07-21T113250.121094.snakemake.log
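As a side note on the dependency question above: a quick way to see what conda actually installed into the auto-created environment (path taken from the log above) is sketched below. conda list --prefix is standard conda, but whether the environment contains everything the pipeline needs still depends on the environment definition files shipped with the repository.

    # list the packages installed in the Snakemake-managed environment
    conda list --prefix /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082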

jtpoirier commented 4 years ago

Do you have the log files for the BuildBamIndex rules that are failing?

ldudley7 commented 4 years ago

The complete log is in my previous comment, but I have copied the specific section about the BAM files below.

[Tue Jul 21 11:33:11 2020]
rule BuildBamIndex:
    input: out/SRR6425178.Aligned.trimmed.out.bam
    output: out/SRR6425178.Aligned.trimmed.out.bai
    log: out/logs/SRR6425178.BuildBamIndex.txt
    jobid: 16
    benchmark: out/benchmarks/SRR6425178.BuildBamIndex.txt
    wildcards: sample=SRR6425178

Activating conda environment: /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082
[Tue Jul 21 11:33:13 2020]
Error in rule BuildBamIndex:
    jobid: 16
    output: out/SRR6425178.Aligned.trimmed.out.bai
    log: out/logs/SRR6425178.BuildBamIndex.txt
    conda-env: /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082

RuleException:
CalledProcessError in line 33 of /Users/lindseydudley/proteomegenerator/pgm:
Command 'source activate /Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082; set -euo pipefail; picard BuildBamIndex INPUT=out/SRR6425178.Aligned.trimmed.out.bam 2> out/logs/SRR6425178.BuildBamIndex.txt ' returned non-zero exit status 1.
  File "/Users/lindseydudley/proteomegenerator/pgm", line 33, in __rule_BuildBamIndex
  File "/opt/anaconda3/envs/snakemake/lib/python3.6/concurrent/futures/thread.py", line 56, in run
Shutting down, this might take some time.
Exiting because a job execution failed. Look above for error message
Complete log: /Users/lindseydudley/proteomegenerator/.snakemake/log/2020-07-21T113250.121094.snakemake.log

jtpoirier commented 4 years ago

Each job has its own log file specified in the rule. Can you check the output of out/logs/SRR6425178.BuildBamIndex.txt?

ldudley7 commented 4 years ago

Thank you for the clarification. This is the file you were referring to:

11:33:12.832 INFO NativeLibraryLoader - Loading libgkl_compression.dylib from jar:file:/Users/lindseydudley/Desktop/Wiita_Lab/PG/.snakemake/conda/261e0082/share/picard-2.18.7-2/picard.jar!/com/intel/gkl/native/libgkl_compression.dylib
[Tue Jul 21 11:33:13 PDT 2020] BuildBamIndex INPUT=out/SRR6425178.Aligned.trimmed.out.bam VERBOSITY=INFO QUIET=false VALIDATION_STRINGENCY=STRICT COMPRESSION_LEVEL=5 MAX_RECORDS_IN_RAM=500000 CREATE_INDEX=false CREATE_MD5_FILE=false GA4GH_CLIENT_SECRETS=client_secrets.json USE_JDK_DEFLATER=false USE_JDK_INFLATER=false
[Tue Jul 21 11:33:13 PDT 2020] Executing as lindseydudley@Lindseys-MBP.attlocal.net on Mac OS X 10.15.6 x86_64; OpenJDK 64-Bit Server VM 1.8.0_152-release-1056-b12; Deflater: Intel; Inflater: Intel; Provider GCS is not available; Picard version: 2.18.7-SNAPSHOT
[Tue Jul 21 11:33:13 PDT 2020] picard.sam.BuildBamIndex done. Elapsed time: 0.00 minutes.
Runtime.totalMemory()=514850816
To get help, see http://broadinstitute.github.io/picard/index.html#GettingHelp
Exception in thread "main" htsjdk.samtools.SAMException: Input bam file must be sorted by coordinate
    at picard.sam.BuildBamIndex.doWork(BuildBamIndex.java:145)
    at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:282)
    at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:103)
    at picard.cmdline.PicardCommandLine.main(PicardCommandLine.java:113)

jtpoirier commented 4 years ago

Picard BuildBamIndex requires the input file to be sorted by coordinate. STAR accomplishes this with the flag '--outSAMtype BAM SortedByCoordinate'. You might want to check whether your intermediate file actually is sorted by coordinate.
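For concreteness, one way to check the sort order is to inspect the BAM header (samtools is not part of this thread, so this is just an illustrative sketch):

    # a coordinate-sorted BAM reports SO:coordinate on its @HD header line
    samtools view -H out/SRR6425178.Aligned.trimmed.out.bam | head -n 1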

ldudley7 commented 4 years ago

When I open the out folder, I see the file name, but there doesn't appear to be anything in the file. Do you have any suggestions for troubleshooting this issue?

ldudley7 commented 4 years ago

I am trying to run the program in de novo mode and have commented out the GTF file, in case that helps with troubleshooting.

jtpoirier commented 4 years ago

You can edit the Snakefile to remove the temp() directive around the output of any rule in order to save intermediate files in the pipeline.
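As an illustration, here is a hypothetical rule output with and without the temp() wrapper; the filename is taken from the log above, but the exact rule in the pipeline's Snakefile may differ.

    # with temp(), Snakemake deletes the file once all downstream jobs have consumed it
    output: temp("out/{sample}.Aligned.sortedByCoord.out.bam")

    # without temp(), the intermediate BAM stays on disk
    output: "out/{sample}.Aligned.sortedByCoord.out.bam"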

ldudley7 commented 4 years ago

I tried doing that, but when the job fails it says that it removes the output file because it might be corrupted. Is there any way to get around this?

jtpoirier commented 4 years ago

Sure, you can protect a temporary file:

https://snakemake.readthedocs.io/en/stable/snakefiles/rules.html#protected-and-temporary-files
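Per that documentation page, a minimal sketch of marking an output as protected looks like the line below (placeholder filename); protected() tells Snakemake to write-protect the file after the rule completes successfully.

    output: protected("out/{sample}.Aligned.sortedByCoord.out.bam")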

ldudley7 commented 4 years ago

When I mark it as a protected file, the file name is still there, but there does not appear to be anything in it.

jtpoirier commented 4 years ago

That means something is not working correctly with the STAR alignment step. Try checking the logs for that rule. You can also make sure that STAR_GTF is running the command exactly as you would expect by running the pipeline with --printshellcmds -n. This will show you the commands as they will be run, including filenames, etc.
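In this thread's setup, that would look roughly like the command below; --printshellcmds and -n (dry run) are standard Snakemake flags, and the snakefile name is the one used earlier in the thread.

    # print the shell commands Snakemake would run, without executing them
    snakemake --snakefile Snakefile-test --use-conda --printshellcmds -n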
