TrinityCTAT / ctat-mutations

Mutation detection using GATK4 best practices and the latest RNA-editing filter resources. Works with both hg38 and hg19.
https://github.com/TrinityCTAT/ctat-mutations

Could not run the test sample using ctat-mutations #111

Open DecodeGenome opened 1 year ago

DecodeGenome commented 1 year ago

Hi,

I tried to run ctat-mutations on your test samples on our server using the Singularity script below, and got the error that follows. Our server admin says the error comes from the Singularity container. Could you please help me figure out how to solve this problem?

Thanks,

Wei

################# Script #################

singularity exec -e \
  -B /wynton/group/bivona/RNAseq_CTAT_Mutations:/data \
  -B /wynton/group/bivona/Analytic_tools/GRCh37_gencode_v19_CTAT_lib_Mar012021.plug-n-play/ctat_genome_lib_build_dir:/ctat_genome_lib_dir:ro \
  /wynton/group/bivona/RNAseq_CTAT_Mutations/ctat_mutations.v3.2.1.simg \
  /usr/local/src/ctat-mutations/ctat_mutations \
  --left /data/reads_1.fastq.gz \
  --right /data/reads_2.fastq.gz \
  --sample_id test \
  --outputdir /data/example.HC.simg_Oct4.singularity \
  --cpu 10 \
  --genome_lib_dir /ctat_genome_lib_dir \
  --boosting_method none

################# Error #################

index: failed to open
Traceback (most recent call last):
  File "", line 7, in
  File "/opt/conda/lib/python3.7/subprocess.py", line 347, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['bcftools', 'index', '']' returned non-zero exit status 255.
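The traceback shows `bcftools index` being invoked with an empty string as the file path, which means an upstream step produced no VCF to index. A minimal sketch of the failure pattern and a guard against it (`index_vcf` is a hypothetical helper for illustration, not ctat-mutations code):

```python
import subprocess


def index_vcf(vcf_path: str) -> None:
    """Index a VCF with bcftools, failing fast on an empty path.

    When vcf_path is empty, the call becomes ['bcftools', 'index', ''],
    bcftools exits with status 255 ("index: failed to open"), and
    subprocess.check_call raises CalledProcessError -- exactly the
    traceback seen above.
    """
    if not vcf_path:
        # Fail with a clear message instead of letting bcftools exit 255.
        raise ValueError("empty VCF path: an upstream step likely produced no output")
    subprocess.check_call(["bcftools", "index", vcf_path])
```

So the CalledProcessError is a downstream symptom; the earlier parts of the run log are the place to look for the step that silently produced nothing.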

brianjohnhaas commented 1 year ago

Hi,

That is peculiar. Were there any earlier error messages encountered that might shed some more insight?

best,

~b


--

Brian J. Haas The Broad Institute http://broadinstitute.org/~bhaas http://broad.mit.edu/~bhaas

DecodeGenome commented 1 year ago

Hi Brian,

I just ran the same script in a server terminal and saved all of the output to a text file (see attachment) for your inspection.

Hope this will help.

Thanks for your support,

Wei



17:19:16 : INFO : CMD: java -Djava.io.tmpdir=/data/example.singularity/__tmpdir -Dconfig.file=/usr/local/src/ctat-mutations/WDL/local_provider_config.inc.conf -jar /usr/local/src/ctat-mutations/WDL/cromwell-58.jar run -i /tmp/tmp13ca1t85.json -m /tmp/tmp8humz7mv.json /usr/local/src/ctat-mutations/WDL/ctat_mutations.wdl
[2022-10-12 00:19:19,17] [info] Running with database db.url = jdbc:hsqldb:file:cromwell-executions/cromwell-db/cromwell-db; shutdown=false; hsqldb.default_table_type=cached;hsqldb.tx=mvcc; hsqldb.result_max_memory_rows=10000; hsqldb.large_data=true; hsqldb.applog=1; hsqldb.lob_compressed=true; hsqldb.script_format=3

[2022-10-12 00:19:19,63] [info] dataFileCache open start
[2022-10-12 00:19:19,70] [info] dataFileCache open end
[2022-10-12 00:19:19,93] [info] checkpointClose start
[2022-10-12 00:19:19,93] [info] checkpointClose synched
[2022-10-12 00:19:20,00] [info] checkpointClose script done
[2022-10-12 00:19:20,00] [info] dataFileCache commit start
[2022-10-12 00:19:20,16] [info] dataFileCache commit end
[2022-10-12 00:19:20,26] [info] checkpointClose end
[2022-10-12 00:19:26,71] [info] Checkpoint start
[2022-10-12 00:19:26,71] [info] checkpointClose start
[2022-10-12 00:19:26,71] [info] checkpointClose synched
[2022-10-12 00:19:26,84] [info] checkpointClose script done
[2022-10-12 00:19:26,84] [info] dataFileCache commit start
[2022-10-12 00:19:26,88] [info] dataFileCache commit end
[2022-10-12 00:19:27,02] [info] checkpointClose end
[2022-10-12 00:19:27,03] [info] Checkpoint end - txts: 20419
[2022-10-12 00:19:27,20] [info] Checkpoint start
[2022-10-12 00:19:27,20] [info] checkpointClose start
[2022-10-12 00:19:27,24] [info] checkpointClose synched
[2022-10-12 00:19:27,27] [info] checkpointClose script done
[2022-10-12 00:19:27,28] [info] dataFileCache commit start
[2022-10-12 00:19:27,35] [info] dataFileCache commit end
[2022-10-12 00:19:27,55] [info] checkpointClose end
[2022-10-12 00:19:27,55] [info] Checkpoint end - txts: 20435
[2022-10-12 00:19:27,55] [info] Checkpoint start
[2022-10-12 00:19:27,55] [info] checkpointClose start
[2022-10-12 00:19:27,56] [info] checkpointClose synched
[2022-10-12 00:19:27,62] [info] checkpointClose script done
[2022-10-12 00:19:27,62] [info] dataFileCache commit start
[2022-10-12 00:19:27,65] [info] dataFileCache commit end
[2022-10-12 00:19:27,75] [info] checkpointClose end
[2022-10-12 00:19:27,76] [info] Checkpoint end - txts: 20437
[2022-10-12 00:19:29,39] [info] Checkpoint start
[2022-10-12 00:19:29,39] [info] checkpointClose start
[2022-10-12 00:19:29,39] [info] checkpointClose synched
[2022-10-12 00:19:29,43] [info] checkpointClose script done
[2022-10-12 00:19:29,43] [info] dataFileCache commit start
[2022-10-12 00:19:29,48] [info] dataFileCache commit end
[2022-10-12 00:19:29,54] [info] checkpointClose end
[2022-10-12 00:19:29,54] [info] Checkpoint end - txts: 20460
[2022-10-12 00:19:29,54] [info] Checkpoint start
[2022-10-12 00:19:29,54] [info] checkpointClose start
[2022-10-12 00:19:29,54] [info] checkpointClose synched
[2022-10-12 00:19:29,58] [info] checkpointClose script done
[2022-10-12 00:19:29,58] [info] dataFileCache commit start
[2022-10-12 00:19:29,77] [info] dataFileCache commit end
[2022-10-12 00:19:29,92] [info] checkpointClose end
[2022-10-12 00:19:29,92] [info] Checkpoint end - txts: 20462
[2022-10-12 00:19:30,08] [info] Checkpoint start
[2022-10-12 00:19:30,08] [info] checkpointClose start
[2022-10-12 00:19:30,08] [info] checkpointClose synched
[2022-10-12 00:19:30,12] [info] checkpointClose script done
[2022-10-12 00:19:30,12] [info] dataFileCache commit start
[2022-10-12 00:19:30,15] [info] dataFileCache commit end
[2022-10-12 00:19:30,28] [info] checkpointClose end
[2022-10-12 00:19:30,29] [info] Checkpoint end - txts: 20464
[2022-10-12 00:19:30,32] [info] Checkpoint start
[2022-10-12 00:19:30,32] [info] checkpointClose start
[2022-10-12 00:19:30,36] [info] checkpointClose synched
[2022-10-12 00:19:30,41] [info] checkpointClose script done
[2022-10-12 00:19:30,41] [info] dataFileCache commit start
[2022-10-12 00:19:30,60] [info] dataFileCache commit end
[2022-10-12 00:19:30,68] [info] checkpointClose end
[2022-10-12 00:19:30,68] [info] Checkpoint end - txts: 20471
[2022-10-12 00:19:30,68] [info] Checkpoint start
[2022-10-12 00:19:30,68] [info] checkpointClose start
[2022-10-12 00:19:30,68] [info] checkpointClose synched
[2022-10-12 00:19:30,73] [info] checkpointClose script done
[2022-10-12 00:19:30,73] [info] dataFileCache commit start
[2022-10-12 00:19:30,76] [info] dataFileCache commit end
[2022-10-12 00:19:30,82] [info] checkpointClose end
[2022-10-12 00:19:30,82] [info] Checkpoint end - txts: 20473
[2022-10-12 00:19:30,82] [info] Checkpoint start
[2022-10-12 00:19:30,82] [info] checkpointClose start
[2022-10-12 00:19:30,82] [info] checkpointClose synched
[2022-10-12 00:19:30,86] [info] checkpointClose script done
[2022-10-12 00:19:30,86] [info] dataFileCache commit start
[2022-10-12 00:19:30,90] [info] dataFileCache commit end
[2022-10-12 00:19:30,96] [info] checkpointClose end
[2022-10-12 00:19:30,96] [info] Checkpoint end - txts: 20475
[2022-10-12 00:19:30,99] [info] Running with database db.url = jdbc:hsqldb:file:cromwell-executions/cromwell-db/cromwell-db; shutdown=false; hsqldb.default_table_type=cached;hsqldb.tx=mvcc; hsqldb.result_max_memory_rows=10000; hsqldb.large_data=true; hsqldb.applog=1; hsqldb.lob_compressed=true; hsqldb.script_format=3

[2022-10-12 00:19:31,04] [info] Checkpoint start
[2022-10-12 00:19:31,04] [info] checkpointClose start
[2022-10-12 00:19:31,04] [info] checkpointClose synched
[2022-10-12 00:19:31,13] [info] checkpointClose script done
[2022-10-12 00:19:31,14] [info] dataFileCache commit start
[2022-10-12 00:19:31,17] [info] dataFileCache commit end
[2022-10-12 00:19:31,24] [info] checkpointClose end
[2022-10-12 00:19:31,24] [info] Checkpoint end - txts: 20483
[2022-10-12 00:19:31,29] [info] Checkpoint start
[2022-10-12 00:19:31,29] [info] checkpointClose start
[2022-10-12 00:19:31,32] [info] checkpointClose synched
[2022-10-12 00:19:31,37] [info] checkpointClose script done
[2022-10-12 00:19:31,37] [info] dataFileCache commit start
[2022-10-12 00:19:31,47] [info] dataFileCache commit end
[2022-10-12 00:19:31,54] [info] checkpointClose end
[2022-10-12 00:19:31,54] [info] Checkpoint end - txts: 20499
[2022-10-12 00:19:31,54] [info] Checkpoint start
[2022-10-12 00:19:31,54] [info] checkpointClose start
[2022-10-12 00:19:31,54] [info] checkpointClose synched
[2022-10-12 00:19:31,61] [info] checkpointClose script done
[2022-10-12 00:19:31,61] [info] dataFileCache commit start
[2022-10-12 00:19:31,63] [info] dataFileCache commit end
[2022-10-12 00:19:31,70] [info] checkpointClose end
[2022-10-12 00:19:31,70] [info] Checkpoint end - txts: 20501
[2022-10-12 00:19:31,85] [info] Checkpoint start
[2022-10-12 00:19:31,85] [info] checkpointClose start
[2022-10-12 00:19:31,85] [info] checkpointClose synched
[2022-10-12 00:19:31,91] [info] checkpointClose script done
[2022-10-12 00:19:31,91] [info] dataFileCache commit start
[2022-10-12 00:19:31,97] [info] dataFileCache commit end
[2022-10-12 00:19:32,03] [info] checkpointClose end
[2022-10-12 00:19:32,03] [info] Checkpoint end - txts: 20524
[2022-10-12 00:19:32,03] [info] Checkpoint start
[2022-10-12 00:19:32,03] [info] checkpointClose start
[2022-10-12 00:19:32,03] [info] checkpointClose synched
[2022-10-12 00:19:32,09] [info] checkpointClose script done
[2022-10-12 00:19:32,09] [info] dataFileCache commit start
[2022-10-12 00:19:32,12] [info] dataFileCache commit end
[2022-10-12 00:19:32,21] [info] checkpointClose end
[2022-10-12 00:19:32,21] [info] Checkpoint end - txts: 20526
[2022-10-12 00:19:32,23] [info] Checkpoint start
[2022-10-12 00:19:32,23] [info] checkpointClose start
[2022-10-12 00:19:32,23] [info] checkpointClose synched
[2022-10-12 00:19:32,27] [info] checkpointClose script done
[2022-10-12 00:19:32,27] [info] dataFileCache commit start
[2022-10-12 00:19:32,30] [info] dataFileCache commit end
[2022-10-12 00:19:32,36] [info] checkpointClose end
[2022-10-12 00:19:32,36] [info] Checkpoint end - txts: 20528
[2022-10-12 00:19:32,41] [info] Checkpoint start
[2022-10-12 00:19:32,41] [info] checkpointClose start
[2022-10-12 00:19:32,46] [info] checkpointClose synched
[2022-10-12 00:19:32,53] [info] checkpointClose script done
[2022-10-12 00:19:32,53] [info] dataFileCache commit start
[2022-10-12 00:19:32,62] [info] dataFileCache commit end
[2022-10-12 00:19:32,75] [info] checkpointClose end
[2022-10-12 00:19:32,76] [info] Checkpoint end - txts: 20535
[2022-10-12 00:19:32,76] [info] Checkpoint start
[2022-10-12 00:19:32,76] [info] checkpointClose start
[2022-10-12 00:19:32,76] [info] checkpointClose synched
[2022-10-12 00:19:32,82] [info] checkpointClose script done
[2022-10-12 00:19:32,82] [info] dataFileCache commit start
[2022-10-12 00:19:32,85] [info] dataFileCache commit end
[2022-10-12 00:19:32,96] [info] checkpointClose end
[2022-10-12 00:19:32,96] [info] Checkpoint end - txts: 20537
[2022-10-12 00:19:32,96] [info] Checkpoint start
[2022-10-12 00:19:32,96] [info] checkpointClose start
[2022-10-12 00:19:32,96] [info] checkpointClose synched
[2022-10-12 00:19:33,01] [info] checkpointClose script done
[2022-10-12 00:19:33,01] [info] dataFileCache commit start
[2022-10-12 00:19:33,04] [info] dataFileCache commit end
[2022-10-12 00:19:33,14] [info] checkpointClose end
[2022-10-12 00:19:33,14] [info] Checkpoint end - txts: 20539
[2022-10-12 00:19:33,44] [info] Slf4jLogger started
[2022-10-12 00:19:33,68] [info] Workflow heartbeat configuration: { "cromwellId" : "cromid-c53643c", "heartbeatInterval" : "2 minutes", "ttl" : "10 minutes", "failureShutdownDuration" : "5 minutes", "writeBatchSize" : 10000, "writeThreshold" : 10000 }
[2022-10-12 00:19:33,75] [info] Metadata summary refreshing every 1 second.
[2022-10-12 00:19:33,76] [warn] 'docker.hash-lookup.gcr-api-queries-per-100-seconds' is being deprecated, use 'docker.hash-lookup.gcr.throttle' instead (see reference.conf)
[2022-10-12 00:19:33,77] [info] KvWriteActor configured to flush with batch size 200 and process rate 5 seconds.
[2022-10-12 00:19:33,78] [info] CallCacheWriteActor configured to flush with batch size 100 and process rate 3 seconds.
[2022-10-12 00:19:33,78] [info] WriteMetadataActor configured to flush with batch size 200 and process rate 5 seconds.
[2022-10-12 00:19:33,88] [info] JobExecutionTokenDispenser - Distribution rate: 50 per 1 seconds.
[2022-10-12 00:19:33,94] [info] SingleWorkflowRunnerActor: Version 58
[2022-10-12 00:19:33,95] [info] SingleWorkflowRunnerActor: Submitting workflow
[2022-10-12 00:19:34,00] [info] Unspecified type (Unspecified version) workflow c00a1a33-c10b-43ec-9c63-90846d899201 submitted
[2022-10-12 00:19:34,03] [info] SingleWorkflowRunnerActor: Workflow submitted c00a1a33-c10b-43ec-9c63-90846d899201
[2022-10-12 00:19:34,04] [info] 1 new workflows fetched by cromid-c53643c: c00a1a33-c10b-43ec-9c63-90846d899201
[2022-10-12 00:19:34,05] [info] WorkflowManagerActor Starting workflow c00a1a33-c10b-43ec-9c63-90846d899201
[2022-10-12 00:19:34,06] [info] WorkflowManagerActor Successfully started WorkflowActor-c00a1a33-c10b-43ec-9c63-90846d899201
[2022-10-12 00:19:34,06] [info] Retrieved 1 workflows from the WorkflowStoreActor
[2022-10-12 00:19:34,09] [info] WorkflowStoreHeartbeatWriteActor configured to flush with batch size 10000 and process rate 2 minutes.
[2022-10-12 00:19:34,21] [info] MaterializeWorkflowDescriptorActor [c00a1a33]: Parsing workflow as WDL 1.0
[2022-10-12 00:19:35,90] [info] MaterializeWorkflowDescriptorActor [c00a1a33]: Call-to-Backend assignments: annotate_variants_wf.annotate_RNA_editing -> LocalExample, annotate_variants_wf.annotate_gnomad -> LocalExample, ctat_mutations.HaplotypeCallerExtra -> LocalExample, ctat_mutations.HaplotypeCallerInterval -> LocalExample, annotate_variants_wf.annotate_repeats -> LocalExample, ctat_mutations.SplitReads -> LocalExample, ctat_mutations.MarkDuplicates -> LocalExample, annotate_variants_wf.annotate_blat_ED -> LocalExample, ctat_mutations.MergeFastas -> LocalExample, annotate_variants_wf.annotate_splice_distance -> LocalExample, ctat_mutations.CancerVariantReport -> LocalExample, annotate_variants_wf.left_norm_vcf -> LocalExample, annotate_variants_wf.annotate_dbsnp -> LocalExample, ctat_mutations.FilterCancerVariants -> LocalExample, ctat_mutations.ApplyBQSR -> LocalExample, annotate_variants_wf.open_cravat -> LocalExample, ctat_mutations.VariantFiltration -> LocalExample, ctat_mutations.SplitIntervals -> LocalExample, annotate_variants_wf.annotate_cosmic_variants -> LocalExample, ctat_mutations.AddOrReplaceReadGroups -> LocalExample, annotate_variants_wf.snpEff -> LocalExample, ctat_mutations.SplitNCigarReads -> LocalExample, ctat_mutations.CreateFastaIndex -> LocalExample, annotate_variants_wf.annotate_PASS_reads -> LocalExample, ctat_mutations.StarAlign -> LocalExample, ctat_mutations.MergeVCFs -> LocalExample, ctat_mutations.BaseRecalibrator -> LocalExample, annotate_variants_wf.annotate_homopolymers_n_entropy -> LocalExample, annotate_variants_wf.rename_vcf -> LocalExample, ctat_mutations.MergePrimaryAndExtraVCFs -> LocalExample, ctat_mutations.HaplotypeCaller -> LocalExample
[2022-10-12 00:19:36,12] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,12] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,12] [warn] LocalExample [c00a1a33]: Key/s [docker, memory, disks, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,12] [warn] LocalExample [c00a1a33]: Key/s [docker, memory, disks, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [disks, docker, memory, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,13] [warn] LocalExample [c00a1a33]: Key/s [memory, disks, preemptible, docker] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [disks, docker, memory, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,14] [warn] LocalExample [c00a1a33]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,15] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,15] [warn] LocalExample [c00a1a33]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,15] [warn] LocalExample [c00a1a33]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:36,15] [warn] LocalExample [c00a1a33]: Key/s [docker, memory, disks, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions.
[2022-10-12 00:19:38,89] [info] Not triggering log of token queue status. Effective log interval = None
[2022-10-12 00:19:41,61] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Starting ctat_mutations.StarAlign
[2022-10-12 00:19:41,92] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1
[2022-10-12 00:19:42,39] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Log.final.out -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Log.final.out: Operation not permitted
[2022-10-12 00:19:42,42] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/stdout: Operation not permitted
[2022-10-12 00:19:42,43] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/rc: Operation not permitted
[2022-10-12 00:19:42,43] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Aligned.sortedByCoord.out.bam -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Aligned.sortedByCoord.out.bam: Operation not permitted
[2022-10-12 00:19:42,44] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/stderr: Operation not permitted
[2022-10-12 00:19:42,45] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/script: Operation not permitted
[2022-10-12 00:19:42,45] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Log.out -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Log.out: Operation not permitted
[2022-10-12 00:19:42,46] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.SJ.out.tab -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.SJ.out.tab: Operation not permitted
[2022-10-12 00:19:42,47] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Aligned.sortedByCoord.out.bam.bai -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Aligned.sortedByCoord.out.bam.bai: Operation not permitted
[2022-10-12 00:19:42,47] [warn] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.StarAlign:-1:1-1 [c00a1a33ctat_mutations.StarAlign:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, cpu, memory
[2022-10-12 00:19:42,49] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.StarAlign:NA:1 [c00a1a33]: Call cache hit process had 0 total hit failures before completing successfully
[2022-10-12 00:19:43,38] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Job results retrieved (CallCached): 'ctat_mutations.StarAlign' (scatter index: None, attempt 1)
[2022-10-12 00:19:43,69] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Starting ctat_mutations.SplitIntervals
[2022-10-12 00:19:43,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1
[2022-10-12 00:19:45,73] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Starting ctat_mutations.AddOrReplaceReadGroups
[2022-10-12 00:19:45,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1
[2022-10-12 00:19:45,97] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/test.sorted.sorted.bam.bai -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/test.sorted.sorted.bam.bai: Operation not permitted
[2022-10-12 00:19:45,97] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/rc: Operation not permitted
[2022-10-12 00:19:45,98] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/stdout: Operation not permitted
[2022-10-12 00:19:45,98] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/script: Operation not permitted
[2022-10-12 00:19:45,99] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/test.sorted.sorted.bam -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/test.sorted.sorted.bam: Operation not permitted
[2022-10-12 00:19:45,99] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/stderr: Operation not permitted
[2022-10-12 00:19:46,00] [warn] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.AddOrReplaceReadGroups:-1:1-3 [c00a1a33ctat_mutations.AddOrReplaceReadGroups:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, memory
[2022-10-12 00:19:46,00] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.AddOrReplaceReadGroups:NA:1 [c00a1a33]: Call cache hit process had 0 total hit failures before completing successfully
[2022-10-12 00:19:49,20] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Job results retrieved (CallCached): 'ctat_mutations.AddOrReplaceReadGroups' (scatter index: None, attempt 1)
[2022-10-12 00:19:51,85] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Starting ctat_mutations.MarkDuplicates
[2022-10-12 00:19:51,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1
[2022-10-12 00:19:51,96] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/script: Operation not permitted
[2022-10-12 00:19:51,96] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/stderr: Operation not permitted
[2022-10-12 00:19:51,97] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/test.dedupped.bam -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/test.dedupped.bam: Operation not permitted
[2022-10-12 00:19:51,97] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/test.dedupped.bai ->
/data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/test.dedupped.bai: Operation not permitted [2022-10-12 00:19:51,98] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/rc: Operation not permitted [2022-10-12 00:19:51,98] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/test.dedupped.metrics -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/test.dedupped.metrics: Operation not permitted [2022-10-12 00:19:51,98] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/stdout: Operation not permitted [2022-10-12 00:19:51,99] [warn] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.MarkDuplicates:-1:1-4 [c00a1a33ctat_mutations.MarkDuplicates:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, memory [2022-10-12 00:19:51,99] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.MarkDuplicates:NA:1 [c00a1a33]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:19:55,34] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Job results retrieved (CallCached): 'ctat_mutations.MarkDuplicates' (scatter index: 
None, attempt 1) [2022-10-12 00:19:57,98] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Starting ctat_mutations.SplitNCigarReads [2022-10-12 00:19:58,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:20:53,81] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/stdout: Operation not permitted [2022-10-12 00:20:53,81] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/rc: Operation not permitted [2022-10-12 00:20:53,82] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/stderr: Operation not permitted [2022-10-12 00:20:53,82] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/script: Operation not permitted [2022-10-12 00:20:53,83] [warn] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.SplitIntervals:-1:1-0 [c00a1a33ctat_mutations.SplitIntervals:NA:1]: Unrecognized runtime attribute keys: 
preemptible, bootDiskSizeGb, disks, docker, cpu, memory [2022-10-12 00:20:53,83] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.SplitIntervals:NA:1 [c00a1a33]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:20:54,88] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Job results retrieved (CallCached): 'ctat_mutations.SplitIntervals' (scatter index: None, attempt 1) [2022-10-12 00:20:59,21] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Starting ctat_mutations.MergeVCFs [2022-10-12 00:20:59,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:20:59,93] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.MergeVCFs:NA:1 [c00a1a33]: Could not copy a suitable cache hit for c00a1a33:ctat_mutations.MergeVCFs:-1:1. No copy attempts were made. [2022-10-12 00:20:59,96] [warn] BackgroundConfigAsyncJobExecutionActor [c00a1a33ctat_mutations.MergeVCFs:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, memory [2022-10-12 00:21:00,02] [info] BackgroundConfigAsyncJobExecutionActor [c00a1a33ctat_mutations.MergeVCFs:NA:1]: set -e

monitor_script.sh &

python <<CODE
# make sure vcf index exists
import subprocess
import os
input_vcfs = ''.split(',')
for input_vcf in input_vcfs:
    if not os.path.exists(input_vcf + '.tbi') and not os.path.exists(input_vcf + '.csi') and not os.path.exists(input_vcf + '.idx'):
        subprocess.check_call(['bcftools', 'index', input_vcf])
CODE

gatk --java-options "-Xmx2000m" \
    MergeVcfs \
    -I \
    -O test.vcf.gz

[2022-10-12 00:21:00,08] [info] BackgroundConfigAsyncJobExecutionActor [c00a1a33ctat_mutations.MergeVCFs:NA:1]: executing: /usr/bin/env bash /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MergeVCFs/execution/script [2022-10-12 00:21:03,83] [info] BackgroundConfigAsyncJobExecutionActor [c00a1a33ctat_mutations.MergeVCFs:NA:1]: job id: 44124 [2022-10-12 00:21:03,84] [info] BackgroundConfigAsyncJobExecutionActor [c00a1a33ctat_mutations.MergeVCFs:NA:1]: Status change from - to Done [2022-10-12 00:21:08,50] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/stderr: Operation not permitted [2022-10-12 00:21:08,51] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/stdout: Operation not permitted [2022-10-12 00:21:08,51] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/rc: Operation not permitted [2022-10-12 00:21:08,53] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/test.split.bai -> 
/data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/test.split.bai: Operation not permitted [2022-10-12 00:21:08,54] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/test.split.bam -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/test.split.bam: Operation not permitted [2022-10-12 00:21:08,54] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/script: Operation not permitted [2022-10-12 00:21:08,55] [warn] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.SplitNCigarReads:-1:1-6 [c00a1a33ctat_mutations.SplitNCigarReads:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, memory [2022-10-12 00:21:08,55] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.SplitNCigarReads:NA:1 [c00a1a33]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:21:10,32] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Job results retrieved (CallCached): 'ctat_mutations.SplitNCigarReads' (scatter index: None, attempt 1) [2022-10-12 00:21:10,98] [info] WorkflowManagerActor Workflow c00a1a33-c10b-43ec-9c63-90846d899201 failed (during ExecutingWorkflowState): Job ctat_mutations.MergeVCFs:NA:1 exited with return code 1 which has not been declared as a valid return code. See 'continueOnReturnCode' runtime attribute for more details. 
Check the content of stderr for potential additional information: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MergeVCFs/execution/stderr. [First 3000 bytes]:
index: failed to open
Traceback (most recent call last):
  File "", line 7, in
  File "/opt/conda/lib/python3.7/subprocess.py", line 347, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['bcftools', 'index', '']' returned non-zero exit status 255.

[2022-10-12 00:21:10,98] [info] WorkflowManagerActor WorkflowActor-c00a1a33-c10b-43ec-9c63-90846d899201 is in a terminal state: WorkflowFailedState [2022-10-12 00:21:16,74] [info] SingleWorkflowRunnerActor workflow finished with status 'Failed'. [2022-10-12 00:21:19,13] [info] SingleWorkflowRunnerActor writing metadata to /tmp/tmp8humz7mv.json [2022-10-12 00:21:19,16] [info] Workflow polling stopped [2022-10-12 00:21:19,17] [info] 0 workflows released by cromid-c53643c [2022-10-12 00:21:19,17] [info] Shutting down WorkflowStoreActor - Timeout = 5 seconds [2022-10-12 00:21:19,18] [info] Shutting down WorkflowLogCopyRouter - Timeout = 5 seconds [2022-10-12 00:21:19,18] [info] Shutting down JobExecutionTokenDispenser - Timeout = 5 seconds [2022-10-12 00:21:19,18] [info] Aborting all running workflows. [2022-10-12 00:21:19,18] [info] JobExecutionTokenDispenser stopped [2022-10-12 00:21:19,18] [info] WorkflowStoreActor stopped [2022-10-12 00:21:19,19] [info] WorkflowLogCopyRouter stopped [2022-10-12 00:21:19,19] [info] Shutting down WorkflowManagerActor - Timeout = 3600 seconds [2022-10-12 00:21:19,19] [info] WorkflowManagerActor All workflows finished [2022-10-12 00:21:19,19] [info] WorkflowManagerActor stopped [2022-10-12 00:21:19,45] [info] Connection pools shut down [2022-10-12 00:21:19,46] [info] Shutting down SubWorkflowStoreActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down JobStoreActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down CallCacheWriteActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down ServiceRegistryActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down DockerHashActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] SubWorkflowStoreActor stopped [2022-10-12 00:21:19,46] [info] Shutting down IoProxy - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] CallCacheWriteActor Shutting down: 0 queued messages to process [2022-10-12 00:21:19,46] 
[info] JobStoreActor stopped [2022-10-12 00:21:19,46] [info] CallCacheWriteActor stopped [2022-10-12 00:21:19,46] [info] WriteMetadataActor Shutting down: 0 queued messages to process [2022-10-12 00:21:19,46] [info] KvWriteActor Shutting down: 0 queued messages to process [2022-10-12 00:21:19,46] [info] IoProxy stopped [2022-10-12 00:21:19,46] [info] ServiceRegistryActor stopped [2022-10-12 00:21:19,47] [info] DockerHashActor stopped [2022-10-12 00:21:19,54] [info] Database closed [2022-10-12 00:21:19,54] [info] Stream materializer shut down [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] WDL HTTP import resolver closed Workflow c00a1a33-c10b-43ec-9c63-90846d899201 transitioned to state Failed
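Side note on the failure itself: the `-I` argument in the generated gatk command in the log above is empty, which means the workflow interpolated an empty VCF list into the MergeVCFs script. In Python, splitting an empty string on `','` yields a list containing one empty string, so the heredoc calls `bcftools index ''`, which fails with exit status 255. A minimal sketch of the behavior (the `non_empty_vcfs` helper is hypothetical, not part of ctat-mutations):

```python
# Why MergeVCFs ended up running `bcftools index ''`:
# the workflow interpolated an empty VCF list into the heredoc,
# and splitting an empty string on ',' yields [''] -- a single
# empty filename -- rather than an empty list.
input_vcfs = ''.split(',')
print(input_vcfs)  # ['']

# Hypothetical guard (not the pipeline's actual code): skip empty
# entries so bcftools is never invoked with an empty path.
def non_empty_vcfs(vcf_csv):
    return [v for v in vcf_csv.split(',') if v]

assert non_empty_vcfs('') == []
assert non_empty_vcfs('a.vcf.gz,b.vcf.gz') == ['a.vcf.gz', 'b.vcf.gz']
```

So the `bcftools index` error is a symptom: the real question is why the upstream HaplotypeCaller scatter produced no VCFs for MergeVCFs to consume.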

brianjohnhaas commented 1 year ago

Hi,

Sorry, it's not clear to me why it's not working here. Have you tried rerunning it with a new output directory name?

On Tue, Oct 11, 2022 at 8:30 PM DecodeGenome @.***> wrote:

Hi Brian,

I just ran the same script in a server terminal and saved all of the output to the attached text file for your inspection.

Hope this will help.

Thanks for your support,

Wei



17:19:16 : INFO : CMD: java -Djava.io.tmpdir=/data/example.singularity/__tmpdir -Dconfig.file=/usr/local/src/ctat-mutations/WDL/local_provider_config.inc.conf -jar /usr/local/src/ctat-mutations/WDL/cromwell-58.jar run -i /tmp/tmp13ca1t85.json -m /tmp/tmp8humz7mv.json /usr/local/src/ctat-mutations/WDL/ctat_mutations.wdl [2022-10-12 00:19:19,17] [info] Running with database db.url = jdbc:hsqldb:file:cromwell-executions/cromwell-db/cromwell-db; shutdown=false; hsqldb.default_table_type=cached;hsqldb.tx=mvcc; hsqldb.result_max_memory_rows=10000; hsqldb.large_data=true; hsqldb.applog=1; hsqldb.lob_compressed=true; hsqldb.script_format=3

[2022-10-12 00:19:19,63] [info] dataFileCache open start [2022-10-12 00:19:19,70] [info] dataFileCache open end [2022-10-12 00:19:19,93] [info] checkpointClose start [2022-10-12 00:19:19,93] [info] checkpointClose synched [2022-10-12 00:19:20,00] [info] checkpointClose script done [2022-10-12 00:19:20,00] [info] dataFileCache commit start [2022-10-12 00:19:20,16] [info] dataFileCache commit end [2022-10-12 00:19:20,26] [info] checkpointClose end [2022-10-12 00:19:26,71] [info] Checkpoint start [2022-10-12 00:19:26,71] [info] checkpointClose start [2022-10-12 00:19:26,71] [info] checkpointClose synched [2022-10-12 00:19:26,84] [info] checkpointClose script done [2022-10-12 00:19:26,84] [info] dataFileCache commit start [2022-10-12 00:19:26,88] [info] dataFileCache commit end [2022-10-12 00:19:27,02] [info] checkpointClose end [2022-10-12 00:19:27,03] [info] Checkpoint end - txts: 20419 [2022-10-12 00:19:27,20] [info] Checkpoint start [2022-10-12 00:19:27,20] [info] checkpointClose start [2022-10-12 00:19:27,24] [info] checkpointClose synched [2022-10-12 00:19:27,27] [info] checkpointClose script done [2022-10-12 00:19:27,28] [info] dataFileCache commit start [2022-10-12 00:19:27,35] [info] dataFileCache commit end [2022-10-12 00:19:27,55] [info] checkpointClose end [2022-10-12 00:19:27,55] [info] Checkpoint end - txts: 20435 [2022-10-12 00:19:27,55] [info] Checkpoint start [2022-10-12 00:19:27,55] [info] checkpointClose start [2022-10-12 00:19:27,56] [info] checkpointClose synched [2022-10-12 00:19:27,62] [info] checkpointClose script done [2022-10-12 00:19:27,62] [info] dataFileCache commit start [2022-10-12 00:19:27,65] [info] dataFileCache commit end [2022-10-12 00:19:27,75] [info] checkpointClose end [2022-10-12 00:19:27,76] [info] Checkpoint end - txts: 20437 [2022-10-12 00:19:29,39] [info] Checkpoint start [2022-10-12 00:19:29,39] [info] checkpointClose start [2022-10-12 00:19:29,39] [info] checkpointClose synched [2022-10-12 00:19:29,43] [info] 
checkpointClose script done [2022-10-12 00:19:29,43] [info] dataFileCache commit start [2022-10-12 00:19:29,48] [info] dataFileCache commit end [2022-10-12 00:19:29,54] [info] checkpointClose end [2022-10-12 00:19:29,54] [info] Checkpoint end - txts: 20460 [2022-10-12 00:19:29,54] [info] Checkpoint start [2022-10-12 00:19:29,54] [info] checkpointClose start [2022-10-12 00:19:29,54] [info] checkpointClose synched [2022-10-12 00:19:29,58] [info] checkpointClose script done [2022-10-12 00:19:29,58] [info] dataFileCache commit start [2022-10-12 00:19:29,77] [info] dataFileCache commit end [2022-10-12 00:19:29,92] [info] checkpointClose end [2022-10-12 00:19:29,92] [info] Checkpoint end - txts: 20462 [2022-10-12 00:19:30,08] [info] Checkpoint start [2022-10-12 00:19:30,08] [info] checkpointClose start [2022-10-12 00:19:30,08] [info] checkpointClose synched [2022-10-12 00:19:30,12] [info] checkpointClose script done [2022-10-12 00:19:30,12] [info] dataFileCache commit start [2022-10-12 00:19:30,15] [info] dataFileCache commit end [2022-10-12 00:19:30,28] [info] checkpointClose end [2022-10-12 00:19:30,29] [info] Checkpoint end - txts: 20464 [2022-10-12 00:19:30,32] [info] Checkpoint start [2022-10-12 00:19:30,32] [info] checkpointClose start [2022-10-12 00:19:30,36] [info] checkpointClose synched [2022-10-12 00:19:30,41] [info] checkpointClose script done [2022-10-12 00:19:30,41] [info] dataFileCache commit start [2022-10-12 00:19:30,60] [info] dataFileCache commit end [2022-10-12 00:19:30,68] [info] checkpointClose end [2022-10-12 00:19:30,68] [info] Checkpoint end - txts: 20471 [2022-10-12 00:19:30,68] [info] Checkpoint start [2022-10-12 00:19:30,68] [info] checkpointClose start [2022-10-12 00:19:30,68] [info] checkpointClose synched [2022-10-12 00:19:30,73] [info] checkpointClose script done [2022-10-12 00:19:30,73] [info] dataFileCache commit start [2022-10-12 00:19:30,76] [info] dataFileCache commit end [2022-10-12 00:19:30,82] [info] checkpointClose end [2022-10-12 
00:19:30,82] [info] Checkpoint end - txts: 20473 [2022-10-12 00:19:30,82] [info] Checkpoint start [2022-10-12 00:19:30,82] [info] checkpointClose start [2022-10-12 00:19:30,82] [info] checkpointClose synched [2022-10-12 00:19:30,86] [info] checkpointClose script done [2022-10-12 00:19:30,86] [info] dataFileCache commit start [2022-10-12 00:19:30,90] [info] dataFileCache commit end [2022-10-12 00:19:30,96] [info] checkpointClose end [2022-10-12 00:19:30,96] [info] Checkpoint end - txts: 20475 [2022-10-12 00:19:30,99] [info] Running with database db.url = jdbc:hsqldb:file:cromwell-executions/cromwell-db/cromwell-db; shutdown=false; hsqldb.default_table_type=cached;hsqldb.tx=mvcc; hsqldb.result_max_memory_rows=10000; hsqldb.large_data=true; hsqldb.applog=1; hsqldb.lob_compressed=true; hsqldb.script_format=3

[2022-10-12 00:19:31,04] [info] Checkpoint start [2022-10-12 00:19:31,04] [info] checkpointClose start [2022-10-12 00:19:31,04] [info] checkpointClose synched [2022-10-12 00:19:31,13] [info] checkpointClose script done [2022-10-12 00:19:31,14] [info] dataFileCache commit start [2022-10-12 00:19:31,17] [info] dataFileCache commit end [2022-10-12 00:19:31,24] [info] checkpointClose end [2022-10-12 00:19:31,24] [info] Checkpoint end - txts: 20483 [2022-10-12 00:19:31,29] [info] Checkpoint start [2022-10-12 00:19:31,29] [info] checkpointClose start [2022-10-12 00:19:31,32] [info] checkpointClose synched [2022-10-12 00:19:31,37] [info] checkpointClose script done [2022-10-12 00:19:31,37] [info] dataFileCache commit start [2022-10-12 00:19:31,47] [info] dataFileCache commit end [2022-10-12 00:19:31,54] [info] checkpointClose end [2022-10-12 00:19:31,54] [info] Checkpoint end - txts: 20499 [2022-10-12 00:19:31,54] [info] Checkpoint start [2022-10-12 00:19:31,54] [info] checkpointClose start [2022-10-12 00:19:31,54] [info] checkpointClose synched [2022-10-12 00:19:31,61] [info] checkpointClose script done [2022-10-12 00:19:31,61] [info] dataFileCache commit start [2022-10-12 00:19:31,63] [info] dataFileCache commit end [2022-10-12 00:19:31,70] [info] checkpointClose end [2022-10-12 00:19:31,70] [info] Checkpoint end - txts: 20501 [2022-10-12 00:19:31,85] [info] Checkpoint start [2022-10-12 00:19:31,85] [info] checkpointClose start [2022-10-12 00:19:31,85] [info] checkpointClose synched [2022-10-12 00:19:31,91] [info] checkpointClose script done [2022-10-12 00:19:31,91] [info] dataFileCache commit start [2022-10-12 00:19:31,97] [info] dataFileCache commit end [2022-10-12 00:19:32,03] [info] checkpointClose end [2022-10-12 00:19:32,03] [info] Checkpoint end - txts: 20524 [2022-10-12 00:19:32,03] [info] Checkpoint start [2022-10-12 00:19:32,03] [info] checkpointClose start [2022-10-12 00:19:32,03] [info] checkpointClose synched [2022-10-12 00:19:32,09] [info] checkpointClose 
script done [2022-10-12 00:19:32,09] [info] dataFileCache commit start [2022-10-12 00:19:32,12] [info] dataFileCache commit end [2022-10-12 00:19:32,21] [info] checkpointClose end [2022-10-12 00:19:32,21] [info] Checkpoint end - txts: 20526 [2022-10-12 00:19:32,23] [info] Checkpoint start [2022-10-12 00:19:32,23] [info] checkpointClose start [2022-10-12 00:19:32,23] [info] checkpointClose synched [2022-10-12 00:19:32,27] [info] checkpointClose script done [2022-10-12 00:19:32,27] [info] dataFileCache commit start [2022-10-12 00:19:32,30] [info] dataFileCache commit end [2022-10-12 00:19:32,36] [info] checkpointClose end [2022-10-12 00:19:32,36] [info] Checkpoint end - txts: 20528 [2022-10-12 00:19:32,41] [info] Checkpoint start [2022-10-12 00:19:32,41] [info] checkpointClose start [2022-10-12 00:19:32,46] [info] checkpointClose synched [2022-10-12 00:19:32,53] [info] checkpointClose script done [2022-10-12 00:19:32,53] [info] dataFileCache commit start [2022-10-12 00:19:32,62] [info] dataFileCache commit end [2022-10-12 00:19:32,75] [info] checkpointClose end [2022-10-12 00:19:32,76] [info] Checkpoint end - txts: 20535 [2022-10-12 00:19:32,76] [info] Checkpoint start [2022-10-12 00:19:32,76] [info] checkpointClose start [2022-10-12 00:19:32,76] [info] checkpointClose synched [2022-10-12 00:19:32,82] [info] checkpointClose script done [2022-10-12 00:19:32,82] [info] dataFileCache commit start [2022-10-12 00:19:32,85] [info] dataFileCache commit end [2022-10-12 00:19:32,96] [info] checkpointClose end [2022-10-12 00:19:32,96] [info] Checkpoint end - txts: 20537 [2022-10-12 00:19:32,96] [info] Checkpoint start [2022-10-12 00:19:32,96] [info] checkpointClose start [2022-10-12 00:19:32,96] [info] checkpointClose synched [2022-10-12 00:19:33,01] [info] checkpointClose script done [2022-10-12 00:19:33,01] [info] dataFileCache commit start [2022-10-12 00:19:33,04] [info] dataFileCache commit end [2022-10-12 00:19:33,14] [info] checkpointClose end [2022-10-12 00:19:33,14] 
[info] Checkpoint end - txts: 20539 [2022-10-12 00:19:33,44] [info] Slf4jLogger started [2022-10-12 00:19:33,68] [info] Workflow heartbeat configuration: { "cromwellId" : "cromid-c53643c", "heartbeatInterval" : "2 minutes", "ttl" : "10 minutes", "failureShutdownDuration" : "5 minutes", "writeBatchSize" : 10000, "writeThreshold" : 10000 } [2022-10-12 00:19:33,75] [info] Metadata summary refreshing every 1 second. [2022-10-12 00:19:33,76] [warn] 'docker.hash-lookup.gcr-api-queries-per-100-seconds' is being deprecated, use 'docker.hash-lookup.gcr.throttle' instead (see reference.conf) [2022-10-12 00:19:33,77] [info] KvWriteActor configured to flush with batch size 200 and process rate 5 seconds. [2022-10-12 00:19:33,78] [info] CallCacheWriteActor configured to flush with batch size 100 and process rate 3 seconds. [2022-10-12 00:19:33,78] [info] WriteMetadataActor configured to flush with batch size 200 and process rate 5 seconds. [2022-10-12 00:19:33,88] [info] JobExecutionTokenDispenser - Distribution rate: 50 per 1 seconds. 
[2022-10-12 00:19:33,94] [info] SingleWorkflowRunnerActor: Version 58 [2022-10-12 00:19:33,95] [info] SingleWorkflowRunnerActor: Submitting workflow [2022-10-12 00:19:34,00] [info] Unspecified type (Unspecified version) workflow c00a1a33-c10b-43ec-9c63-90846d899201 submitted [2022-10-12 00:19:34,03] [info] SingleWorkflowRunnerActor: Workflow submitted c00a1a33-c10b-43ec-9c63-90846d899201 [2022-10-12 00:19:34,04] [info] 1 new workflows fetched by cromid-c53643c: c00a1a33-c10b-43ec-9c63-90846d899201 [2022-10-12 00:19:34,05] [info] WorkflowManagerActor Starting workflow c00a1a33-c10b-43ec-9c63-90846d899201 [2022-10-12 00:19:34,06] [info] WorkflowManagerActor Successfully started WorkflowActor-c00a1a33-c10b-43ec-9c63-90846d899201 [2022-10-12 00:19:34,06] [info] Retrieved 1 workflows from the WorkflowStoreActor [2022-10-12 00:19:34,09] [info] WorkflowStoreHeartbeatWriteActor configured to flush with batch size 10000 and process rate 2 minutes. [2022-10-12 00:19:34,21] [info] MaterializeWorkflowDescriptorActor [c00a1a33]: Parsing workflow as WDL 1.0 [2022-10-12 00:19:35,90] [info] MaterializeWorkflowDescriptorActor [c00a1a33]: Call-to-Backend assignments: annotate_variants_wf.annotate_RNA_editing -> LocalExample, annotate_variants_wf.annotate_gnomad -> LocalExample, ctat_mutations.HaplotypeCallerExtra -> LocalExample, ctat_mutations.HaplotypeCallerInterval -> LocalExample, annotate_variants_wf.annotate_repeats -> LocalExample, ctat_mutations.SplitReads -> LocalExample, ctat_mutations.MarkDuplicates -> LocalExample, annotate_variants_wf.annotate_blat_ED -> LocalExample, ctat_mutations.MergeFastas -> LocalExample, annotate_variants_wf.annotate_splice_distance -> LocalExample, ctat_mutations.CancerVariantReport -> LocalExample, annotate_variants_wf.left_norm_vcf -> LocalExample, annotate_variants_wf.annotate_dbsnp -> LocalExample, ctat_mutations.FilterCancerVariants -> LocalExample, ctat_mutations.ApplyBQSR -> LocalExample, 
annotate_variants_wf.open_cravat -> LocalExample, ctat_mutations.VariantFiltration -> LocalExample, ctat_mutations.SplitIntervals -> LocalExample, annotate_variants_wf.annotate_cosmic_variants -> LocalExample, ctat_mutations.AddOrReplaceReadGroups -> LocalExample, annotate_variants_wf.snpEff -> LocalExample, ctat_mutations.SplitNCigarReads -> LocalExample, ctat_mutations.CreateFastaIndex -> LocalExample, annotate_variants_wf.annotate_PASS_reads -> LocalExample, ctat_mutations.StarAlign -> LocalExample, ctat_mutations.MergeVCFs -> LocalExample, ctat_mutations.BaseRecalibrator -> LocalExample, annotate_variants_wf.annotate_homopolymers_n_entropy -> LocalExample, annotate_variants_wf.rename_vcf -> LocalExample, ctat_mutations.MergePrimaryAndExtraVCFs -> LocalExample, ctat_mutations.HaplotypeCaller -> LocalExample [2022-10-12 00:19:36,12] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,12] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,12] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [docker, memory, disks, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,12] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [docker, memory, disks, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. 
[2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [disks, docker, memory, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. 
Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,13] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [memory, disks, preemptible, docker] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [disks, docker, memory, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. 
[2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, bootDiskSizeGb, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,14] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,15] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,15] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [preemptible, disks, docker, cpu, memory] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,15] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [memory, disks, docker, preemptible] is/are not supported by backend. Unsupported attributes will not be part of job executions. [2022-10-12 00:19:36,15] [ [38;5;220mwarn [0m] LocalExample [ [38;5;2mc00a1a33 [0m]: Key/s [docker, memory, disks, preemptible] is/are not supported by backend. 
Unsupported attributes will not be part of job executions. [2022-10-12 00:19:38,89] [info] Not triggering log of token queue status. Effective log interval = None [2022-10-12 00:19:41,61] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Starting ctat_mutations.StarAlign [2022-10-12 00:19:41,92] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:19:42,39] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Log.final.out -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Log.final.out: Operation not permitted [2022-10-12 00:19:42,42] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/stdout: Operation not permitted [2022-10-12 00:19:42,43] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/rc: Operation not permitted [2022-10-12 00:19:42,43] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Aligned.sortedByCoord.out.bam -> 
/data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Aligned.sortedByCoord.out.bam: Operation not permitted [2022-10-12 00:19:42,44] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/stderr: Operation not permitted [2022-10-12 00:19:42,45] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/script: Operation not permitted [2022-10-12 00:19:42,45] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Log.out -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Log.out: Operation not permitted [2022-10-12 00:19:42,46] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.SJ.out.tab -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.SJ.out.tab: Operation not permitted [2022-10-12 00:19:42,47] [ [38;5;220mwarn [0m] Localization via hard link has failed: 
/data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-StarAlign/cacheCopy/execution/test.star.Aligned.sortedByCoord.out.bam.bai -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-StarAlign/execution/test.star.Aligned.sortedByCoord.out.bam.bai: Operation not permitted [2022-10-12 00:19:42,47] [ [38;5;220mwarn [0m] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.StarAlign:-1:1-1 [ [38;5;2mc00a1a33 [0mctat_mutations.StarAlign:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, cpu, memory [2022-10-12 00:19:42,49] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.StarAlign:NA:1 [ [38;5;2mc00a1a33 [0m]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:19:43,38] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Job results retrieved (CallCached): 'ctat_mutations.StarAlign' (scatter index: None, attempt 1) [2022-10-12 00:19:43,69] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Starting ctat_mutations.SplitIntervals [2022-10-12 00:19:43,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:19:45,73] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Starting ctat_mutations.AddOrReplaceReadGroups [2022-10-12 00:19:45,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:19:45,97] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/test.sorted.sorted.bam.bai -> 
/data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/test.sorted.sorted.bam.bai: Operation not permitted [2022-10-12 00:19:45,97] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/rc: Operation not permitted [2022-10-12 00:19:45,98] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/stdout: Operation not permitted [2022-10-12 00:19:45,98] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/script: Operation not permitted [2022-10-12 00:19:45,99] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/test.sorted.sorted.bam -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/test.sorted.sorted.bam: Operation not permitted [2022-10-12 00:19:45,99] [ [38;5;220mwarn [0m] Localization via hard link has failed: 
/data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-AddOrReplaceReadGroups/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-AddOrReplaceReadGroups/execution/stderr: Operation not permitted [2022-10-12 00:19:46,00] [ [38;5;220mwarn [0m] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.AddOrReplaceReadGroups:-1:1-3 [ [38;5;2mc00a1a33 [0mctat_mutations.AddOrReplaceReadGroups:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, memory [2022-10-12 00:19:46,00] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.AddOrReplaceReadGroups:NA:1 [ [38;5;2mc00a1a33 [0m]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:19:49,20] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Job results retrieved (CallCached): 'ctat_mutations.AddOrReplaceReadGroups' (scatter index: None, attempt 1) [2022-10-12 00:19:51,85] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Starting ctat_mutations.MarkDuplicates [2022-10-12 00:19:51,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:19:51,96] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/script: Operation not permitted [2022-10-12 00:19:51,96] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/stderr -> 
/data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/stderr: Operation not permitted [2022-10-12 00:19:51,97] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/test.dedupped.bam -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/test.dedupped.bam: Operation not permitted [2022-10-12 00:19:51,97] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/test.dedupped.bai -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/test.dedupped.bai: Operation not permitted [2022-10-12 00:19:51,98] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/rc: Operation not permitted [2022-10-12 00:19:51,98] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/test.dedupped.metrics -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/test.dedupped.metrics: Operation not permitted [2022-10-12 00:19:51,98] [ [38;5;220mwarn [0m] Localization via hard link has failed: 
/data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MarkDuplicates/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-MarkDuplicates/execution/stdout: Operation not permitted [2022-10-12 00:19:51,99] [ [38;5;220mwarn [0m] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.MarkDuplicates:-1:1-4
[c00a1a33 ctat_mutations.MarkDuplicates:NA:1]: Unrecognized
runtime attribute keys: preemptible, disks, docker, memory [2022-10-12 00:19:51,99] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.MarkDuplicates:NA:1 [ [38;5;2mc00a1a33 [0m]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:19:55,34] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Job results retrieved (CallCached): 'ctat_mutations.MarkDuplicates' (scatter index: None, attempt 1) [2022-10-12 00:19:57,98] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Starting ctat_mutations.SplitNCigarReads [2022-10-12 00:19:58,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:20:53,81] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/stdout: Operation not permitted [2022-10-12 00:20:53,81] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/rc: Operation not permitted [2022-10-12 00:20:53,82] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/stderr: Operation not permitted [2022-10-12 00:20:53,82] [ [38;5;220mwarn [0m] 
Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitIntervals/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/0e41ee47-9ac8-4509-a4be-028feb100e09/call-SplitIntervals/execution/script: Operation not permitted [2022-10-12 00:20:53,83] [ [38;5;220mwarn [0m] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.SplitIntervals:-1:1-0
[c00a1a33 ctat_mutations.SplitIntervals:NA:1]: Unrecognized
runtime attribute keys: preemptible, bootDiskSizeGb, disks, docker, cpu, memory [2022-10-12 00:20:53,83] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.SplitIntervals:NA:1 [ [38;5;2mc00a1a33 [0m]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:20:54,88] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Job results retrieved (CallCached): 'ctat_mutations.SplitIntervals' (scatter index: None, attempt 1) [2022-10-12 00:20:59,21] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [ [38;5;2mc00a1a33 [0m]: Starting ctat_mutations.MergeVCFs [2022-10-12 00:20:59,90] [info] Assigned new job execution tokens to the following groups: c00a1a33: 1 [2022-10-12 00:20:59,93] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.MergeVCFs:NA:1 [ [38;5;2mc00a1a33 [0m]: Could not copy a suitable cache hit for c00a1a33:ctat_mutations.MergeVCFs:-1:1. No copy attempts were made. [2022-10-12 00:20:59,96] [ [38;5;220mwarn [0m] BackgroundConfigAsyncJobExecutionActor [ [38;5;2mc00a1a33 [0mctat_mutations.MergeVCFs:NA:1]: Unrecognized runtime attribute keys: preemptible, disks, docker, memory [2022-10-12 00:21:00,02] [info] BackgroundConfigAsyncJobExecutionActor [ [38;5;2mc00a1a33 [0mctat_mutations.MergeVCFs:NA:1]: [38;5;5mset -e

monitor_script.sh &

python <<CODE
# make sure vcf index exists
import subprocess
import os
input_vcfs = ''.split(',')
for input_vcf in input_vcfs:
    if not os.path.exists(input_vcf + '.tbi') and not os.path.exists(input_vcf + '.csi') and not os.path.exists(input_vcf + '.idx'):
        subprocess.check_call(['bcftools', 'index', input_vcf])
CODE

gatk --java-options "-Xmx2000m" \
    MergeVcfs \
    -I \
    -O test.vcf.gz

[2022-10-12 00:21:00,08] [info] BackgroundConfigAsyncJobExecutionActor [c00a1a33 ctat_mutations.MergeVCFs:NA:1]: executing: /usr/bin/env bash /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MergeVCFs/execution/script [2022-10-12 00:21:03,83] [info] BackgroundConfigAsyncJobExecutionActor [c00a1a33 ctat_mutations.MergeVCFs:NA:1]: job id: 44124 [2022-10-12 00:21:03,84] [info] BackgroundConfigAsyncJobExecutionActor [c00a1a33 ctat_mutations.MergeVCFs:NA:1]: Status change from - to Done [2022-10-12 00:21:08,50] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/stderr -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/stderr: Operation not permitted [2022-10-12 00:21:08,51] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/stdout -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/stdout: Operation not permitted [2022-10-12 00:21:08,51] [warn] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/rc -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/rc: Operation not permitted [2022-10-12 00:21:08,53] [warn] Localization via hard link has failed: 
/data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/test.split.bai -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/test.split.bai: Operation not permitted [2022-10-12 00:21:08,54] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/test.split.bam -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/test.split.bam: Operation not permitted [2022-10-12 00:21:08,54] [ [38;5;220mwarn [0m] Localization via hard link has failed: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-SplitNCigarReads/cacheCopy/execution/script -> /data/example.singularity/cromwell-executions/ctat_mutations/7fa4d680-f7a3-4d36-9070-775b96d89127/call-SplitNCigarReads/execution/script: Operation not permitted [2022-10-12 00:21:08,55] [ [38;5;220mwarn [0m] c00a1a33-c10b-43ec-9c63-90846d899201-BackendCacheHitCopyingActor-c00a1a33:ctat_mutations.SplitNCigarReads:-1:1-6
[c00a1a33 ctat_mutations.SplitNCigarReads:NA:1]: Unrecognized
runtime attribute keys: preemptible, disks, docker, memory [2022-10-12 00:21:08,55] [info] c00a1a33-c10b-43ec-9c63-90846d899201-EngineJobExecutionActor-ctat_mutations.SplitNCigarReads:NA:1 [c00a1a33]: Call cache hit process had 0 total hit failures before completing successfully [2022-10-12 00:21:10,32] [info] WorkflowExecutionActor-c00a1a33-c10b-43ec-9c63-90846d899201 [c00a1a33]: Job results retrieved (CallCached): 'ctat_mutations.SplitNCigarReads' (scatter index: None, attempt 1) [2022-10-12 00:21:10,98] [info] WorkflowManagerActor Workflow c00a1a33-c10b-43ec-9c63-90846d899201 failed (during ExecutingWorkflowState): Job ctat_mutations.MergeVCFs:NA:1 exited with return code 1 which has not been declared as a valid return code. See 'continueOnReturnCode' runtime attribute for more details. Check the content of stderr for potential additional information: /data/example.singularity/cromwell-executions/ctat_mutations/c00a1a33-c10b-43ec-9c63-90846d899201/call-MergeVCFs/execution/stderr. [First 3000 bytes]:

index: failed to open
Traceback (most recent call last):
  File "<stdin>", line 7, in <module>
  File "/opt/conda/lib/python3.7/subprocess.py", line 347, in check_call
    raise CalledProcessError(retcode, cmd)
subprocess.CalledProcessError: Command '['bcftools', 'index', '']' returned non-zero exit status 255.

[2022-10-12 00:21:10,98] [info] WorkflowManagerActor WorkflowActor-c00a1a33-c10b-43ec-9c63-90846d899201 is in a terminal state: WorkflowFailedState [2022-10-12 00:21:16,74] [info] SingleWorkflowRunnerActor workflow finished with status 'Failed'. [2022-10-12 00:21:19,13] [info] SingleWorkflowRunnerActor writing metadata to /tmp/tmp8humz7mv.json [2022-10-12 00:21:19,16] [info] Workflow polling stopped [2022-10-12 00:21:19,17] [info] 0 workflows released by cromid-c53643c [2022-10-12 00:21:19,17] [info] Shutting down WorkflowStoreActor - Timeout = 5 seconds [2022-10-12 00:21:19,18] [info] Shutting down WorkflowLogCopyRouter - Timeout = 5 seconds [2022-10-12 00:21:19,18] [info] Shutting down JobExecutionTokenDispenser - Timeout = 5 seconds [2022-10-12 00:21:19,18] [info] Aborting all running workflows. [2022-10-12 00:21:19,18] [info] JobExecutionTokenDispenser stopped [2022-10-12 00:21:19,18] [info] WorkflowStoreActor stopped [2022-10-12 00:21:19,19] [info] WorkflowLogCopyRouter stopped [2022-10-12 00:21:19,19] [info] Shutting down WorkflowManagerActor - Timeout = 3600 seconds [2022-10-12 00:21:19,19] [info] WorkflowManagerActor All workflows finished [2022-10-12 00:21:19,19] [info] WorkflowManagerActor stopped [2022-10-12 00:21:19,45] [info] Connection pools shut down [2022-10-12 00:21:19,46] [info] Shutting down SubWorkflowStoreActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down JobStoreActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down CallCacheWriteActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down ServiceRegistryActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] Shutting down DockerHashActor - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] SubWorkflowStoreActor stopped [2022-10-12 00:21:19,46] [info] Shutting down IoProxy - Timeout = 1800 seconds [2022-10-12 00:21:19,46] [info] CallCacheWriteActor Shutting down: 0 queued messages to process [2022-10-12 00:21:19,46] 
[info] JobStoreActor stopped [2022-10-12 00:21:19,46] [info] CallCacheWriteActor stopped [2022-10-12 00:21:19,46] [info] WriteMetadataActor Shutting down: 0 queued messages to process [2022-10-12 00:21:19,46] [info] KvWriteActor Shutting down: 0 queued messages to process [2022-10-12 00:21:19,46] [info] IoProxy stopped [2022-10-12 00:21:19,46] [info] ServiceRegistryActor stopped [2022-10-12 00:21:19,47] [info] DockerHashActor stopped [2022-10-12 00:21:19,54] [info] Database closed [2022-10-12 00:21:19,54] [info] Stream materializer shut down [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false [2022-10-12 00:21:19,55] [info] WDL HTTP import resolver closed Workflow c00a1a33-c10b-43ec-9c63-90846d899201 transitioned to state Failed
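(Note: the traceback above points at the index-check preamble of the MergeVCFs task. The WDL interpolated an empty string into `input_vcfs = ''.split(',')`, and in Python splitting an empty string yields `['']` rather than `[]`, so the loop ends up running `bcftools index ''`, which exits 255. In other words, the real problem is that no per-interval VCFs reached MergeVCFs. A minimal sketch of the failure mode; `nonempty_vcfs` is a hypothetical guard for illustration, not part of ctat-mutations:)

```python
def nonempty_vcfs(raw):
    """Drop empty entries so we never call `bcftools index ''`."""
    return [v for v in raw.split(',') if v]

# The failure mode: str.split on an empty string yields [''], not [].
assert ''.split(',') == ['']

# Guarded: an empty input list stays empty, so no bcftools call is made
# and the task can fail early with a clear "no input VCFs" message instead.
assert nonempty_vcfs('') == []
assert nonempty_vcfs('a.vcf.gz,b.vcf.gz') == ['a.vcf.gz', 'b.vcf.gz']
```

With a guard like this, the error surfaces as "MergeVCFs received no inputs" rather than an opaque bcftools exit status, which would make it clearer that the upstream HaplotypeCaller scatter produced nothing to merge.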


--

Brian J. Haas
The Broad Institute
http://broadinstitute.org/~bhaas
http://broad.mit.edu/~bhaas