ENCODE-DCC / atac-seq-pipeline

ENCODE ATAC-seq pipeline
MIT License

Peak Calling failed at call-gc_bias #367

Open thaorope opened 2 years ago

thaorope commented 2 years ago

Describe the bug

Hi team,

I tried to run peak calling using deduplicated BAM files as input. It worked for the first sample (2 replicates), but failed for the second sample at the call-gc_bias step.

OS/Platform

Caper configuration file

Paste contents of ~/.caper/default.conf.

slurm-partition=medium
slurm-account=knipe_dmk2
local-hash-strat=path+modtime
local-loc-dir=/home/tht900/ATAC-seq-testing/caper_temp
cromwell=/home/tht900/.caper/cromwell_jar/cromwell-65.jar
womtool=/home/tht900/.caper/womtool_jar/womtool-65.jar

Input JSON file

Paste contents of your input JSON file.

{
    "atac.title" : "ATRX KO 4 hours_Peak Calling",
    "atac.description" : "ATAC-seq for ATRX KO cells at 4hpi mapped to human-HSV (hg38_hsv) genome. Multimap = 0",

    "atac.pipeline_type" : "atac",
    "atac.true_rep_only" : false,

    "atac.genome_tsv" : "/home/tht900/ATAC-seq-testing/hg38_hsv_genome_database3/hg38_hsv.tsv",

    "atac.paired_end" : true,

    "atac.nodup_bams" : ["/n/scratch3/users/t/tht900/atac-testing/atac/bamfiles/PCR_dedups/ATRX_KO_4_R1_rep1.sub.dedups.bam", 
    "/n/scratch3/users/t/tht900/atac-testing/atac/bamfiles/PCR_dedups/ATRX_KO_4hr_R1_rep2.sub.dedups.bam" ],

    "atac.auto_detect_adapter" : false,
    "atac.adapter" : "TCCTGAGC",

    "atac.multimapping" : 0
}
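As a side note, an input JSON like the one above can be sanity-checked before submitting to Caper. This is only an illustrative sketch (the `check_atac_inputs` helper is hypothetical, not part of the pipeline; the key names are the ones used in the JSON above):

```python
import json
import os

def check_atac_inputs(path):
    """Basic sanity checks on an ENCODE ATAC-seq pipeline input JSON."""
    with open(path) as fh:
        cfg = json.load(fh)
    problems = []
    # When starting from nodup BAMs, pipeline_type should still be "atac".
    if cfg.get("atac.pipeline_type") != "atac":
        problems.append("atac.pipeline_type should be 'atac'")
    # Every nodup BAM listed must exist on the filesystem Caper runs on.
    for bam in cfg.get("atac.nodup_bams", []):
        if not os.path.isfile(bam):
            problems.append(f"missing BAM: {bam}")
    return problems
```

Running it against the JSON file before `caper run` catches path typos early, since a missing BAM would otherwise only surface once Cromwell localizes inputs.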

Troubleshooting result

If you ran caper run without a Caper server, Caper automatically runs a troubleshooter for failed workflows. Find the troubleshooting result at the bottom of Caper's screen log.

If you ran caper submit against a running Caper server, first find your workflow ID (1st column) with caper list, then run caper debug [WORKFLOW_ID].
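Concretely, those two commands look like this (the workflow ID is the one reported in the log pasted below):

```shell
# List workflows; the workflow ID is the first column.
caper list

# Re-run the troubleshooter for a specific failed workflow.
caper debug 69c5d7ab-4e56-4d62-9a66-49f973ae6af3
```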

Paste troubleshooting result.

2022-02-07 09:34:23,443|caper.cli|INFO| Cromwell stdout: /n/scratch3/users/t/tht900/atac-testing/cromwell.out.2
2022-02-07 09:34:23,451|caper.caper_base|INFO| Creating a timestamped temporary directory. /n/scratch3/users/t/tht900/atac-testing/caper_temp/atac_medium_2/20220207_093423_447456
2022-02-07 09:34:23,451|caper.caper_runner|INFO| Localizing files on work_dir. /n/scratch3/users/t/tht900/atac-testing/caper_temp/atac_medium_2/20220207_093423_447456
2022-02-07 09:34:25,035|caper.caper_workflow_opts|INFO| Conda environment name found in WDL metadata. wdl=/home/tht900/atac-seq-pipeline/atac_medium_2.wdl, s=encode-atac-seq-pipeline
2022-02-07 09:34:25,056|caper.cromwell|INFO| Validating WDL/inputs/imports with Womtool...
2022-02-07 09:34:33,326|caper.cromwell|INFO| Passed Womtool validation.
2022-02-07 09:34:33,326|caper.caper_runner|INFO| launching run: wdl=/home/tht900/atac-seq-pipeline/atac_medium_2.wdl, inputs=/home/tht900/ATAC-seq-testing/atac_practice_peakCalling_Thao.json, backend_conf=/n/scratch3/users/t/tht900/atac-testing/caper_temp/atac_medium_2/20220207_093423_447456/backend.conf
2022-02-07 09:34:54,281|caper.cromwell_workflow_monitor|INFO| Workflow: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, status=Submitted
2022-02-07 09:34:54,479|caper.cromwell_workflow_monitor|INFO| Workflow: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, status=Running
2022-02-07 09:35:13,720|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.read_genome_tsv:-1, retry=0, status=Started, job_id=19415
2022-02-07 09:35:13,726|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.read_genome_tsv:-1, retry=0, status=WaitingForReturnCode
2022-02-07 09:35:28,710|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.fraglen_stat_pe:0, retry=0, status=Started, job_id=19459
2022-02-07 09:35:28,716|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.fraglen_stat_pe:0, retry=0, status=WaitingForReturnCode
2022-02-07 09:36:50,791|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.read_genome_tsv:-1, retry=0, status=Done
2022-02-07 09:36:58,710|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.bam2ta:0, retry=0, status=Started, job_id=19645
2022-02-07 09:36:58,715|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.bam2ta:0, retry=0, status=WaitingForReturnCode
2022-02-07 09:37:03,710|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.gc_bias:0, retry=0, status=Started, job_id=19664
2022-02-07 09:37:03,715|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.gc_bias:0, retry=0, status=WaitingForReturnCode
2022-02-07 09:37:08,718|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.jsd:-1, retry=0, status=Started, job_id=19699
2022-02-07 09:37:08,725|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.jsd:-1, retry=0, status=WaitingForReturnCode
2022-02-07 09:47:02,972|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.fraglen_stat_pe:0, retry=0, status=Done
2022-02-07 09:51:28,087|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.jsd:-1, retry=0, status=Done
2022-02-07 09:52:19,290|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.gc_bias:0, retry=0, status=Done
2022-02-07 09:52:28,701|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.gc_bias:0, retry=1, status=Started, job_id=29676
2022-02-07 09:52:28,704|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.gc_bias:0, retry=1, status=WaitingForReturnCode
2022-02-07 10:10:22,927|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.gc_bias:0, retry=1, status=Done
2022-02-07 10:34:20,682|caper.cromwell_workflow_monitor|INFO| Task: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, task=atac.bam2ta:0, retry=0, status=Done
2022-02-07 10:34:21,347|caper.cromwell_workflow_monitor|INFO| Workflow: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, status=Failed
2022-02-07 10:35:19,992|caper.cromwell_metadata|INFO| Wrote metadata file. /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/metadata.json
2022-02-07 10:35:19,993|caper.cromwell|INFO| Workflow failed. Auto-troubleshooting...
2022-02-07 10:35:20,000|caper.nb_subproc_thread|ERROR| Cromwell failed. returncode=1
2022-02-07 10:35:20,000|caper.cli|ERROR| Check stdout in /n/scratch3/users/t/tht900/atac-testing/cromwell.out.2
* Started troubleshooting workflow: id=69c5d7ab-4e56-4d62-9a66-49f973ae6af3, status=Failed
* Found failures JSON object.
[
    {
        "message": "Workflow failed",
        "causedBy": [
            {
                "causedBy": [],
                "message": "Job atac.gc_bias:0:2 exited with return code 1 which has not been declared as a valid return code. See 'continueOnReturnCode' runtime attribute for more details."
            }
        ]
    }
]
* Recursively finding failures in calls (tasks)...

==== NAME=atac.gc_bias, STATUS=RetryableFailure, PARENT=
SHARD_IDX=0, RC=1, JOB_ID=19664
START=2022-02-07T14:36:59.853Z, END=2022-02-07T14:52:23.712Z
STDOUT=/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/stdout
STDERR=/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/stderr
STDERR_CONTENTS=
Picked up _JAVA_OPTIONS: -Djava.io.tmpdir=/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/tmp.7755ee33
INFO    2022-02-07 09:47:07 CollectGcBiasMetrics    

********** NOTE: Picard's command line syntax is changing.
**********
********** For more information, please see:
********** https://github.com/broadinstitute/picard/wiki/Command-Line-Syntax-Transition-For-Users-(Pre-Transition)
**********
********** The command line looks like this in the new syntax:
**********
**********    CollectGcBiasMetrics -R /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/inputs/-1609375697/GRCh38_no_alt_analysis_set_GCA_000001405_hsv.15.fasta.gz -I ./ATRX_KO_4hr_R1_rep2.sub.dedups.sort.no_rg.bam -O ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gc.txt -USE_JDK_DEFLATER TRUE -USE_JDK_INFLATER TRUE -VERBOSITY ERROR -QUIET TRUE -ASSUME_SORTED FALSE -CHART ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gcPlot.pdf -S ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gcSummary.txt
**********

Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: Index 17009 out of bounds for length 16570
    at picard.analysis.GcBiasMetricsCollector.addRead(GcBiasMetricsCollector.java:384)
    at picard.analysis.GcBiasMetricsCollector.access$600(GcBiasMetricsCollector.java:48)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.addReadToGcData(GcBiasMetricsCollector.java:221)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.acceptRecord(GcBiasMetricsCollector.java:155)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.acceptRecord(GcBiasMetricsCollector.java:100)
    at picard.metrics.MultiLevelCollector$AllReadsDistributor.acceptRecord(MultiLevelCollector.java:192)
    at picard.metrics.MultiLevelCollector.acceptRecord(MultiLevelCollector.java:315)
    at picard.analysis.CollectGcBiasMetrics.acceptRead(CollectGcBiasMetrics.java:172)
    at picard.analysis.SinglePassSamProgram.makeItSo(SinglePassSamProgram.java:145)
    at picard.analysis.SinglePassSamProgram.doWork(SinglePassSamProgram.java:84)
    at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:305)
    at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:103)
    at picard.cmdline.PicardCommandLine.main(PicardCommandLine.java:113)
Traceback (most recent call last):
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 143, in <module>
    main()
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 132, in main
    plot_gc(gc_out, OUTPUT_PREFIX)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 80, in plot_gc
    data = pd.read_table(data_file, comment="#")
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 767, in read_table
    return read_csv(**locals())
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 688, in read_csv
    return _read(filepath_or_buffer, kwds)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 454, in _read
    parser = TextFileReader(fp_or_buf, **kwds)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 948, in __init__
    self._make_engine(self.engine)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 1180, in _make_engine
    self._engine = CParserWrapper(self.f, **self.options)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 2010, in __init__
    self._reader = parsers.TextReader(src, **kwds)
  File "pandas/_libs/parsers.pyx", line 382, in pandas._libs.parsers.TextReader.__cinit__
  File "pandas/_libs/parsers.pyx", line 674, in pandas._libs.parsers.TextReader._setup_parser_source
FileNotFoundError: [Errno 2] No such file or directory: 'ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gc.txt'
ln: failed to access ‘/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/*.gc_plot.png’: No such file or directory
ln: failed to access ‘/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/*.gc.txt’: No such file or directory

STDERR_BACKGROUND_CONTENTS=

==== NAME=atac.gc_bias, STATUS=Failed, PARENT=
SHARD_IDX=0, RC=1, JOB_ID=29676
START=2022-02-07T14:52:25.842Z, END=2022-02-07T15:10:22.942Z
STDOUT=/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/stdout
STDERR=/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/stderr
STDERR_CONTENTS=
Picked up _JAVA_OPTIONS: -Djava.io.tmpdir=/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/tmp.d45903f0
INFO    2022-02-07 10:03:19 CollectGcBiasMetrics    

********** NOTE: Picard's command line syntax is changing.
**********
********** For more information, please see:
********** https://github.com/broadinstitute/picard/wiki/Command-Line-Syntax-Transition-For-Users-(Pre-Transition)
**********
********** The command line looks like this in the new syntax:
**********
**********    CollectGcBiasMetrics -R /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/inputs/-1609375697/GRCh38_no_alt_analysis_set_GCA_000001405_hsv.15.fasta.gz -I ./ATRX_KO_4hr_R1_rep2.sub.dedups.sort.no_rg.bam -O ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gc.txt -USE_JDK_DEFLATER TRUE -USE_JDK_INFLATER TRUE -VERBOSITY ERROR -QUIET TRUE -ASSUME_SORTED FALSE -CHART ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gcPlot.pdf -S ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gcSummary.txt
**********

Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: Index 17009 out of bounds for length 16570
    at picard.analysis.GcBiasMetricsCollector.addRead(GcBiasMetricsCollector.java:384)
    at picard.analysis.GcBiasMetricsCollector.access$600(GcBiasMetricsCollector.java:48)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.addReadToGcData(GcBiasMetricsCollector.java:221)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.acceptRecord(GcBiasMetricsCollector.java:155)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.acceptRecord(GcBiasMetricsCollector.java:100)
    at picard.metrics.MultiLevelCollector$AllReadsDistributor.acceptRecord(MultiLevelCollector.java:192)
    at picard.metrics.MultiLevelCollector.acceptRecord(MultiLevelCollector.java:315)
    at picard.analysis.CollectGcBiasMetrics.acceptRead(CollectGcBiasMetrics.java:172)
    at picard.analysis.SinglePassSamProgram.makeItSo(SinglePassSamProgram.java:145)
    at picard.analysis.SinglePassSamProgram.doWork(SinglePassSamProgram.java:84)
    at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:305)
    at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:103)
    at picard.cmdline.PicardCommandLine.main(PicardCommandLine.java:113)
Traceback (most recent call last):
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 143, in <module>
    main()
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 132, in main
    plot_gc(gc_out, OUTPUT_PREFIX)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 80, in plot_gc
    data = pd.read_table(data_file, comment="#")
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 767, in read_table
    return read_csv(**locals())
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 688, in read_csv
    return _read(filepath_or_buffer, kwds)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 454, in _read
    parser = TextFileReader(fp_or_buf, **kwds)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 948, in __init__
    self._make_engine(self.engine)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 1180, in _make_engine
    self._engine = CParserWrapper(self.f, **self.options)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 2010, in __init__
    self._reader = parsers.TextReader(src, **kwds)
  File "pandas/_libs/parsers.pyx", line 382, in pandas._libs.parsers.TextReader.__cinit__
  File "pandas/_libs/parsers.pyx", line 674, in pandas._libs.parsers.TextReader._setup_parser_source
FileNotFoundError: [Errno 2] No such file or directory: 'ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gc.txt'
ln: failed to access ‘/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/*.gc_plot.png’: No such file or directory
ln: failed to access ‘/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/*.gc.txt’: No such file or directory

STDERR_BACKGROUND_CONTENTS=
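For context on the stack trace above: `ArrayIndexOutOfBoundsException` in `GcBiasMetricsCollector.addRead` is the pattern Picard tends to produce when a read is aligned past the end of a contig as defined in the reference FASTA handed to `CollectGcBiasMetrics` — i.e. one common cause is that the nodup BAMs were aligned against a different build of the reference than the FASTA in `atac.genome_tsv`. A minimal sketch of such a consistency check follows; the `contig_mismatches` helper and the contig lengths are hypothetical, and in practice the real values would come from the BAM header (`samtools view -H`, `@SQ SN:/LN:` tags) and the FASTA's `.fai` index:

```python
def contig_mismatches(bam_contigs, fasta_contigs):
    """Compare contig lengths from a BAM header against a reference
    FASTA index; any length disagreement can make Picard index past
    the end of its per-contig GC window array."""
    issues = []
    for name, bam_len in bam_contigs.items():
        fa_len = fasta_contigs.get(name)
        if fa_len is None:
            issues.append(f"{name}: present in BAM, absent from FASTA")
        elif fa_len != bam_len:
            issues.append(f"{name}: BAM says {bam_len}, FASTA says {fa_len}")
    return issues

# Illustrative values only: a viral contig that is longer in the BAM
# header than in the genome TSV's FASTA would trigger exactly this error.
bam = {"chr1": 248956422, "HSV1": 152222}
fasta = {"chr1": 248956422, "HSV1": 151974}
print(contig_mismatches(bam, fasta))
# -> ['HSV1: BAM says 152222, FASTA says 151974']
```

If a mismatch shows up, re-aligning against the FASTA referenced in the genome TSV (or rebuilding the genome TSV from the FASTA the BAMs were aligned to) would be the usual fix.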
thaorope commented 2 years ago

cromwell.out

2022-02-07 09:34:37,052 INFO - Running with database db.url = jdbc:hsqldb:mem:f3301075-5e78-4ea0-bd46-51dc6f99f702;shutdown=false;hsqldb.tx=mvcc 2022-02-07 09:34:51,290 INFO - Running migration RenameWorkflowOptionsInMetadata with a read batch size of 100000 and a write batch size of 100000 2022-02-07 09:34:51,339 INFO - [RenameWorkflowOptionsInMetadata] 100% 2022-02-07 09:34:51,597 INFO - Running with database db.url = jdbc:hsqldb:mem:32f55dd3-f0c1-4304-9d26-c8d54a08d98b;shutdown=false;hsqldb.tx=mvcc 2022-02-07 09:34:52,813 INFO - Slf4jLogger started 2022-02-07 09:34:53,219 cromwell-system-akka.dispatchers.engine-dispatcher-9 INFO - Workflow heartbeat configuration: { "cromwellId" : "cromid-c73d3a1", "heartbeatInterval" : "2 minutes", "ttl" : "10 minutes", "failureShutdownDuration" : "5 minutes", "writeBatchSize" : 10000, "writeThreshold" : 10000 } 2022-02-07 09:34:53,585 WARN - 'docker.hash-lookup.gcr-api-queries-per-100-seconds' is being deprecated, use 'docker.hash-lookup.gcr.throttle' instead (see reference.conf) 2022-02-07 09:34:53,688 cromwell-system-akka.dispatchers.service-dispatcher-16 INFO - WriteMetadataActor configured to flush with batch size 200 and process rate 5 seconds. 2022-02-07 09:34:53,694 cromwell-system-akka.actor.default-dispatcher-8 INFO - KvWriteActor configured to flush with batch size 200 and process rate 5 seconds. 2022-02-07 09:34:53,695 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - CallCacheWriteActor configured to flush with batch size 100 and process rate 3 seconds. 2022-02-07 09:34:53,700 cromwell-system-akka.dispatchers.service-dispatcher-20 INFO - Metadata summary refreshing every 1 second. 
2022-02-07 09:34:53,700 cromwell-system-akka.dispatchers.service-dispatcher-20 INFO - No metadata archiver defined in config 2022-02-07 09:34:53,700 cromwell-system-akka.dispatchers.service-dispatcher-20 INFO - No metadata deleter defined in config 2022-02-07 09:34:53,834 cromwell-system-akka.dispatchers.engine-dispatcher-32 INFO - JobExecutionTokenDispenser - Distribution rate: 1 per 2 seconds. 2022-02-07 09:34:54,049 cromwell-system-akka.dispatchers.engine-dispatcher-9 INFO - SingleWorkflowRunnerActor: Version 65 2022-02-07 09:34:54,083 cromwell-system-akka.dispatchers.engine-dispatcher-9 INFO - SingleWorkflowRunnerActor: Submitting workflow 2022-02-07 09:34:54,280 cromwell-system-akka.dispatchers.api-dispatcher-34 INFO - Unspecified type (Unspecified version) workflow 69c5d7ab-4e56-4d62-9a66-49f973ae6af3 submitted 2022-02-07 09:34:54,409 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - SingleWorkflowRunnerActor: Workflow submitted UUID(69c5d7ab-4e56-4d62-9a66-49f973ae6af3) 2022-02-07 09:34:54,436 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - 1 new workflows fetched by cromid-c73d3a1: 69c5d7ab-4e56-4d62-9a66-49f973ae6af3 2022-02-07 09:34:54,462 cromwell-system-akka.dispatchers.engine-dispatcher-11 INFO - WorkflowManagerActor: Starting workflow UUID(69c5d7ab-4e56-4d62-9a66-49f973ae6af3) 2022-02-07 09:34:54,479 cromwell-system-akka.dispatchers.engine-dispatcher-11 INFO - WorkflowManagerActor: Successfully started WorkflowActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 2022-02-07 09:34:54,479 cromwell-system-akka.dispatchers.engine-dispatcher-11 INFO - Retrieved 1 workflows from the WorkflowStoreActor 2022-02-07 09:34:54,514 cromwell-system-akka.dispatchers.engine-dispatcher-11 INFO - WorkflowStoreHeartbeatWriteActor configured to flush with batch size 10000 and process rate 2 minutes. 
2022-02-07 09:34:54,805 cromwell-system-akka.dispatchers.engine-dispatcher-9 INFO - MaterializeWorkflowDescriptorActor [UUID(69c5d7ab)]: Parsing workflow as WDL 1.0 2022-02-07 09:34:58,892 cromwell-system-akka.dispatchers.engine-dispatcher-11 INFO - Not triggering log of token queue status. Effective log interval = None 2022-02-07 09:35:02,015 cromwell-system-akka.dispatchers.engine-dispatcher-9 INFO - MaterializeWorkflowDescriptorActor [UUID(69c5d7ab)]: Call-to-Backend assignments: atac.count_signal_track -> slurm, atac.spr -> slurm, atac.jsd -> slurm, atac.pool_ta -> slurm, atac.count_signal_track_pooled -> slurm, atac.pool_blacklist -> slurm, atac.reproducibility_overlap -> slurm, atac.call_peak_ppr2 -> slurm, atac.reproducibility_idr -> slurm, atac.align -> slurm, atac.pool_ta_pr1 -> slurm, atac.bam2ta -> slurm, atac.align_mito -> slurm, atac.call_peak_pr1 -> slurm, atac.preseq -> slurm, atac.idr_pr -> slurm, atac.xcor -> slurm, atac.idr -> slurm, atac.tss_enrich -> slurm, atac.qc_report -> slurm, atac.read_genome_tsv -> slurm, atac.call_peak -> slurm, atac.compare_signal_to_roadmap -> slurm, atac.bam2ta_no_dedup -> slurm, atac.call_peak_pooled -> slurm, atac.overlap_ppr -> slurm, atac.pool_ta_pr2 -> slurm, atac.call_peak_pr2 -> slurm, atac.gc_bias -> slurm, atac.idr_ppr -> slurm, atac.error_input_data -> slurm, atac.fraglen_stat_pe -> slurm, atac.macs2_signal_track -> slurm, atac.frac_mito -> slurm, atac.call_peak_ppr1 -> slurm, atac.annot_enrich -> slurm, atac.filter_no_dedup -> slurm, atac.filter -> slurm, atac.macs2_signal_track_pooled -> slurm, atac.overlap_pr -> slurm, atac.overlap -> slurm 2022-02-07 09:35:02,560 cromwell-system-akka.dispatchers.backend-dispatcher-38 WARN - slurm [UUID(69c5d7ab)]: Key/s [disks, docker] is/are not supported by backend. Unsupported attributes will not be part of job executions. 
2022-02-07 09:35:02,572 cromwell-system-akka.dispatchers.backend-dispatcher-38 WARN - slurm [UUID(69c5d7ab)]: Key/s [preemptible, disks, docker] is/are not supported by backend. Unsupported attributes will not be part of job executions.
2022-02-07 09:35:07,175 cromwell-system-akka.dispatchers.engine-dispatcher-33 INFO - WorkflowExecutionActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 [UUID(69c5d7ab)]: Starting atac.read_genome_tsv
2022-02-07 09:35:07,848 cromwell-system-akka.dispatchers.engine-dispatcher-32 INFO - Assigned new job execution tokens to the following groups: 69c5d7ab: 1
2022-02-07 09:35:08,094 cromwell-system-akka.dispatchers.engine-dispatcher-31 INFO - 69c5d7ab-4e56-4d62-9a66-49f973ae6af3-EngineJobExecutionActor-atac.read_genome_tsv:NA:1 [UUID(69c5d7ab)]: Could not copy a suitable cache hit for 69c5d7ab:atac.read_genome_tsv:-1:1. No copy attempts were made.
2022-02-07 09:35:08,170 cromwell-system-akka.dispatchers.backend-dispatcher-41 WARN - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.read_genome_tsv:NA:1]: Unrecognized runtime attribute keys: disks, docker
2022-02-07 09:35:08,344 cromwell-system-akka.dispatchers.backend-dispatcher-41 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.read_genome_tsv:NA:1]:
echo "$(basename /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/inputs/-1609375697/hg38_hsv.tsv)" > genome_name

# create empty files for all entries
touch ref_fa bowtie2_idx_tar chrsz gensz blacklist blacklist2
touch ref_mito_fa
touch bowtie2_mito_idx_tar
touch tss tss_enrich # for backward compatibility
touch dnase prom enh reg2map reg2map_bed roadmap_meta
touch mito_chr_name
touch regex_bfilt_peak_chr_name

python <<CODE
import os
with open('/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/inputs/-1609375697/hg38_hsv.tsv','r') as fp:
    for line in fp:
        arr = line.strip('\n').split('\t')
        if arr:
            key, val = arr
            with open(key,'w') as fp2:
                fp2.write(val)
CODE
2022-02-07 09:35:08,887 cromwell-system-akka.dispatchers.backend-dispatcher-41 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.read_genome_tsv:NA:1]: executing: cat << EOF > /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/execution/script.caper

#!/bin/bash

if [ 'true' == 'true' ] && [ 'conda' == 'singularity' ] || \
   [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' ]
then
    mkdir -p $HOME/.singularity/lock/
    flock --exclusive --timeout 600 \
        $HOME/.singularity/lock/`echo -n 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' | md5sum | cut -d' ' -f1` \
        singularity exec --containall https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif echo 'Successfully pulled https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif'

singularity exec --cleanenv --home=`dirname /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv` \
    --bind=, \
     \
    https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/execution/script

elif [ 'true' == 'true' ] && [ 'conda' == 'conda' ] || \
     [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'encode-atac-seq-pipeline' ]
then
    conda run --name=encode-atac-seq-pipeline /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/execution/script

else
    /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/execution/script
fi

EOF

for ITER in 1 2 3
do
    sbatch --export=ALL -J cromwell_69c5d7ab_read_genome_tsv \
        -D /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv \
        -o /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/execution/stdout \
        -e /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/execution/stderr \
        -p medium --account knipe_dmk2 \
        -n 1 --ntasks-per-node=1 --cpus-per-task=1 --mem=2048M --time=780 \
        /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-read_genome_tsv/execution/script.caper \
        && exit 0
    sleep 30
done
exit 1
2022-02-07 09:35:13,720 cromwell-system-akka.dispatchers.backend-dispatcher-41 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.read_genome_tsv:NA:1]: job id: 19415
2022-02-07 09:35:13,726 cromwell-system-akka.dispatchers.backend-dispatcher-43 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.read_genome_tsv:NA:1]: Status change from - to WaitingForReturnCode
2022-02-07 09:35:23,600 cromwell-system-akka.dispatchers.engine-dispatcher-9 INFO - WorkflowExecutionActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 [UUID(69c5d7ab)]: Starting atac.fraglen_stat_pe
2022-02-07 09:35:23,841 cromwell-system-akka.dispatchers.engine-dispatcher-32 INFO - Assigned new job execution tokens to the following groups: 69c5d7ab: 1
2022-02-07 09:35:23,869 cromwell-system-akka.dispatchers.engine-dispatcher-29 INFO - 69c5d7ab-4e56-4d62-9a66-49f973ae6af3-EngineJobExecutionActor-atac.fraglen_stat_pe:0:1 [UUID(69c5d7ab)]: Could not copy a suitable cache hit for 69c5d7ab:atac.fraglen_stat_pe:0:1. No copy attempts were made.
2022-02-07 09:35:23,869 cromwell-system-akka.dispatchers.backend-dispatcher-46 WARN - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.fraglen_stat_pe:0:1]: Unrecognized runtime attribute keys: disks, docker
2022-02-07 09:35:23,921 cromwell-system-akka.dispatchers.backend-dispatcher-46 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.fraglen_stat_pe:0:1]:
set -e
python3 $(which encode_task_fraglen_stat_pe.py) \
    --nodup-bam /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/inputs/-1090778166/ATRX_KO_4hr_R1_rep2.sub.dedups.sort.bam \
    --picard-java-heap 7G
2022-02-07 09:35:23,965 cromwell-system-akka.dispatchers.backend-dispatcher-46 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.fraglen_stat_pe:0:1]: executing: cat << EOF > /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/execution/script.caper

#!/bin/bash

if [ 'true' == 'true' ] && [ 'conda' == 'singularity' ] || \
   [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' ]
then
    mkdir -p $HOME/.singularity/lock/
    flock --exclusive --timeout 600 \
        $HOME/.singularity/lock/`echo -n 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' | md5sum | cut -d' ' -f1` \
        singularity exec --containall https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif echo 'Successfully pulled https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif'

singularity exec --cleanenv --home=`dirname /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0` \
    --bind=, \
     \
    https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/execution/script

elif [ 'true' == 'true' ] && [ 'conda' == 'conda' ] || \
     [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'encode-atac-seq-pipeline' ]
then
    conda run --name=encode-atac-seq-pipeline /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/execution/script

else
    /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/execution/script
fi

EOF

for ITER in 1 2 3
do
    sbatch --export=ALL -J cromwell_69c5d7ab_fraglen_stat_pe \
        -D /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0 \
        -o /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/execution/stdout \
        -e /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/execution/stderr \
        -p medium --account knipe_dmk2 \
        -n 1 --ntasks-per-node=1 --cpus-per-task=1 --mem=8192M --time=780 \
        /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-fraglen_stat_pe/shard-0/execution/script.caper \
        && exit 0
    sleep 30
done
exit 1
2022-02-07 09:35:28,709 cromwell-system-akka.dispatchers.backend-dispatcher-47 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.fraglen_stat_pe:0:1]: job id: 19459
2022-02-07 09:35:28,716 cromwell-system-akka.dispatchers.backend-dispatcher-41 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.fraglen_stat_pe:0:1]: Status change from - to WaitingForReturnCode
2022-02-07 09:36:50,791 cromwell-system-akka.dispatchers.backend-dispatcher-41 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.read_genome_tsv:NA:1]: Status change from WaitingForReturnCode to Done
2022-02-07 09:36:57,401 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - WorkflowExecutionActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 [UUID(69c5d7ab)]: Starting atac.bam2ta
2022-02-07 09:36:57,840 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - Assigned new job execution tokens to the following groups: 69c5d7ab: 1
2022-02-07 09:36:57,858 cromwell-system-akka.dispatchers.engine-dispatcher-33 INFO - 69c5d7ab-4e56-4d62-9a66-49f973ae6af3-EngineJobExecutionActor-atac.bam2ta:0:1 [UUID(69c5d7ab)]: Could not copy a suitable cache hit for 69c5d7ab:atac.bam2ta:0:1. No copy attempts were made.
2022-02-07 09:36:57,858 cromwell-system-akka.dispatchers.backend-dispatcher-46 WARN - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.bam2ta:0:1]: Unrecognized runtime attribute keys: disks, docker
2022-02-07 09:36:57,883 cromwell-system-akka.dispatchers.backend-dispatcher-46 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.bam2ta:0:1]:
set -e
python3 $(which encode_task_bam2ta.py) \
    /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/inputs/-1090778166/ATRX_KO_4hr_R1_rep2.sub.dedups.sort.bam \
    --paired-end \
    --mito-chr-name chrM \
    --subsample 0 \
    --mem-gb 3.724710758626461 \
    --nth 2
2022-02-07 09:36:57,906 cromwell-system-akka.dispatchers.backend-dispatcher-46 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.bam2ta:0:1]: executing: cat << EOF > /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/execution/script.caper

#!/bin/bash

if [ 'true' == 'true' ] && [ 'conda' == 'singularity' ] || \
   [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' ]
then
    mkdir -p $HOME/.singularity/lock/
    flock --exclusive --timeout 600 \
        $HOME/.singularity/lock/`echo -n 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' | md5sum | cut -d' ' -f1` \
        singularity exec --containall https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif echo 'Successfully pulled https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif'

singularity exec --cleanenv --home=`dirname /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0` \
    --bind=, \
     \
    https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/execution/script

elif [ 'true' == 'true' ] && [ 'conda' == 'conda' ] || \
     [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'encode-atac-seq-pipeline' ]
then
    conda run --name=encode-atac-seq-pipeline /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/execution/script

else
    /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/execution/script
fi

EOF

for ITER in 1 2 3
do
    sbatch --export=ALL -J cromwell_69c5d7ab_bam2ta \
        -D /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0 \
        -o /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/execution/stdout \
        -e /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/execution/stderr \
        -p medium --account knipe_dmk2 \
        -n 1 --ntasks-per-node=1 --cpus-per-task=2 --mem=4767M --time=780 \
        /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-bam2ta/shard-0/execution/script.caper \
        && exit 0
    sleep 30
done
exit 1
2022-02-07 09:36:58,710 cromwell-system-akka.dispatchers.backend-dispatcher-71 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.bam2ta:0:1]: job id: 19645
2022-02-07 09:36:58,714 cromwell-system-akka.dispatchers.backend-dispatcher-69 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.bam2ta:0:1]: Status change from - to WaitingForReturnCode
2022-02-07 09:36:59,443 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - WorkflowExecutionActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 [UUID(69c5d7ab)]: Starting atac.gc_bias
2022-02-07 09:36:59,839 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - Assigned new job execution tokens to the following groups: 69c5d7ab: 1
2022-02-07 09:36:59,848 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - 69c5d7ab-4e56-4d62-9a66-49f973ae6af3-EngineJobExecutionActor-atac.gc_bias:0:1 [UUID(69c5d7ab)]: Could not copy a suitable cache hit for 69c5d7ab:atac.gc_bias:0:1. No copy attempts were made.
2022-02-07 09:36:59,852 cromwell-system-akka.dispatchers.backend-dispatcher-71 WARN - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:1]: Unrecognized runtime attribute keys: disks, docker
2022-02-07 09:36:59,881 cromwell-system-akka.dispatchers.backend-dispatcher-71 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:1]:
set -e
python3 $(which encode_task_gc_bias.py) \
    --nodup-bam /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/inputs/-1090778166/ATRX_KO_4hr_R1_rep2.sub.dedups.sort.bam \
    --ref-fa /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/inputs/-1609375697/GRCh38_no_alt_analysis_set_GCA_000001405_hsv.15.fasta.gz \
    --picard-java-heap 4G
2022-02-07 09:36:59,896 cromwell-system-akka.dispatchers.backend-dispatcher-71 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:1]: executing: cat << EOF > /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/script.caper

#!/bin/bash

if [ 'true' == 'true' ] && [ 'conda' == 'singularity' ] || \
   [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' ]
then
    mkdir -p $HOME/.singularity/lock/
    flock --exclusive --timeout 600 \
        $HOME/.singularity/lock/`echo -n 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' | md5sum | cut -d' ' -f1` \
        singularity exec --containall https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif echo 'Successfully pulled https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif'

singularity exec --cleanenv --home=`dirname /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0` \
    --bind=, \
     \
    https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/script

elif [ 'true' == 'true' ] && [ 'conda' == 'conda' ] || \
     [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'encode-atac-seq-pipeline' ]
then
    conda run --name=encode-atac-seq-pipeline /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/script

else
    /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/script
fi

EOF

for ITER in 1 2 3
do
    sbatch --export=ALL -J cromwell_69c5d7ab_gc_bias \
        -D /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0 \
        -o /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/stdout \
        -e /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/stderr \
        -p medium --account knipe_dmk2 \
        -n 1 --ntasks-per-node=1 --cpus-per-task=1 --mem=4767M --time=780 \
        /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/execution/script.caper \
        && exit 0
    sleep 30
done
exit 1
2022-02-07 09:37:03,710 cromwell-system-akka.dispatchers.backend-dispatcher-71 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:1]: job id: 19664
2022-02-07 09:37:03,715 cromwell-system-akka.dispatchers.backend-dispatcher-69 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:1]: Status change from - to WaitingForReturnCode
2022-02-07 09:37:04,524 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - WorkflowExecutionActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 [UUID(69c5d7ab)]: Starting atac.jsd
2022-02-07 09:37:05,841 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - Assigned new job execution tokens to the following groups: 69c5d7ab: 1
2022-02-07 09:37:05,852 cromwell-system-akka.dispatchers.engine-dispatcher-33 INFO - 69c5d7ab-4e56-4d62-9a66-49f973ae6af3-EngineJobExecutionActor-atac.jsd:NA:1 [UUID(69c5d7ab)]: Could not copy a suitable cache hit for 69c5d7ab:atac.jsd:-1:1. No copy attempts were made.
2022-02-07 09:37:05,852 cromwell-system-akka.dispatchers.backend-dispatcher-46 WARN - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.jsd:NA:1]: Unrecognized runtime attribute keys: disks, docker
2022-02-07 09:37:05,873 cromwell-system-akka.dispatchers.backend-dispatcher-46 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.jsd:NA:1]:
set -e
python3 $(which encode_task_jsd.py) \
    /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/inputs/-1090778166/ATRX_KO_4hr_R1_rep2.sub.dedups.sort.bam \
    --mapq-thresh 30 \
    --blacklist /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/inputs/-1609375697/hg38.blacklist.bed.gz \
    --nth 4
2022-02-07 09:37:05,887 cromwell-system-akka.dispatchers.backend-dispatcher-46 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.jsd:NA:1]: executing: cat << EOF > /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/execution/script.caper

#!/bin/bash

if [ 'true' == 'true' ] && [ 'conda' == 'singularity' ] || \
   [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' ]
then
    mkdir -p $HOME/.singularity/lock/
    flock --exclusive --timeout 600 \
        $HOME/.singularity/lock/`echo -n 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' | md5sum | cut -d' ' -f1` \
        singularity exec --containall https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif echo 'Successfully pulled https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif'

singularity exec --cleanenv --home=`dirname /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd` \
    --bind=, \
     \
    https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/execution/script

elif [ 'true' == 'true' ] && [ 'conda' == 'conda' ] || \
     [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'encode-atac-seq-pipeline' ]
then
    conda run --name=encode-atac-seq-pipeline /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/execution/script

else
    /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/execution/script
fi

EOF

for ITER in 1 2 3
do
    sbatch --export=ALL -J cromwell_69c5d7ab_jsd \
        -D /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd \
        -o /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/execution/stdout \
        -e /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/execution/stderr \
        -p medium --account knipe_dmk2 \
        -n 1 --ntasks-per-node=1 --cpus-per-task=4 --mem=5343M --time=780 \
        /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-jsd/execution/script.caper \
        && exit 0
    sleep 30
done
exit 1
2022-02-07 09:37:08,718 cromwell-system-akka.dispatchers.backend-dispatcher-71 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.jsd:NA:1]: job id: 19699
2022-02-07 09:37:08,725 cromwell-system-akka.dispatchers.backend-dispatcher-70 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.jsd:NA:1]: Status change from - to WaitingForReturnCode
2022-02-07 09:47:02,971 cromwell-system-akka.dispatchers.backend-dispatcher-176 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.fraglen_stat_pe:0:1]: Status change from WaitingForReturnCode to Done
2022-02-07 09:51:28,087 cromwell-system-akka.dispatchers.backend-dispatcher-226 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.jsd:NA:1]: Status change from WaitingForReturnCode to Done
2022-02-07 09:52:19,290 cromwell-system-akka.dispatchers.backend-dispatcher-261 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:1]: Status change from WaitingForReturnCode to Done
2022-02-07 09:52:23,719 cromwell-system-akka.dispatchers.engine-dispatcher-33 INFO - WorkflowExecutionActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 [UUID(69c5d7ab)]: Retrying job execution for atac.gc_bias:0:2
2022-02-07 09:52:23,981 cromwell-system-akka.dispatchers.engine-dispatcher-10 INFO - WorkflowExecutionActor-69c5d7ab-4e56-4d62-9a66-49f973ae6af3 [UUID(69c5d7ab)]: Starting atac.gc_bias
2022-02-07 09:52:25,831 cromwell-system-akka.dispatchers.engine-dispatcher-33 INFO - Assigned new job execution tokens to the following groups: 69c5d7ab: 1
2022-02-07 09:52:25,857 cromwell-system-akka.dispatchers.backend-dispatcher-261 WARN - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:2]: Unrecognized runtime attribute keys: disks, docker
2022-02-07 09:52:25,883 cromwell-system-akka.dispatchers.backend-dispatcher-261 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:2]:
set -e
python3 $(which encode_task_gc_bias.py) \
    --nodup-bam /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/inputs/-1090778166/ATRX_KO_4hr_R1_rep2.sub.dedups.sort.bam \
    --ref-fa /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/inputs/-1609375697/GRCh38_no_alt_analysis_set_GCA_000001405_hsv.15.fasta.gz \
    --picard-java-heap 4G
2022-02-07 09:52:25,919 cromwell-system-akka.dispatchers.backend-dispatcher-261 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:2]: executing: cat << EOF > /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/script.caper

#!/bin/bash

if [ 'true' == 'true' ] && [ 'conda' == 'singularity' ] || \
   [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' ]
then
    mkdir -p $HOME/.singularity/lock/
    flock --exclusive --timeout 600 \
        $HOME/.singularity/lock/`echo -n 'https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif' | md5sum | cut -d' ' -f1` \
        singularity exec --containall https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif echo 'Successfully pulled https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif'

singularity exec --cleanenv --home=`dirname /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2` \
    --bind=, \
     \
    https://encode-pipeline-singularity-image.s3.us-west-2.amazonaws.com/atac-seq-pipeline_v2.0.3.sif /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/script

elif [ 'true' == 'true' ] && [ 'conda' == 'conda' ] || \
     [ 'true' == 'false' ] && [ 'true' == 'true' ] && [ ! -z 'encode-atac-seq-pipeline' ]
then
    conda run --name=encode-atac-seq-pipeline /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/script

else
    /bin/bash /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/script
fi

EOF

for ITER in 1 2 3
do
    sbatch --export=ALL -J cromwell_69c5d7ab_gc_bias \
        -D /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2 \
        -o /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/stdout \
        -e /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/stderr \
        -p medium --account knipe_dmk2 \
        -n 1 --ntasks-per-node=1 --cpus-per-task=1 --mem=4767M --time=780 \
        /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/script.caper \
        && exit 0
    sleep 30
done
exit 1
2022-02-07 09:52:28,701 cromwell-system-akka.dispatchers.backend-dispatcher-275 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:2]: job id: 29676
2022-02-07 09:52:28,704 cromwell-system-akka.dispatchers.backend-dispatcher-278 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:2]: Status change from - to WaitingForReturnCode
2022-02-07 10:10:22,926 cromwell-system-akka.dispatchers.backend-dispatcher-480 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.gc_bias:0:2]: Status change from WaitingForReturnCode to Done
2022-02-07 10:34:20,682 cromwell-system-akka.dispatchers.backend-dispatcher-513 INFO - BackgroundConfigAsyncJobExecutionActor [UUID(69c5d7ab)atac.bam2ta:0:1]: Status change from WaitingForReturnCode to Done
2022-02-07 10:34:21,347 cromwell-system-akka.dispatchers.engine-dispatcher-29 INFO - WorkflowManagerActor: Workflow 69c5d7ab-4e56-4d62-9a66-49f973ae6af3 failed (during ExecutingWorkflowState): Job atac.gc_bias:0:2 exited with return code 1 which has not been declared as a valid return code. See 'continueOnReturnCode' runtime attribute for more details.
Check the content of stderr for potential additional information: /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/execution/stderr.
[First 3000 bytes]:
Picked up _JAVA_OPTIONS: -Djava.io.tmpdir=/n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/tmp.d45903f0
INFO 2022-02-07 10:03:19 CollectGcBiasMetrics

** NOTE: Picard's command line syntax is changing.


** For more information, please see:
** https://github.com/broadinstitute/picard/wiki/Command-Line-Syntax-Transition-For-Users-(Pre-Transition)


** The command line looks like this in the new syntax:


** CollectGcBiasMetrics -R /n/scratch3/users/t/tht900/atac-testing/atac/69c5d7ab-4e56-4d62-9a66-49f973ae6af3/call-gc_bias/shard-0/attempt-2/inputs/-1609375697/GRCh38_no_alt_analysis_set_GCA_000001405_hsv.15.fasta.gz -I ./ATRX_KO_4hr_R1_rep2.sub.dedups.sort.no_rg.bam -O ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gc.txt -USE_JDK_DEFLATER TRUE -USE_JDK_INFLATER TRUE -VERBOSITY ERROR -QUIET TRUE -ASSUME_SORTED FALSE -CHART ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gcPlot.pdf -S ATRX_KO_4hr_R1_rep2.sub.dedups.sort.gcSummary.txt


Exception in thread "main" java.lang.ArrayIndexOutOfBoundsException: Index 17009 out of bounds for length 16570
    at picard.analysis.GcBiasMetricsCollector.addRead(GcBiasMetricsCollector.java:384)
    at picard.analysis.GcBiasMetricsCollector.access$600(GcBiasMetricsCollector.java:48)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.addReadToGcData(GcBiasMetricsCollector.java:221)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.acceptRecord(GcBiasMetricsCollector.java:155)
    at picard.analysis.GcBiasMetricsCollector$PerUnitGcBiasMetricsCollector.acceptRecord(GcBiasMetricsCollector.java:100)
    at picard.metrics.MultiLevelCollector$AllReadsDistributor.acceptRecord(MultiLevelCollector.java:192)
    at picard.metrics.MultiLevelCollector.acceptRecord(MultiLevelCollector.java:315)
    at picard.analysis.CollectGcBiasMetrics.acceptRead(CollectGcBiasMetrics.java:172)
    at picard.analysis.SinglePassSamProgram.makeItSo(SinglePassSamProgram.java:145)
    at picard.analysis.SinglePassSamProgram.doWork(SinglePassSamProgram.java:84)
    at picard.cmdline.CommandLineProgram.instanceMain(CommandLineProgram.java:305)
    at picard.cmdline.PicardCommandLine.instanceMain(PicardCommandLine.java:103)
    at picard.cmdline.PicardCommandLine.main(PicardCommandLine.java:113)
Traceback (most recent call last):
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 143, in <module>
    main()
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 132, in main
    plot_gc(gc_out, OUTPUT_PREFIX)
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/bin/encode_task_gc_bias.py", line 80, in plot_gc
    data = pd.read_table(data_file, comment="#")
  File "/home/tht900/miniconda3/envs/encode-atac-seq-pipeline/lib/python3.6/site-packages/pandas/io/parsers.py", line 767, in read_table
    return read_csv(**locals())

2022-02-07 10:34:23,717 cromwell-system-akka.dispatchers.engine-dispatcher-33 INFO - WorkflowManagerActor: Workflow actor for 69c5d7ab-4e56-4d62-9a66-49f973ae6af3 completed with status 'Failed'. The workflow will be removed from the workflow store.
2022-02-07 10:35:15,061 cromwell-system-akka.dispatchers.engine-dispatcher-11 INFO - SingleWorkflowRunnerActor workflow finished with status 'Failed'.
2022-02-07 10:35:19,080 cromwell-system-akka.dispatchers.engine-dispatcher-33 INFO - SingleWorkflowRunnerActor writing metadata to /n/scratch3/users/t/tht900/atac-testing/caper_temp/atac_medium_2/20220207_093423_447456/metadata.json
2022-02-07 10:35:19,147 INFO - Workflow polling stopped
2022-02-07 10:35:19,152 INFO - 0 workflows released by cromid-c73d3a1
2022-02-07 10:35:19,156 INFO - Shutting down WorkflowStoreActor - Timeout = 5 seconds
2022-02-07 10:35:19,166 INFO - Shutting down WorkflowLogCopyRouter - Timeout = 5 seconds
2022-02-07 10:35:19,169 cromwell-system-akka.dispatchers.engine-dispatcher-30 INFO - Aborting all running workflows.
2022-02-07 10:35:19,173 INFO - WorkflowStoreActor stopped
2022-02-07 10:35:19,191 INFO - Shutting down JobExecutionTokenDispenser - Timeout = 5 seconds
2022-02-07 10:35:19,193 INFO - JobExecutionTokenDispenser stopped
2022-02-07 10:35:19,196 INFO - WorkflowLogCopyRouter stopped
2022-02-07 10:35:19,197 INFO - Shutting down WorkflowManagerActor - Timeout = 3600 seconds
2022-02-07 10:35:19,198 cromwell-system-akka.dispatchers.engine-dispatcher-30 INFO - WorkflowManagerActor: All workflows finished
2022-02-07 10:35:19,201 INFO - WorkflowManagerActor stopped
2022-02-07 10:35:19,744 INFO - Connection pools shut down
2022-02-07 10:35:19,744 INFO - Shutting down SubWorkflowStoreActor - Timeout = 1800 seconds
2022-02-07 10:35:19,745 INFO - Shutting down JobStoreActor - Timeout = 1800 seconds
2022-02-07 10:35:19,746 INFO - Shutting down CallCacheWriteActor - Timeout = 1800 seconds
2022-02-07 10:35:19,748 INFO - CallCacheWriteActor Shutting down: 0 queued messages to process
2022-02-07 10:35:19,750 INFO - Shutting down ServiceRegistryActor - Timeout = 1800 seconds
2022-02-07 10:35:19,751 INFO - Shutting down DockerHashActor - Timeout = 1800 seconds
2022-02-07 10:35:19,808 INFO - Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
2022-02-07 10:35:19,810 INFO - Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
2022-02-07 10:35:19,811 INFO - Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
2022-02-07 10:35:19,818 INFO - Shutting down connection pool: curAllocated=0 idleQueues.size=0 waitQueue.size=0 maxWaitQueueLimit=256 closed=false
2022-02-07 10:35:19,820 INFO - Shutting down IoProxy - Timeout = 1800 seconds
2022-02-07 10:35:19,827 INFO - SubWorkflowStoreActor stopped
2022-02-07 10:35:19,829 INFO - JobStoreActor stopped
2022-02-07 10:35:19,830 INFO - CallCacheWriteActor stopped
2022-02-07 10:35:19,830 INFO - DockerHashActor stopped
2022-02-07 10:35:19,830 INFO - IoProxy stopped
2022-02-07 10:35:19,832 INFO - WriteMetadataActor Shutting down: 0 queued messages to process
2022-02-07 10:35:19,833 INFO - KvWriteActor Shutting down: 0 queued messages to process
2022-02-07 10:35:19,843 INFO - ServiceRegistryActor stopped
2022-02-07 10:35:19,868 INFO - Database closed
2022-02-07 10:35:19,868 INFO - Stream materializer shut down
2022-02-07 10:35:19,870 INFO - WDL HTTP import resolver closed

leepc12 commented 2 years ago

If you don't start from FASTQs, make sure that the genome reference data defined in /home/tht900/ATAC-seq-testing/hg38_hsv_genome_database3/hg38_hsv.tsv matches the reference that was used to align the BAM file ATRX_KO_4hr_R1_rep2.sub.dedups.bam.
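The check described here can be sketched as comparing the contig names and lengths recorded in the BAM header against those of the reference FASTA. This is a minimal illustration, not part of the pipeline: in practice you would fill the two dictionaries from `samtools view -H sample.bam` (the `@SQ` `SN:`/`LN:` fields) and from the reference's `.fai` index; the contig names and lengths below are made up. An `ArrayIndexOutOfBoundsException` in Picard's `CollectGcBiasMetrics` (a read falling into a GC window past the end of a contig) is often a symptom of exactly this kind of mismatch.

```python
def parse_sq_lines(header_lines):
    """Extract {contig: length} from SAM @SQ header lines."""
    contigs = {}
    for line in header_lines:
        if not line.startswith("@SQ"):
            continue
        # Each @SQ line is tab-separated TAG:VALUE fields after the "@SQ" token.
        fields = dict(f.split(":", 1) for f in line.split("\t")[1:])
        contigs[fields["SN"]] = int(fields["LN"])
    return contigs

def compare(bam_contigs, ref_contigs):
    """Report contigs whose presence or length differs between BAM and reference."""
    problems = []
    for name, length in bam_contigs.items():
        if name not in ref_contigs:
            problems.append(f"{name}: in BAM but not in reference")
        elif ref_contigs[name] != length:
            problems.append(f"{name}: BAM says {length}, reference says {ref_contigs[name]}")
    return problems

# Hypothetical example: the HSV contig length differs between BAM and reference.
bam = parse_sq_lines(["@SQ\tSN:chr1\tLN:248956422", "@SQ\tSN:hsv\tLN:152222"])
ref = {"chr1": 248956422, "hsv": 154675}
print(compare(bam, ref))  # ['hsv: BAM says 152222, reference says 154675']
```

An empty result means the BAM and the reference agree on every contig the BAM declares; any reported entry is a candidate cause for downstream tools walking off the end of an array.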

Did you test the pipeline with our test example?

thaorope commented 2 years ago

@leepc12 : Thank you for replying!

I tested the pipeline with your test example, and it worked well.

For my sample:

  1. I tried to run with FASTQ for alignment only - OK.
  2. With the output BAM files from alignment, I subsampled (based on the calculated library complexity) and PCR-deduplicated them, then ran these BAMs through the pipeline to call peaks - this did not work.

I don't quite understand what you mean by "genome reference data matching those used for aligning the BAM file". Could you please elaborate on this?

Thank you for your help!

thaorope commented 2 years ago

Hi @leepc12: I re-ran this multiple times and still got the same error. I can't figure out what is going wrong here. The pipeline runs fine with the test example. Do you have any idea what I should look into with my dataset?

leepc12 commented 2 years ago

Please try the pipeline's own subsampling feature: start from FASTQs and add something like "atac.subsample_reads": 1000000 to your input JSON.
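Applied to the input JSON from this issue, the suggestion might look like the fragment below. This is a sketch, not a verified configuration: the FASTQ paths are placeholders to be replaced with the actual files, and the other fields are carried over from the original JSON (with the `atac.nodup_bams` entry removed so the run starts from FASTQs).

```json
{
    "atac.title" : "ATRX KO 4 hours_Peak Calling",
    "atac.pipeline_type" : "atac",
    "atac.genome_tsv" : "/home/tht900/ATAC-seq-testing/hg38_hsv_genome_database3/hg38_hsv.tsv",
    "atac.paired_end" : true,

    "atac.fastqs_rep1_R1" : ["/path/to/rep1_R1.fastq.gz"],
    "atac.fastqs_rep1_R2" : ["/path/to/rep1_R2.fastq.gz"],
    "atac.fastqs_rep2_R1" : ["/path/to/rep2_R1.fastq.gz"],
    "atac.fastqs_rep2_R2" : ["/path/to/rep2_R2.fastq.gz"],

    "atac.multimapping" : 0,
    "atac.subsample_reads" : 1000000
}
```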