rabix / bunny

[Legacy] Executor for CWL workflows. Executes sbg:draft-2 and CWL 1.0
http://rabix.io
Apache License 2.0

Cleaner thread is executing. There are 1 runnable(s) in the pool. (repeated) #215

Closed. jeremiahsavage closed this issue 7 years ago

jeremiahsavage commented 7 years ago

To make debugging easier, I have recreated the CWL pattern used by the GDC pipeline with plain Unix tools; it exercises the same CWL features but runs in a fraction of the time.

git clone https://github.com/jeremiahsavage/cwl_align_merge_pattern.git

Currently, with the develop branch of bunny, the workflow hangs and repeatedly logs: Cleaner thread is executing. There are 1 runnable(s) in the pool. cwltool runs the same workflow successfully.

I have not yet tried the https://github.com/rabix/bunny/tree/feature/gdc branch.
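
For context, the workflow in that repository has the same shape as the GDC pattern: a scattered per-readgroup step whose outputs feed a merge step (the actual workflow adds a further merge_all step downstream). A rough CWL 1.0 sketch of that shape, using hypothetical tool files (align.cwl, merge.cwl) and the port names that appear in the log below, not the exact contents of cwl_align_merge_pattern:

cwlVersion: v1.0
class: Workflow
requirements:
  ScatterFeatureRequirement: {}

inputs:
  INPUT: File[]      # per-readgroup input files
  OUTPUT: string     # name of the merged output, e.g. test.bam

outputs:
  MERGED_OUTPUT:
    type: File
    outputSource: merge_pe/MERGED_OUTPUT

steps:
  align_pe:
    run: align.cwl   # hypothetical CommandLineTool: one file in, one file out
    scatter: INPUT
    in:
      INPUT: INPUT
    out: [OUTPUT]

  merge_pe:
    run: merge.cwl   # hypothetical CommandLineTool: File[] in, one merged file out
    in:
      INPUT: align_pe/OUTPUT
      OUTPUT: OUTPUT
    out: [MERGED_OUTPUT]

In the log below, the per-lane merge (root.merge_pe) completes, but the workflow never finishes and the cleaner thread keeps reporting one runnable left in the pool.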


$ java -jar ~/code/bunny/rabix-backend-local/target/rabix-backend-local-1.0.0-rc3.jar cwl_align_merge_pattern/workflow.cwl cwl_align_merge_pattern/workflow.yml

... snipped output
...
...
Job root.merge_pe input counters:
 -- Input port INPUT, counter=0
 -- Input port OUTPUT, counter=0

18:37:52.952 [pool-4-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - 
Job root.merge_pe output counters:
 -- Output port MERGED_OUTPUT, counter=0

18:37:52.952 [pool-4-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - DAGNode rootId=191c9c20-eac3-49f1-b4d6-2eb30c67e578, id=root.merge_pe found in cache
18:37:52.952 [pool-4-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - Cache size=1
18:37:52.956 [pool-4-thread-1] DEBUG org.rabix.engine.JobHelper - 
 ---- JobRecord root.merge_pe
 ---- Input INPUT, value [FileValue [size=38, path=/home/jeremiah/cwl_align_merge_pattern/191c9c20-eac3-49f1-b4d6-2eb30c67e578/root/align_pe/1/file1_realign.bam, location=/home/jeremiah/cwl_align_merge_pattern/191c9c20-eac3-49f1-b4d6-2eb30c67e578/root/align_pe/1/file1_realign.bam, checksum=sha1$e6f9fb2f4e90de2cfc53e614a2de1fa60c3becb3, secondaryFiles=[], properties={sbg:metadata=null}]]
 ---- Input OUTPUT, value test.bam

18:37:52.956 [pool-4-thread-1] INFO  o.r.e.service.impl.JobServiceImpl - Job root.merge_pe rootId: 191c9c20-eac3-49f1-b4d6-2eb30c67e578 is completed.
18:37:52.956 [pool-4-thread-1] DEBUG o.r.e.s.i.DefaultEngineStatusCallback - onJobCompleted(jobId=310540ca-99a9-4bfd-9f57-747623798afa)
18:37:52.956 [pool-4-thread-1] DEBUG o.r.e.p.impl.MultiEventProcessorImpl - Root Job 191c9c20-eac3-49f1-b4d6-2eb30c67e578 goes to EventProcessor 2
18:37:52.956 [pool-4-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - DAGNode rootId=191c9c20-eac3-49f1-b4d6-2eb30c67e578, id=root.merge_all found in cache
18:37:52.956 [pool-4-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - Cache size=1
18:37:52.956 [pool-4-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - JobRecord root.merge_all. Decrementing port INPUT.
18:37:52.956 [pool-4-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - 
Job root.merge_all input counters:
 -- Input port INPUT, counter=1
 -- Input port OUTPUT, counter=0

18:37:52.956 [pool-4-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - 
Job root.merge_all output counters:
 -- Output port MERGED_OUTPUT, counter=1

18:37:52.957 [pool-4-thread-1] DEBUG org.rabix.engine.cache.Cache - org.rabix.engine.memory.impl.InMemoryJobRecordRepository@865af66 flushed 2 item(s). Cache hits 2.
18:37:52.957 [pool-4-thread-1] DEBUG org.rabix.engine.cache.Cache - org.rabix.engine.memory.impl.InMemoryVariableRecordRepository@2ebc6005 flushed 2 item(s). Cache hits 2.
18:37:53.163 [JobHandler-Thread-1] DEBUG o.r.e.execution.JobHandlerRunnable - Command STATUS is last command. Stop thread.
18:37:53.163 [JobHandler-Thread-1] INFO  o.r.e.execution.JobHandlerRunnable - JobHandlerRunnable JobHandler-Thread-1 stopped.
18:37:53.163 [JobHandler-Thread-1] INFO  o.r.e.execution.JobHandlerRunnable - JobHandlerRunnable JobHandler-Thread-1 finished.
18:38:27.891 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
18:38:27.893 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context 191c9c20-eac3-49f1-b4d6-2eb30c67e578 and job 29c10ded-758f-492b-abe5-45253ab0ba9b.
18:38:27.893 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context 191c9c20-eac3-49f1-b4d6-2eb30c67e578 and job 274e66ce-9b3b-4b23-865d-d8de94deafb1.
18:38:27.893 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context 191c9c20-eac3-49f1-b4d6-2eb30c67e578 and job 2b6e5ac2-21cb-4185-bb1c-09a91287a645.
18:38:27.893 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context 191c9c20-eac3-49f1-b4d6-2eb30c67e578 and job 92153bdc-41aa-4538-9f66-db2c1fce0167.
18:38:27.893 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context 191c9c20-eac3-49f1-b4d6-2eb30c67e578 and job 310540ca-99a9-4bfd-9f57-747623798afa.
18:38:27.893 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context 191c9c20-eac3-49f1-b4d6-2eb30c67e578 and job a589ab79-0fa0-4a5f-a1e4-608788ade87f.
18:39:27.894 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
18:40:27.895 [pool-11-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.

This mirrors what is seen when running the GDC DNASeq workflow. Snipped output:

17:48:50.610 [JobHandler-Thread-24] DEBUG o.r.b.cwl.CWLCommandLineBuilder - Building command line part for value {metadata=null, format=edam:format_3621, dirname=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite, nameroot=demo, path=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite/demo.db, basename=demo.db, size=556032, nameext=.db, contents=null, checksum=sha1$fd41953d826da3eddcd3403501db340ff535bfaf, location=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite/demo.db, secondaryFiles=[], class=File} and schema {}
17:48:50.610 [JobHandler-Thread-24] DEBUG o.r.b.cwl.CWLCommandLineBuilder - Building command line part for value demo and schema string
17:48:50.610 [JobHandler-Thread-24] INFO  o.r.b.cwl.CWLCommandLineBuilder - Command line built. CommandLine = CommandLine [parts=[/usr/local/bin/merge_sqlite, --source_sqlite, /mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite/demo.db, --uuid, demo], standardIn=null, standardOut=null, standardError=null]
17:48:50.610 [JobHandler-Thread-24] INFO  verbose - Running command line: /usr/local/bin/merge_sqlite --source_sqlite /mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite/demo.db --uuid demo
17:48:50.610 [pool-1-thread-2] INFO  c.s.d.client.DefaultDockerClient - Creating container with ContainerConfig: ContainerConfig{hostname=null, domainname=null, username=null, attachStdin=null, attachStdout=null, attachStderr=null, portSpecs=null, exposedPorts=null, tty=null, openStdin=null, stdinOnce=null, env=[HOME=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe, TMPDIR=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe], cmd=[/bin/sh, -c, /usr/local/bin/merge_sqlite --source_sqlite /mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite/demo.db --uuid demo], image=quay.io/ncigdc/merge_sqlite:3, volumes=[/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq], workingDir=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe, entrypoint=null, networkDisabled=null, onBuild=null, labels=null, macAddress=null, hostConfig=HostConfig{binds=[/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq:/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq:rw], containerIDFile=null, lxcConf=null, privileged=null, portBindings=null, links=null, publishAllPorts=null, dns=null, dnsSearch=null, extraHosts=null, volumesFrom=null, networkMode=null, securityOpt=null, memory=null, memorySwap=null, cpuShares=null, cpusetCpus=null, cpuQuota=null, cgroupParent=null, restartPolicy=null, logConfig=null}, stopSignal=null}
17:48:51.582 [pool-1-thread-3] INFO  c.s.d.client.DefaultDockerClient - Starting container with Id: 80c1aea982a972076ca08919f73871102886c627e80a3a46a218883fd4f21714
17:48:51.992 [JobHandler-Thread-24] INFO  o.r.e.c.impl.DockerContainerHandler - Docker container 80c1aea982a972076ca08919f73871102886c627e80a3a46a218883fd4f21714 has started.
17:48:51.992 [JobHandler-Thread-24] DEBUG o.r.e.execution.JobHandlerRunnable - Command STATUS found. Start execution.
17:48:51.992 [JobHandler-Thread-24] DEBUG o.r.e.execution.JobHandlerRunnable - Command STATUS is repeatable. Delay and put it back to queue.
17:48:52.992 [JobHandler-Thread-24] DEBUG o.r.e.e.command.StatusCommand - Check status for 436bf791-2a8b-47f9-bfa6-038393615ded command line tool.
17:48:52.993 [JobHandler-Thread-24] DEBUG o.r.e.handler.impl.JobHandlerImpl - postprocess(id=436bf791-2a8b-47f9-bfa6-038393615ded)
17:48:52.995 [JobHandler-Thread-24] INFO  o.r.executor.service.CacheService - Trying to find cached results in the directory /Users/janko/Desktop/Archive/varscan/root/.merge_metrics_pe.meta
17:48:52.995 [JobHandler-Thread-24] INFO  o.r.executor.service.CacheService - Cache directory doesn't exist. Directory /Users/janko/Desktop/Archive/varscan/root/.merge_metrics_pe.meta
17:48:52.995 [JobHandler-Thread-24] DEBUG o.r.e.c.impl.DockerContainerHandler - Saving standard error files for id=436bf791-2a8b-47f9-bfa6-038393615ded
17:48:53.000 [JobHandler-Thread-24] DEBUG o.r.e.handler.impl.JobHandlerImpl - isSuccessful()
17:48:53.000 [JobHandler-Thread-24] DEBUG o.r.e.handler.impl.JobHandlerImpl - getExitStatus()
17:48:53.006 [pool-1-thread-3] INFO  c.s.d.client.DefaultDockerClient - Creating container with ContainerConfig: ContainerConfig{hostname=null, domainname=null, username=null, attachStdin=null, attachStdout=null, attachStderr=null, portSpecs=null, exposedPorts=null, tty=null, openStdin=null, stdinOnce=null, env=null, cmd=[sh, -c, chmod -R u+r "/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe"/*;chown -R 1000:1000 "/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe"/*], image=ubuntu:latest, volumes=[/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq], workingDir=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe, entrypoint=null, networkDisabled=null, onBuild=null, labels=null, macAddress=null, hostConfig=HostConfig{binds=[/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq:/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq:rw], containerIDFile=null, lxcConf=null, privileged=null, portBindings=null, links=null, publishAllPorts=null, dns=null, dnsSearch=null, extraHosts=null, volumesFrom=null, networkMode=null, securityOpt=null, memory=null, memorySwap=null, cpuShares=null, cpusetCpus=null, cpuQuota=null, cgroupParent=null, restartPolicy=null, logConfig=null}, stopSignal=null}
17:48:54.052 [pool-1-thread-5] INFO  c.s.d.client.DefaultDockerClient - Starting container with Id: 22f8e11f0425985575bf63c1786aba1fc8d9a3608da4a148fd6727b1ca76b575
17:48:54.461 [JobHandler-Thread-24] INFO  o.r.e.s.i.FilePermissionServiceImpl - Docker container 22f8e11f0425985575bf63c1786aba1fc8d9a3608da4a148fd6727b1ca76b575 has started.
17:49:03.875 [JobHandler-Thread-24] DEBUG org.rabix.bindings.cwl.CWLProcessor - Collecting outputs for root.merge_metrics_pe.
17:49:03.877 [JobHandler-Thread-24] DEBUG org.rabix.bindings.cwl.CWLProcessor - Glob service returned result [/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe/demo.db]
17:49:03.880 [JobHandler-Thread-24] INFO  org.rabix.bindings.cwl.CWLProcessor - Metadata expressions evaluated. Metadata is null.
17:49:03.880 [JobHandler-Thread-24] INFO  o.r.b.c.s.i.CWLMetadataServiceImpl - Metadata for destination_sqlite is null.
17:49:03.880 [JobHandler-Thread-24] INFO  org.rabix.bindings.cwl.CWLProcessor - Metadata for destination_sqlite output is empty.
17:49:03.880 [JobHandler-Thread-24] DEBUG org.rabix.bindings.cwl.CWLProcessor - Collecting outputs for root.merge_metrics_pe.
17:49:03.880 [JobHandler-Thread-24] DEBUG org.rabix.bindings.cwl.CWLProcessor - Glob service returned result [/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe/demo.log]
17:49:03.881 [JobHandler-Thread-24] INFO  org.rabix.bindings.cwl.CWLProcessor - Metadata expressions evaluated. Metadata is null.
17:49:03.881 [JobHandler-Thread-24] INFO  o.r.b.c.s.i.CWLMetadataServiceImpl - Metadata for log is null.
17:49:03.881 [JobHandler-Thread-24] INFO  org.rabix.bindings.cwl.CWLProcessor - Metadata for log output is empty.
17:49:03.881 [JobHandler-Thread-24] DEBUG o.r.e.s.i.NoOpExecutorStatusCallback - onOutputFilesUploadStarted(jobId=436bf791-2a8b-47f9-bfa6-038393615ded)
17:49:03.881 [JobHandler-Thread-24] DEBUG o.r.e.s.i.NoOpExecutorStatusCallback - onOutputFilesUploadComplted(jobId=436bf791-2a8b-47f9-bfa6-038393615ded)
17:49:03.882 [JobHandler-Thread-24] DEBUG o.r.e.handler.impl.JobHandlerImpl - Command line tool 436bf791-2a8b-47f9-bfa6-038393615ded returned result {destination_sqlite=FileValue [size=555008, path=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe/demo.db, location=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe/demo.db, checksum=sha1$8be5eb7982336c0d137708a79fabc0827ff47da4, secondaryFiles=[], properties={sbg:metadata=null}], log=FileValue [size=169, path=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe/demo.log, location=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/merge_metrics_pe/demo.log, checksum=sha1$06680a92ee7acb7859c95199e3ab7fe48e16a5cd, secondaryFiles=[], properties={sbg:metadata=null}]}.
17:49:03.882 [JobHandler-Thread-24] DEBUG o.r.e.handler.impl.JobHandlerImpl - isSuccessful()
17:49:03.882 [JobHandler-Thread-24] DEBUG o.r.e.handler.impl.JobHandlerImpl - getExitStatus()
17:49:03.884 [JobHandler-Thread-24] INFO  o.r.e.e.command.StatusCommand - Job 436bf791-2a8b-47f9-bfa6-038393615ded completed successfully.
17:49:03.884 [JobHandler-Thread-24] DEBUG o.r.e.s.i.NoOpExecutorStatusCallback - onJobCompleted(jobId=436bf791-2a8b-47f9-bfa6-038393615ded)
17:49:03.885 [JobHandler-Thread-24] INFO  verbose - Job root.merge_metrics_pe has completed
17:49:03.885 [JobHandler-Thread-24] DEBUG o.r.e.c.impl.DockerContainerHandler - Removing container with id 80c1aea982a972076ca08919f73871102886c627e80a3a46a218883fd4f21714
17:49:03.885 [pool-22-thread-1] DEBUG o.r.e.service.impl.JobServiceImpl - Update Job 436bf791-2a8b-47f9-bfa6-038393615ded
17:49:03.885 [pool-22-thread-1] DEBUG o.r.e.p.impl.MultiEventProcessorImpl - Root Job f4b5b4f2-6cd3-40c9-9023-74091a9e6e57 goes to EventProcessor 5
17:49:03.885 [pool-22-thread-1] DEBUG o.r.e.p.impl.MultiEventProcessorImpl - Root Job f4b5b4f2-6cd3-40c9-9023-74091a9e6e57 goes to EventProcessor 5
17:49:03.903 [pool-7-thread-1] DEBUG o.r.e.p.impl.MultiEventProcessorImpl - Root Job f4b5b4f2-6cd3-40c9-9023-74091a9e6e57 goes to EventProcessor 5
17:49:03.903 [pool-7-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - JobRecord root.merge_metrics_pe. Decrementing port destination_sqlite.
17:49:03.903 [pool-7-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - 
Job root.merge_metrics_pe input counters:
 -- Input port source_sqlite, counter=0
 -- Input port uuid, counter=0

17:49:03.903 [pool-7-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - 
Job root.merge_metrics_pe output counters:
 -- Output port destination_sqlite, counter=0

17:49:03.903 [pool-7-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - DAGNode rootId=f4b5b4f2-6cd3-40c9-9023-74091a9e6e57, id=root.merge_metrics_pe found in cache
17:49:03.903 [pool-7-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - Cache size=1
17:49:03.903 [pool-7-thread-1] DEBUG org.rabix.engine.JobHelper - 
 ---- JobRecord root.merge_metrics_pe
 ---- Input source_sqlite, value [FileValue [size=556032, path=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite/demo.db, location=/mnt/SCRATCH/gdc-dnaseq-cwl/workflows/dnaseq/f4b5b4f2-6cd3-40c9-9023-74091a9e6e57/root/metrics_pe/1/merge_sqlite/demo.db, checksum=sha1$fd41953d826da3eddcd3403501db340ff535bfaf, secondaryFiles=[], properties={sbg:metadata=null}]]
 ---- Input uuid, value demo

17:49:03.903 [pool-7-thread-1] INFO  o.r.e.service.impl.JobServiceImpl - Job root.merge_metrics_pe rootId: f4b5b4f2-6cd3-40c9-9023-74091a9e6e57 is completed.
17:49:03.903 [pool-7-thread-1] DEBUG o.r.e.s.i.DefaultEngineStatusCallback - onJobCompleted(jobId=436bf791-2a8b-47f9-bfa6-038393615ded)
17:49:03.903 [pool-7-thread-1] DEBUG o.r.e.p.impl.MultiEventProcessorImpl - Root Job f4b5b4f2-6cd3-40c9-9023-74091a9e6e57 goes to EventProcessor 5
17:49:03.904 [pool-7-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - DAGNode rootId=f4b5b4f2-6cd3-40c9-9023-74091a9e6e57, id=root.merge_all_sqlite found in cache
17:49:03.904 [pool-7-thread-1] DEBUG org.rabix.engine.lru.dag.DAGCache - Cache size=1
17:49:03.904 [pool-7-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - JobRecord root.merge_all_sqlite. Decrementing port source_sqlite.
17:49:03.904 [pool-7-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - 
Job root.merge_all_sqlite input counters:
 -- Input port source_sqlite, counter=5
 -- Input port uuid, counter=0

17:49:03.904 [pool-7-thread-1] DEBUG o.r.e.s.impl.JobRecordServiceImpl - 
Job root.merge_all_sqlite output counters:
 -- Output port destination_sqlite, counter=1

17:49:03.904 [pool-7-thread-1] DEBUG org.rabix.engine.cache.Cache - org.rabix.engine.memory.impl.InMemoryJobRecordRepository@31341860 flushed 2 item(s). Cache hits 2.
17:49:03.904 [pool-7-thread-1] DEBUG org.rabix.engine.cache.Cache - org.rabix.engine.memory.impl.InMemoryVariableRecordRepository@240ccda0 flushed 2 item(s). Cache hits 2.
17:49:15.751 [JobHandler-Thread-24] DEBUG o.r.e.execution.JobHandlerRunnable - Command STATUS is last command. Stop thread.
17:49:15.751 [JobHandler-Thread-24] INFO  o.r.e.execution.JobHandlerRunnable - JobHandlerRunnable JobHandler-Thread-24 stopped.
17:49:15.751 [JobHandler-Thread-24] INFO  o.r.e.execution.JobHandlerRunnable - JobHandlerRunnable JobHandler-Thread-24 finished.
17:49:41.305 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
17:49:41.305 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context f4b5b4f2-6cd3-40c9-9023-74091a9e6e57 and job 7b04c3bf-59be-498e-8684-5f6d15ec2b4f.
17:49:41.305 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread removes JobHandlerRunnable for context f4b5b4f2-6cd3-40c9-9023-74091a9e6e57 and job 436bf791-2a8b-47f9-bfa6-038393615ded.
17:50:41.305 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
17:51:41.305 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
17:52:41.305 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
17:53:41.306 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
17:54:41.306 [pool-15-thread-1] DEBUG o.r.e.e.JobHandlerCommandDispatcher - Cleaner thread is executing. There are 1 runnable(s) in the pool.
jeremiahsavage commented 7 years ago

This issue is fixed by the latest commits on the develop branch. Thank you.