I wonder if there are either 1) existing solutions for this or 2) easy ways to add the ability to run a `looper` pipeline in an ad hoc manner. What I mean by that is this: occasionally, the overhead of a traditional workflow can be a bit daunting, but I really enjoy the ease of dispatching jobs through slurm+looper.
I would love to replace traditional bash for loops with `looper` calls.
An example
I have a folder with hundreds of mixed-type files. Some of these might be bedGraph files. I want to convert these to `.bw` format. I can use `bigtools bedGraphToBigWig`. Traditionally, I might just use a for loop:
```bash
for file in *.bdg; do
    bigtools bedGraphToBigWig "$file" "$file.bw"
done
```
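For comparison, the closest ad hoc pattern I know of today is wrapping each command in `sbatch --wrap`, which does fan the jobs out to the cluster but keeps none of looper's sample bookkeeping or submission templating. A minimal sketch, assuming `bigtools` is on `PATH` on the compute nodes, with purely illustrative resource flags:

```bash
# One Slurm job per file via sbatch --wrap; resource requests are illustrative.
for file in *.bdg; do
    sbatch --job-name="bw_$file" --mem=4G --time=00:30:00 \
        --wrap="bigtools bedGraphToBigWig '$file' '$file.bw'"
done
```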
But the plain for loop takes a while since it goes one-by-one, and there are hundreds. I'd love to fire them all off at once using `looper` and slurm:
```bash
ls *.bdg | looper run "bigtools bedGraphToBigWig {$1} {$1}.bw"
```
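To make the "overhead" concrete: as I understand the current model, even this one-command task needs a sample table, a PEP project config, and a pipeline interface before `looper run` can submit anything. A rough sketch of that route (the file names are mine, and the exact YAML keys may differ across looper/PEP versions):

```bash
# Hypothetical sketch of the conventional looper route; the keys follow my
# reading of the PEP 2.0 / looper docs and may vary by version.

# 1) Build a sample table from the files on disk.
echo "sample_name,file_path" > samples.csv
for file in *.bdg; do
    echo "${file%.bdg},$PWD/$file" >> samples.csv
done

# 2) A PEP project config pointing at the sample table and attaching
#    the pipeline interface to every sample.
cat > project_config.yaml <<'EOF'
pep_version: 2.0.0
sample_table: samples.csv
sample_modifiers:
  append:
    pipeline_interfaces: pipeline_interface.yaml
EOF

# 3) A pipeline interface holding the actual command template.
cat > pipeline_interface.yaml <<'EOF'
pipeline_name: bdg_to_bw
pipeline_type: sample
command_template: >
  bigtools bedGraphToBigWig {sample.file_path} {sample.file_path}.bw
EOF

# 4) Submit every sample as its own Slurm job.
looper run project_config.yaml --package slurm
```

Three files plus a naming scheme for what is conceptually a single templated command.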
I suppose I am trying to identify or nail down a potential gap between traditional workflows and the flexibility researchers often need for quick, ad hoc job submission.