I think the Snakemake pipeline should be usable as-is in terms of cores, but the memory requested by each job may need to be adjusted. In addition to the number of cores, do you know how many GB of memory the new node will have? To be safe, it would be good to have a couple of test datasets to run through quickly once the new node is up, rather than waiting until the pipeline needs to run for a real project.
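For reference, here is a minimal sketch (not the actual pipeline; the rule name, file paths, and values are placeholders) of how per-rule memory is typically declared in Snakemake so it can be tuned for the new node:

```python
# Hypothetical Snakemake rule -- names and values are assumptions, not
# taken from the real demux pipeline.
rule example_step:
    input:
        "fastq/{sample}.fastq.gz"
    output:
        "processed/{sample}.out"
    threads: 8
    resources:
        mem_mb=32000   # per-job memory request; adjust to fit the new node
    shell:
        "some_tool --threads {threads} -o {output} {input}"
```

At run time the scheduler can then be told what the node actually has, e.g. `snakemake --cores 28 --resources mem_mb=256000`, and Snakemake packs jobs so their combined `mem_mb` stays under that limit.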
256 GB. I don't know the exact timeline, but once we change over we can try to get a few test runs in. Thanks!
Update: The node switch is potentially happening 10/18. Also, it is now going to be 2 nodes with 28 cores each (256 GB each).
Adjusted the bcl2fastq rule to 24 cores.
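For context, a change like that would presumably look something like the sketch below (a hedged example, not the actual rule; input/output paths and the memory figure are placeholders), with the rule's thread count forwarded to bcl2fastq's processing threads:

```python
# Hypothetical bcl2fastq rule -- paths and mem_mb are placeholders; only
# the threads value reflects the change described above.
rule bcl2fastq:
    input:
        run_dir="runs/{run}"
    output:
        directory("fastq/{run}")
    threads: 24
    resources:
        mem_mb=64000   # assumed value; tune once the new node is available
    shell:
        "bcl2fastq --runfolder-dir {input.run_dir} "
        "--output-dir {output} "
        "--processing-threads {threads}"
```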
Our node setup is likely changing from 2 nodes with 8 cores each to 1 node with 28 cores (potentially with more to come later). I'd like to make sure that our current demux pipeline is set up to use the 28 cores as efficiently as possible. Ideally, I think we should be able to run at least 2 demux runs at a time to minimize waiting; see the sketch below for one way that could look. Let me know if things are set up well for this or if there are changes that would need to be made when this node change occurs.
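One way to do that (a hedged sketch, assuming each run is a separate Snakemake invocation; run directories and limits are assumptions, not taken from the real pipeline) is to give each run roughly half of the node's cores and memory so two can run side by side:

```bash
# Hypothetical concurrent invocations on the 28-core node.
snakemake --directory runs/run_A --cores 14 --resources mem_mb=120000 &
snakemake --directory runs/run_B --cores 14 --resources mem_mb=120000 &
wait
```

The exact split would depend on the most thread-hungry rule; Snakemake scales a rule's threads down to the cores given at invocation, so a 24-thread rule would simply run with 14 threads under this split.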