Open jcblemai opened 3 weeks ago
Including the capability to run subpop-specific configs would also be tied to this, but we can table that for now if it's outside the scope of this issue: https://github.com/HopkinsIDD/RSV_USA/blob/main/SLURM_emcee_job_small_per_subpop.batch
Label
batch, inference, meta/workflow
Priority Label
high priority
Is your feature request related to a problem? Please describe.
One currently needs to write one's own batch script (see examples in Flu_USA) to run emcee.
Is your feature request related to a new application, scenario round, pathogen? Please describe.
No response
Describe the solution you'd like
We run jobs like this (see the "Submitting A Batch Inference Job To Slurm" heading in https://iddynamics.gitbook.io/flepimop/how-to-run/advanced-run-guides/running-on-a-hpc-with-slurm) using inference_job.py, which is very convenient. This script, which we could pull into gempyor so it has access to memory footprints and test runs, allows running local, SLURM, or AWS jobs. When this script detects `method: emcee` in the inference config (see https://iddynamics.gitbook.io/flepimop/model-inference/inference-with-emcee), it should build and run a SLURM file like this one:
with rules like this:
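As a rough sketch of the detect-and-dispatch logic described above, the script could branch on the `method` key of the inference config when generating the batch file. Everything below is illustrative: the function name, SBATCH defaults, and run-line entry points are assumptions, not the actual inference_job.py interface.

```python
# Hypothetical sketch: build a SLURM batch script, switching the run line
# when the inference config requests emcee. The entry points in the srun
# lines are placeholders, not real gempyor/flepiMoP commands.

def build_slurm_script(config: dict, job_name: str = "flepimop-inference",
                       ntasks: int = 1, cpus_per_task: int = 2,
                       mem_gb: int = 4) -> str:
    """Return the text of a SLURM batch file for the given (parsed) config."""
    method = config.get("inference", {}).get("method", "classical")
    lines = [
        "#!/bin/bash",
        f"#SBATCH --job-name={job_name}",
        f"#SBATCH --ntasks={ntasks}",
        f"#SBATCH --cpus-per-task={cpus_per_task}",
        f"#SBATCH --mem={mem_gb}G",
    ]
    if method == "emcee":
        # emcee runs as one multi-core Python process (walkers parallelized
        # in-process), unlike the classical one-task-per-chain setup.
        lines.append("srun python -m gempyor_emcee_entrypoint  # placeholder")
    else:
        lines.append("srun flepimop_inference_entrypoint  # placeholder")
    return "\n".join(lines) + "\n"

emcee_script = build_slurm_script({"inference": {"method": "emcee"}})
classic_script = build_slurm_script({"inference": {"method": "classical"}})
```

The point of the sketch is only the branching: one code path reads the config's `method`, and the SBATCH header plus run line are assembled accordingly, so users never hand-write the batch file.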