HopkinsIDD / flepiMoP

The Flexible Epidemic Modeling Pipeline
https://flepimop.org
GNU General Public License v3.0

[Feature request]: EMCEE integration with inference_job.py #365

Open jcblemai opened 3 weeks ago

jcblemai commented 3 weeks ago

Label

batch, inference, meta/workflow

Priority Label

high priority

Is your feature request related to a problem? Please describe.

One currently needs to write one's own batch script (see examples in Flu_USA) to run EMCEE.

Is your feature request related to a new application, scenario round, pathogen? Please describe.

No response

Describe the solution you'd like

We run jobs like this (see the "Submitting A Batch Inference Job To Slurm" heading in https://iddynamics.gitbook.io/flepimop/how-to-run/advanced-run-guides/running-on-a-hpc-with-slurm) using inference_job.py, which is very convenient. This script, which we could pull into gempyor so it has access to memory footprints and test runs, allows running local, Slurm, or AWS jobs. When this script detects method: emcee in the inference config (see https://iddynamics.gitbook.io/flepimop/model-inference/inference-with-emcee), it should build and run a Slurm file like this one:

#!/bin/bash
#SBATCH -N 1
#SBATCH -n 1
#SBATCH -p general
#SBATCH --mem=100g
#SBATCH -c 48
#SBATCH -t 00-20:00:00
flepimop-calibrate -c config_rsvnet_2024_1_emcee.yml --nwalkers 100 --jobs 48 --niterations 500 --nsamples 100 > out_fit_rsv_emcee_1.out 2>&1

with rules like this:
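As a rough sketch of what the integration could look like (all function names, config keys, and defaults below are illustrative assumptions, not actual flepiMoP or gempyor APIs), inference_job.py could parse the inference section of the config, check the method, and render an sbatch file from a template:

```python
# Sketch: detect `method: emcee` in a parsed inference config and build a
# Slurm batch script like the one above. The config layout, defaults, and
# function name are assumptions for illustration only.
from typing import Optional

SBATCH_TEMPLATE = """#!/bin/bash
#SBATCH -N 1
#SBATCH -n 1
#SBATCH -p general
#SBATCH --mem={mem}
#SBATCH -c {cores}
#SBATCH -t {walltime}
flepimop-calibrate -c {config} --nwalkers {nwalkers} --jobs {cores} \\
    --niterations {niterations} --nsamples {nsamples} > {outfile} 2>&1
"""

def build_emcee_sbatch(cfg: dict, config_path: str) -> Optional[str]:
    """Return an sbatch script if the config requests EMCEE, else None."""
    inference = cfg.get("inference", {})
    if inference.get("method") != "emcee":
        return None  # fall back to the existing classical-inference path
    return SBATCH_TEMPLATE.format(
        mem="100g",
        cores=48,
        walltime="00-20:00:00",
        config=config_path,
        nwalkers=inference.get("nwalkers", 100),
        niterations=inference.get("niterations", 500),
        nsamples=inference.get("nsamples", 100),
        outfile="out_fit.out",
    )
```

The resource numbers (100g, 48 cores, 20h) are copied from the example script above; in practice they would come from the memory-footprint and test-run logic that pulling the script into gempyor would give access to.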

saraloo commented 3 weeks ago

Including the capability to run subpop-specific configs would also be tied to this, but we can table that for now if it's outside the scope of this issue: https://github.com/HopkinsIDD/RSV_USA/blob/main/SLURM_emcee_job_small_per_subpop.batch
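For the subpop-specific case, a hedged sketch of what the generator might do is to emit one sbatch file per subpopulation, as in the linked RSV_USA batch script. Note that the `--subpop` flag and file layout below are hypothetical, used only to illustrate the loop:

```python
# Sketch: write one Slurm script per subpopulation so each subpop gets its
# own EMCEE calibration job. `--subpop` is a hypothetical flag; resource
# lines mirror the example script in this issue.
from pathlib import Path

def write_per_subpop_scripts(subpops, config, outdir="batch"):
    """Write one sbatch file per subpop; return the paths written."""
    Path(outdir).mkdir(parents=True, exist_ok=True)
    paths = []
    for subpop in subpops:
        script = (
            "#!/bin/bash\n"
            "#SBATCH -N 1\n"
            "#SBATCH -c 48\n"
            "#SBATCH -t 00-20:00:00\n"
            f"flepimop-calibrate -c {config} --subpop {subpop} "
            f"--nwalkers 100 --jobs 48 > out_{subpop}.out 2>&1\n"
        )
        path = Path(outdir) / f"emcee_{subpop}.batch"
        path.write_text(script)
        paths.append(str(path))
    return paths
```

Each generated file could then be submitted with `sbatch`, or the loop could submit directly via subprocess.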