malramsay64 / experi

An interface for managing computational experiments with many independent variables.
MIT License

Problems running job with multiple commands (on SLURM?) #65

Open TomNicholas opened 5 years ago

TomNicholas commented 5 years ago

I have managed to get jobs to submit on SLURM, but they failed once they started, with the very cryptic error "mkdir has no option 's'".

My .yml file looks like

jobs:
  - command:
    - mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_{viscosity}_vortloss_{vorticity_loss}
    - cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_{viscosity}_vortloss_{vorticity_loss}

variables:
    product:
        viscosity:
            - 1e1
            - 3.7e1
            - 1e0
            - 3.7e0
        vorticity_loss:
            - 1e0
            - 3.7e0
            - 1e1
            - 3.7e1

slurm:
    job-name: core_phi_visc_scan
    nodes: 1
    tasks-per-node: 48
    walltime: 8:00:00
    p: skl_fua_prod
    A: FUA32_SOL_BOUT
    mail-user: thomas.nicholas@york.ac.uk
    mail-type: END,FAIL
    error: /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/slurm-%A_%a.err
    output: /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue
    setup:
    - export OMP_NUM_THREADS=1
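
For context, my understanding is that the product: key expands the command once for every combination of the two variables, so 4 viscosities x 4 vorticity losses should give 16 jobs. A rough bash sketch of the expansion I expect (not experi's actual code), using the same values and path as above:

# Rough sketch only -- not experi's implementation.
RUN_DIR=/marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue
viscosities=(1e1 3.7e1 1e0 3.7e0)
vorticity_losses=(1e0 3.7e0 1e1 3.7e1)

COMMAND=()
for visc in "${viscosities[@]}"; do
    for loss in "${vorticity_losses[@]}"; do
        # One chained command per combination, in the same order as the generated script below.
        COMMAND+=( "mkdir ${RUN_DIR}/visc_${visc}_vortloss_${loss} && cd ${RUN_DIR}/visc_${visc}_vortloss_${loss}" )
    done
done

echo "${#COMMAND[@]}"   # prints 16, matching --array 0-15 below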

With my altered version of the code (supplied in this PR), the .yml file above produces the batch script file experi_00.slurm, which contains:

#!/bin/bash
#SBATCH --job-name core_phi_visc_scan
#SBATCH --time 8:00:00
#SBATCH --output /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/slurm-%A_%a.out
#SBATCH --nodes 1
#SBATCH --tasks-per-node 48
#SBATCH -p skl_fua_prod
#SBATCH -A FUA32_SOL_BOUT
#SBATCH --mail-user thomas.nicholas@york.ac.uk
#SBATCH --mail-type END,FAIL
#SBATCH --error /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/slurm-%A_%a.err
#SBATCH --array 0-15

cd "$SLURM_SUBMIT_DIR"
export OMP_NUM_THREADS=1

COMMAND=( \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_1e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_1e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_3.7e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_3.7e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_1e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_1e1" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_3.7e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e1_vortloss_3.7e1" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_1e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_1e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_3.7e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_3.7e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_1e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_1e1" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_3.7e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e1_vortloss_3.7e1" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_1e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_1e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_3.7e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_3.7e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_1e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_1e1" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_3.7e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_1e0_vortloss_3.7e1" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_1e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_1e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_3.7e0 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_3.7e0" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_1e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_1e1" \
"mkdir /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_3.7e1 && cd /marconi_work/FUA32_SOL_BOUT/tnichola/runs/core_phi_issue/visc_3.7e0_vortloss_3.7e1" \
)

${COMMAND[$SLURM_ARRAY_TASK_ID]}

I can't see anything wrong with this, but it's still failing. I don't know a lot about bash scripting (I always try to use Python as much as possible instead!), but I think it has something to do with creating a bash array whose elements are multiple chained commands. For example, entering the following at the command prompt:

COMMAND=( "echo 0 && echo 1" "echo 2 && echo 3"  )

${COMMAND[1]}

This example is in the format experi produces, but prints 2 && echo 3, which is clearly not the desired output.
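
I think (though I'm not completely sure) the reason is that the && comes from a variable expansion, so bash never treats it as an operator; the string is only word-split, and everything becomes arguments to a single echo. Roughly, what actually runs is:

echo 2 '&&' echo 3    # prints: 2 && echo 3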

malramsay64 commented 5 years ago

Thank you for a thoroughly detailed issue. In short, each individual command needs to be run using bash -c. So for your example:

❯ COMMAND=( "echo 0 && echo 1" "echo 2 && echo 3" )
❯ bash -c "${COMMAND[1]}"
2
3
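
Note the quotes around ${COMMAND[1]}: they keep the whole chained command as a single string, which bash -c then re-parses, so the && is treated as an operator again rather than as a literal argument to echo.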

This tells me I really need to add tests for the generated scheduler files, since I solved this same problem for the shell 'scheduler'. I'll leave this open until I implement those tests.
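
Presumably the last line of the generated experi_00.slurm then needs to become something like this (a sketch only, not yet the committed fix):

bash -c "${COMMAND[$SLURM_ARRAY_TASK_ID]}"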