c-scale-community / workflow-coastal-hydrowaq

Porting and deploying the HiSea use case on C-SCALE
Apache License 2.0

Running a Slurm job on HPC #19

Closed: avgils closed this issue 2 years ago

avgils commented 2 years ago

@nikosT

I cannot submit a job with sbatch. I followed these steps:

echo "hisea"

I tried several different options for the sbatch command, but I get this error:

    sbatch: error: Batch job submission failed: Invalid account or account/partition combination specified

How can I solve this?

Another question: Do we have a project drive available to share jobs and data between team members?

Anna

ntellgrnet commented 2 years ago

That is an extremely minimal job script. Have a look at https://doc.aris.grnet.gr/scripttemplate/ to see how a job script should look. In your case you should use --account=hisea and --partition=el7taskp. The sbatch syntax is --KEY=VALUE, for example --ntasks=1.
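As a minimal sketch, a job script along the lines of the template could look like this, using the hisea account and el7taskp partition mentioned above (the job name, task count and time limit below are placeholders; see the script template for the full set of directives):

    #!/bin/bash
    #SBATCH --job-name=hisea_test      # arbitrary job name (placeholder)
    #SBATCH --account=hisea            # project account, as given above
    #SBATCH --partition=el7taskp      # partition, as given above
    #SBATCH --ntasks=1                 # number of tasks (placeholder)
    #SBATCH --time=00:10:00            # wall-clock limit (placeholder)

    echo "hisea"

Submit it with sbatch job.sh; the same --KEY=VALUE options can also be passed on the command line, e.g. sbatch --account=hisea --partition=el7taskp job.sh.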

No, there is no project drive. What is the reason for separate accounts when you share jobs and data? Probably the best solution is to allow multiple public keys for one username.

avgils commented 2 years ago

Thanks, I had looked at this documentation: https://doc.aris.grnet.gr/run/job_submission/. Would it be an idea to refer to the script template from there?

I submitted a job successfully :)

In other environments I am used to personal accounts that can work on multiple projects: you log in with one account but can work on several projects with it. A shared drive is usually available to store project data and is accessible to every project member.

I could share my key with Lornic, since I made a key specifically for this environment. Can we both log in at the same time? Should we store our data in the home folder?

ntellgrnet commented 2 years ago

Regarding the key, you could simply add Lornic's public key to authorized_keys, and both of you can connect to the same ARIS user; you can both be logged in at the same time with multiple connections. Data location: it depends on the number and size of the files. If the files are small, yes, keep them in the home folder. If they are large, use the tree under $WORKDIR: run echo $WORKDIR to see the location.
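As a rough sketch of those two steps, assuming the usual OpenSSH layout and that the second public key has already been copied to the cluster (the key filename and shared directory name are placeholders):

    # Append the second public key so both of you can log in as the same user
    cat lornic_key.pub >> ~/.ssh/authorized_keys
    chmod 600 ~/.ssh/authorized_keys

    # Check where the large-data area is and create a shared directory there
    echo $WORKDIR
    mkdir -p $WORKDIR/shared_data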