Closed mikeboyle321 closed 3 years ago
A quick update:
I retried running everything this morning without the MPI options. Specifically, I replaced:
"scg_optimize -in_dir exp_01/ -gmx gmx_mpi -mpi 3 -out_dir angle_function_2/"
with
"scg_optimize -in_dir exp_01/ -gmx gmx -out_dir angle_function_2/"
It is running without issues now, and it looks like it is utilizing all of the available GPUs and CPUs. So maybe the error I encountered is related to the MPI argument parsing?
I hope that helps.
Cheers, Mike
P.S. This program is awesome!! Thank you for writing and publishing it! I can't wait to see the results of the first run.
The issue is on line 1440 of swarmCG.py.
The MPI branch incorrectly reads gmx_cmd as an attribute of the input-arguments namespace ns: gmx_cmd = f"mpirun -np {ns.mpi_tasks} {ns.gmx_cmd}"
Correcting it to gmx_cmd = f"mpirun -np {ns.mpi_tasks} {gmx_cmd}"
resolves the issue and allows MPI gmx calls.
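For illustration, here is a minimal sketch of the bug pattern (function and variable names are hypothetical, not the actual swarmCG code): the command string is held in a local variable, but the MPI branch tries to read it from the argparse Namespace, which raises the AttributeError seen below.

```python
from argparse import Namespace

def build_gmx_command(ns, gmx_cmd):
    """Prefix the GROMACS command with mpirun when MPI tasks are requested."""
    if getattr(ns, "mpi_tasks", 0):
        # Buggy form: f"mpirun -np {ns.mpi_tasks} {ns.gmx_cmd}"
        # ns has no attribute 'gmx_cmd', so that raises AttributeError.
        # Fixed form references the local variable instead:
        gmx_cmd = f"mpirun -np {ns.mpi_tasks} {gmx_cmd}"
    return gmx_cmd

ns = Namespace(mpi_tasks=3)
print(build_gmx_command(ns, "gmx_mpi"))  # mpirun -np 3 gmx_mpi
```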
I hope that helps.
Cheers, Mike Boyle
Hi Mike, thank you very much for your input. I am in the process of refactoring the code a bit, starting from the simulation execution. I am going to integrate this fix ASAP in the new implementation! Cheers
Hello,
I'm running into a pesky AttributeError: 'Namespace' object has no attribute 'gmx_cmd' when running scg_optimize on a polymer I am trying to map using your program.
It seems everything is running smoothly until the minimization step, but then an error is triggered when it tries to run the equilibration step of the first CG iteration. I attached my input and output files here. I didn't include the atomistic trajectory as the file is large, and I think this might be a bug on the cg-swarm side, given that the minimization runs fine and this is a Python-level error. If you need the trajectory to reproduce the error, let me know and I can drop a Google Drive link here.
I use Slurm to run the bash script 'cg-swarm-run' in the zip file below. If you look there first, the rest of the files should be arranged according to your example.
cg_swarm_testing(2).zip
Please let me know if I made a mistake anywhere!
Cheers, Mike Boyle