Open mkoiral opened 4 months ago
Please, use the -cp topol.top instead.
Thank you!
Hi,
We submitted the job assuming it would finish within 120 hours, but it appears it is not finishing in that time. We want to know whether there is any way to monitor the progress of the calculations, or whether any restart method is available.
Thank you!
Mahesh
Please, attach the gmx_MMPBSA.log file so we can check your config. Did you check the documentation?
Please see attached.
Please, use GitHub to attach the file, since attaching it via email is not working.
gmx_MMPBSA.log attached, please see this!
Well, the problem is that you are running gmx_MMPBSA in serial mode. To parallelize the calculations you must use mpirun. For example:
mpirun -np 120 gmx_MMPBSA -i input.in -cp topol.top -cs production_2.tpr -ci index.ndx -cg 18 17 -ct production_noPBC_2.xtc -rt traj_chainB.xtc -lt traj_chainA.xtc -o output_MMPBSA.dat -eo output_energy.dat
This means you will use 120 CPUs. Please, check the link below for an example of how to run it on an HPC.
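As an illustration only, a SLURM batch script wrapping the mpirun command above might look like the sketch below. The job name, wall time, and resource directives are assumptions, not from this thread; only the conda environment name and the gmx_MMPBSA command itself come from the discussion.

```shell
#!/bin/bash
#SBATCH --job-name=gmx_mmpbsa    # hypothetical job name
#SBATCH --ntasks=120             # one MPI rank per worker; should match -np
#SBATCH --time=120:00:00         # wall-time limit (adjust to your allocation)

# Activate the conda environment that provides gmx_MMPBSA
source ~/miniconda3/etc/profile.d/conda.sh
conda activate gmxMMPBSA

# Launch gmx_MMPBSA in parallel across 120 CPUs
mpirun -np 120 gmx_MMPBSA -i input.in -cp topol.top -cs production_2.tpr \
    -ci index.ndx -cg 18 17 -ct production_noPBC_2.xtc \
    -rt traj_chainB.xtc -lt traj_chainA.xtc \
    -o output_MMPBSA.dat -eo output_energy.dat
```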
Thanks, I'll check it out.
It says gmx_mpi cannot be used for gmx_MMPBSA with MPI. Can you suggest some changes to my job script then?
#!/bin/bash
source /home/mahesh.koirala/miniconda3/etc/profile.d/conda.sh
conda activate gmxMMPBSA
module load openmpi gromacs/2024.1
gmx_MMPBSA -i input.in \
-cp topol.top \
-cs production_2.tpr \
-ci index.ndx \
-cg 18 17 \
-ct production_noPBC_2.xtc \
-rt traj_chainB.xtc \
-lt traj_chainA.xtc \
-o output_MMPBSA.dat \
-eo output_energy.dat
This error is related to known issues with the combination of gmx_mpi and gmx_MMPBSA. You can probably use a gromacs version installed in your conda environment instead:
conda install -c conda-forge "gromacs<=2023.4" pocl -y -q
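Putting the two suggestions together (mpirun for parallelism, plus the conda-installed gromacs so gmx_mpi is not needed), the job script above might become something like the following sketch. The environment name, file names, and -np value are taken from this thread; the module choice is an assumption and should be checked against the cluster's configuration.

```shell
#!/bin/bash
source /home/mahesh.koirala/miniconda3/etc/profile.d/conda.sh
conda activate gmxMMPBSA

# Use the gromacs inside the conda env rather than the cluster's gmx_mpi.
# Run once beforehand: conda install -c conda-forge "gromacs<=2023.4" pocl -y -q
module load openmpi              # MPI runtime only; no gromacs module

mpirun -np 120 gmx_MMPBSA -i input.in \
    -cp topol.top \
    -cs production_2.tpr \
    -ci index.ndx \
    -cg 18 17 \
    -ct production_noPBC_2.xtc \
    -rt traj_chainB.xtc \
    -lt traj_chainA.xtc \
    -o output_MMPBSA.dat \
    -eo output_energy.dat
```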
But can I still use the trajectory generated by gmx_mpi?
Yes. gmx_mpi only modifies how the calculations are done, parallelizing them with MPI, not the result. Here, gmx_MMPBSA only uses gromacs to generate indexes and PDB files and to clean up the trajectory. Make sure the gromacs version you use is greater than or equal to the one used to generate the trajectory.
OK, thank you. One last question: can we run this calculation on a GPU on the HPC?
Unfortunately, no. We tried to implement a GPU-based calculation, but the efficiency was terrible. You can check our discussion here. We are exploring and implementing new methods, including PBDelphi from your university. At the moment we don't have any financial support, so that work is frozen for now.
No problem, I understand.
Mahesh
Hi,
I have the output results for some of the jobs, but I am having a hard time plotting the data to show the total energy change due to the different components. gmx_MMPBSA_ana is not working on my machine. Do you have an idea, or default code, to plot the results in a simple way?
Mahesh
The best way is to use gmx_MMPBSA_ana. Trying to plot it yourself can be complicated.
I understand, but does this require a GUI installed on the HPC?
Thank you
Since HPCs are not designed to work with GUIs, it is not recommended to try to install it there. You can copy the working folder to your Linux PC and open the results using gmx_MMPBSA_ana.
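For completeness, copying the working folder from the cluster can be done with rsync or scp. The host name and paths below are placeholders, not from this thread:

```shell
# Copy the gmx_MMPBSA working directory from the HPC to the local machine.
# Replace user, hpc.example.edu, and both paths with your own values.
rsync -av user@hpc.example.edu:/scratch/user/mmpbsa_run/ ~/mmpbsa_run/

# Then open the results on the local Linux PC where gmx_MMPBSA is installed;
# check `gmx_MMPBSA_ana --help` for the exact flags your version accepts.
```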
Thank you. I will try it on Monday and let you know.
Mahesh
Hi,
I installed gmx_MMPBSA on my personal Linux machine, but gmx_MMPBSA_ana is still not working. Can you look into it? I spent some time installing the correct versions of the different libraries, but it is still not working.
(gmxMMPBSA) @.***:~/gmx_MMPBSA$ gmx_MMPBSA_ana --help
Traceback (most recent call last):
File "/home/koirala123/miniconda3/envs/gmxMMPBSA/bin/gmx_MMPBSA_ana",
line 33, in
Thank you
Mahesh
Please, share the conda list output.
Here it is:
_libgcc_mutex 0.1 conda_forge conda-forge _openmp_mutex 4.5 2_gnu conda-forge alsa-lib 1.2.12 h4ab18f5_0 conda-forge ambertools 23.3 py310h05519df_6 conda-forge amberutils 21.0 pypi_0 pypi arpack 3.8.0 nompi_h0baa96a_101 conda-forge attr 2.5.1 h166bdaf_1 conda-forge blas 1.0 openblas blosc 1.21.6 hef167b5_0 conda-forge bottleneck 1.3.7 py310ha9d4c09_0 brotli 1.1.0 hd590300_1 conda-forge brotli-bin 1.1.0 hd590300_1 conda-forge bzip2 1.0.8 h5eee18b_6 c-ares 1.32.3 h4bc722e_0 conda-forge ca-certificates 2024.7.4 hbcca054_0 conda-forge cairo 1.18.0 h3faef2a_0 conda-forge certifi 2024.7.4 py310h06a4308_0 contourpy 1.2.1 py310hd41b1e2_0 conda-forge cycler 0.12.1 pyhd8ed1ab_0 conda-forge cython 3.0.10 py310h5eee18b_0 dbus 1.13.18 hb2f20db_0 edgembar 0.2 pypi_0 pypi expat 2.6.2 h6a678d5_0 fftw 3.3.10 nompi_hf1063bd_110 conda-forge font-ttf-dejavu-sans-mono 2.37 hd3eb1b0_0 font-ttf-inconsolata 2.001 hcb22688_0 font-ttf-source-code-pro 2.030 hd3eb1b0_0 font-ttf-ubuntu 0.83 h8b1ccd4_0 fontconfig 2.14.2 h14ed4e7_0 conda-forge fonts-anaconda 1 h8fa9717_0 fonts-conda-ecosystem 1 hd3eb1b0_0 fonttools 4.53.1 py310h5b4e0ec_0 conda-forge freetype 2.12.1 h267a509_2 conda-forge gettext 0.22.5 h59595ed_2 conda-forge gettext-tools 0.22.5 h59595ed_2 conda-forge giflib 5.2.2 hd590300_0 conda-forge glib 2.80.3 h8a4344b_1 conda-forge glib-tools 2.80.3 h73ef956_1 conda-forge gmx-mmpbsa 0+untagged.2143.g27929e0 pypi_0 pypi graphite2 1.3.14 h295c915_1 gst-plugins-base 1.24.4 h9ad1361_0 conda-forge gstreamer 1.24.4 haf2f30d_0 conda-forge harfbuzz 8.5.0 hfac3d4d_0 conda-forge hdf4 4.2.15 h2a13503_7 conda-forge hdf5 1.14.3 nompi_hdf9ad27_105 conda-forge icu 73.2 h59595ed_0 conda-forge joblib 1.4.2 pyhd8ed1ab_0 conda-forge keyutils 1.6.1 h166bdaf_0 conda-forge kiwisolver 1.4.5 py310hd41b1e2_1 conda-forge krb5 1.21.3 h659f571_0 conda-forge lame 3.100 h7b6447c_0 lcms2 2.16 hb7c19ff_0 conda-forge ld_impl_linux-64 2.38 h1181459_1 lerc 4.0.0 h27087fc_0 conda-forge libaec 1.1.3 h59595ed_0 conda-forge 
libasprintf 0.22.5 h661eb56_2 conda-forge libasprintf-devel 0.22.5 h661eb56_2 conda-forge libblas 3.9.0 23_linux64_openblas conda-forge libboost 1.82.0 h6fcfa73_6 conda-forge libbrotlicommon 1.1.0 hd590300_1 conda-forge libbrotlidec 1.1.0 hd590300_1 conda-forge libbrotlienc 1.1.0 hd590300_1 conda-forge libcap 2.69 h0f662aa_0 conda-forge libcblas 3.9.0 23_linux64_openblas conda-forge libclang-cpp15 15.0.7 default_h127d8a8_5 conda-forge libclang13 18.1.8 default_h9def88c_1 conda-forge libcups 2.3.3 h4637d8d_4 conda-forge libcurl 8.9.1 hdb1bdb2_0 conda-forge libdeflate 1.21 h4bc722e_0 conda-forge libedit 3.1.20191231 he28a2e2_2 conda-forge libev 4.33 hd590300_2 conda-forge libevent 2.1.12 hdbd6064_1 libexpat 2.6.2 h59595ed_0 conda-forge libffi 3.4.4 h6a678d5_1 libflac 1.4.3 h59595ed_0 conda-forge libgcc-ng 14.1.0 h77fa898_0 conda-forge libgcrypt 1.11.0 h4ab18f5_1 conda-forge libgettextpo 0.22.5 h59595ed_2 conda-forge libgettextpo-devel 0.22.5 h59595ed_2 conda-forge libgfortran-ng 14.1.0 h69a702a_0 conda-forge libgfortran5 14.1.0 hc5f4f2c_0 conda-forge libglib 2.80.3 h8a4344b_1 conda-forge libgomp 14.1.0 h77fa898_0 conda-forge libgpg-error 1.50 h4f305b6_0 conda-forge libiconv 1.17 hd590300_2 conda-forge libjpeg-turbo 3.0.0 hd590300_1 conda-forge liblapack 3.9.0 23_linux64_openblas conda-forge libllvm15 15.0.7 hb3ce162_4 conda-forge libllvm18 18.1.8 h8b73ec9_1 conda-forge libnetcdf 4.9.2 nompi_h135f659_114 conda-forge libnghttp2 1.58.0 h47da74e_1 conda-forge libnsl 2.0.1 hd590300_0 conda-forge libogg 1.3.5 h27cfd23_1 libopenblas 0.3.27 pthreads_hac2b453_1 conda-forge libopus 1.3.1 h7b6447c_0 libpng 1.6.43 h2797004_0 conda-forge libpq 16.3 ha72fbe1_0 conda-forge libsndfile 1.2.2 hc60ed4a_1 conda-forge libsqlite 3.45.2 h2797004_0 conda-forge libssh2 1.11.0 h0841786_0 conda-forge libstdcxx-ng 14.1.0 hc0a3c3a_0 conda-forge libsystemd0 255 h3516f8a_1 conda-forge libtiff 4.6.0 h46a8edc_4 conda-forge libuuid 2.38.1 h0b41bf4_0 conda-forge libvorbis 1.3.7 h7b6447c_0 libwebp 
1.4.0 h2c329e2_0 conda-forge libwebp-base 1.4.0 hd590300_0 conda-forge libxcb 1.15 h7f8727e_0 libxcrypt 4.4.36 hd590300_1 conda-forge libxkbcommon 1.7.0 h662e7e4_0 conda-forge libxml2 2.12.7 h4c95cb1_3 conda-forge libzip 1.10.1 h2629f0a_3 conda-forge libzlib 1.3.1 h4ab18f5_1 conda-forge lz4-c 1.9.4 hcb278e6_0 conda-forge matplotlib 3.5.2 pypi_0 pypi mmpbsa-py 16.0 pypi_0 pypi mpg123 1.32.6 h59595ed_0 conda-forge mpi 1.0 openmpi conda-forge mpi4py 3.1.5 py310h2a790f2_1 conda-forge munkres 1.1.4 pyh9f0ad1d_0 conda-forge mysql-common 8.3.0 h70512c7_5 conda-forge mysql-libs 8.3.0 ha479ceb_5 conda-forge ncurses 6.4 h6a678d5_0 netcdf-fortran 4.6.1 nompi_h228c76a_104 conda-forge nspr 4.35 h6a678d5_0 nss 3.98 h1d7d5a4_0 conda-forge numexpr 2.8.7 py310h286c3b5_0 numpy 1.26.4 py310hb13e2d6_0 conda-forge openjpeg 2.5.2 h488ebb8_0 conda-forge openmpi 4.1.6 hc5af2df_101 conda-forge openssl 3.3.1 h4bc722e_2 conda-forge packaging 24.1 pyhd8ed1ab_0 conda-forge packmol 20.15.0 hc8b2c43_0 conda-forge packmol-memgen 2023.2.24 pypi_0 pypi pandas 1.3.3 pypi_0 pypi parmed 4.2.2 py310hc6cd4ac_1 conda-forge pcre2 10.44 h0f59acf_0 conda-forge pdb4amber 22.0 pypi_0 pypi perl 5.32.1 7_hd590300_perl5 conda-forge pillow 10.3.0 py310hf73ecf8_0 conda-forge pip 24.2 pypi_0 pypi pixman 0.43.2 h59595ed_0 conda-forge ply 3.11 py310h06a4308_0 pthread-stubs 0.4 h36c2ea0_1001 conda-forge pulseaudio-client 17.0 hb77b528_0 conda-forge pymsmt 22.0 pypi_0 pypi pyparsing 3.1.2 pyhd8ed1ab_0 conda-forge pyqt5 5.15.11 pypi_0 pypi pyqt5-qt5 5.15.14 pypi_0 pypi pyqt5-sip 12.15.0 pypi_0 pypi pyqt6-qt6 6.7.2 pypi_0 pypi pyqt6-sip 13.8.0 pypi_0 pypi python 3.10.13 hd12c33a_1_cpython conda-forge python-dateutil 2.9.0 pyhd8ed1ab_0 conda-forge python-tzdata 2024.1 pyhd8ed1ab_0 conda-forge python_abi 3.10 4_cp310 conda-forge pytraj 2.0.6 pypi_0 pypi pytz 2024.1 pyhd8ed1ab_0 conda-forge qhull 2020.2 h434a139_5 conda-forge qt 5.15.8 hf11cfaa_0 conda-forge qt-main 5.15.8 hc9dc06e_21 conda-forge qt-webengine 5.15.8 
h3e791b3_6 conda-forge qtpy 2.4.1 py310h06a4308_0 readline 8.2 h5eee18b_0 sander 22.0 pypi_0 pypi scipy 1.14.0 py310h93e2701_1 conda-forge seaborn 0.11.2 pypi_0 pypi setuptools 72.1.0 py310h06a4308_0 sip 6.7.12 py310h6a678d5_0 six 1.16.0 pyh6c4a22f_0 conda-forge snappy 1.2.1 ha2e4443_0 conda-forge sqlite 3.45.2 h2c6b66d_0 conda-forge tk 8.6.13 noxft_h4845f30_101 conda-forge tomli 2.0.1 py310h06a4308_0 tqdm 4.66.5 pypi_0 pypi tzdata 2024a h04d1e81_0 unicodedata2 15.1.0 py310h2372a71_0 conda-forge wheel 0.44.0 pypi_0 pypi xcb-util 0.4.0 hd590300_1 conda-forge xcb-util-image 0.4.0 h8ee46fc_1 conda-forge xcb-util-keysyms 0.4.0 h8ee46fc_1 conda-forge xcb-util-renderutil 0.3.9 hd590300_1 conda-forge xcb-util-wm 0.4.1 h8ee46fc_1 conda-forge xkeyboard-config 2.42 h4ab18f5_0 conda-forge xorg-compositeproto 0.4.2 h7f98852_1001 conda-forge xorg-damageproto 1.2.1 h7f98852_1002 conda-forge xorg-fixesproto 5.0 h7f98852_1002 conda-forge xorg-inputproto 2.3.2 h7f98852_1002 conda-forge xorg-kbproto 1.0.7 h7f98852_1002 conda-forge xorg-libice 1.1.1 hd590300_0 conda-forge xorg-libsm 1.2.4 h7391055_0 conda-forge xorg-libx11 1.8.9 h8ee46fc_0 conda-forge xorg-libxau 1.0.11 hd590300_0 conda-forge xorg-libxcomposite 0.4.6 h0b41bf4_1 conda-forge xorg-libxdamage 1.1.5 h7f98852_1 conda-forge xorg-libxdmcp 1.1.3 h7f98852_0 conda-forge xorg-libxext 1.3.4 h0b41bf4_2 conda-forge xorg-libxfixes 5.0.3 h7f98852_1004 conda-forge xorg-libxi 1.7.10 h4bc722e_1 conda-forge xorg-libxrandr 1.5.2 h7f98852_1 conda-forge xorg-libxrender 0.9.11 hd590300_0 conda-forge xorg-libxt 1.3.0 hd590300_1 conda-forge xorg-libxtst 1.2.5 h4bc722e_0 conda-forge xorg-randrproto 1.5.0 h7f98852_1001 conda-forge xorg-recordproto 1.14.2 h7f98852_1002 conda-forge xorg-renderproto 0.11.1 h7f98852_1002 conda-forge xorg-util-macros 1.19.0 h27cfd23_2 xorg-xextproto 7.3.0 h0b41bf4_1003 conda-forge xorg-xf86vidmodeproto 2.3.1 h7f98852_1002 conda-forge xorg-xproto 7.0.31 h7f98852_1007 conda-forge xz 5.4.6 h5eee18b_1 zlib 1.3.1 
h4ab18f5_1 conda-forge zstd 1.5.6 ha6fb4c9_0 conda-forge
Thank you
Mahesh
Hi,
The gmx_MMPBSA_ana issue is still there, and the job is very slow because it runs on a single thread. I tried to use MPI on our HPC, but I was told to use srun instead. I got the following message:
"The application appears to have been direct launched using "srun",
but OMPI was not built with SLURM's PMI support and therefore cannot
execute. There are several options for building PMI support under
SLURM, depending upon the SLURM version you are using:
version 16.05 or later: you can use SLURM's PMIx support. This
requires that you configure and build SLURM --with-pmix.
Versions earlier than 16.05: you must use either SLURM's PMI-1 or
PMI-2 support. SLURM builds PMI-1 by default, or you can manually
install PMI-2. You must then build Open MPI using --with-pmi pointing
to the SLURM PMI library location"
The installed SLURM version is 23.11.6.
Please suggest whether I need to change anything in the source or the executable, or whether everything that needs changing is on the HPC side only.
Thank you
Mahesh
On Wed, Aug 7, 2024 at 6:55 AM Mahesh Koirala @.***> wrote:
Here it is:
Name Version Build Channel
_libgcc_mutex 0.1 conda_forge conda-forge _openmp_mutex 4.5 2_gnu conda-forge alsa-lib 1.2.12 h4ab18f5_0 conda-forge ambertools 23.3 py310h05519df_6 conda-forge amberutils 21.0 pypi_0 pypi arpack 3.8.0 nompi_h0baa96a_101 conda-forge attr 2.5.1 h166bdaf_1 conda-forge blas 1.0 openblas blosc 1.21.6 hef167b5_0 conda-forge bottleneck 1.3.7 py310ha9d4c09_0 brotli 1.1.0 hd590300_1 conda-forge brotli-bin 1.1.0 hd590300_1 conda-forge bzip2 1.0.8 h5eee18b_6 c-ares 1.32.3 h4bc722e_0 conda-forge ca-certificates 2024.7.4 hbcca054_0 conda-forge cairo 1.18.0 h3faef2a_0 conda-forge certifi 2024.7.4 py310h06a4308_0 contourpy 1.2.1 py310hd41b1e2_0 conda-forge cycler 0.12.1 pyhd8ed1ab_0 conda-forge cython 3.0.10 py310h5eee18b_0 dbus 1.13.18 hb2f20db_0 edgembar 0.2 pypi_0 pypi expat 2.6.2 h6a678d5_0 fftw 3.3.10 nompi_hf1063bd_110 conda-forge font-ttf-dejavu-sans-mono 2.37 hd3eb1b0_0 font-ttf-inconsolata 2.001 hcb22688_0 font-ttf-source-code-pro 2.030 hd3eb1b0_0 font-ttf-ubuntu 0.83 h8b1ccd4_0 fontconfig 2.14.2 h14ed4e7_0 conda-forge fonts-anaconda 1 h8fa9717_0 fonts-conda-ecosystem 1 hd3eb1b0_0 fonttools 4.53.1 py310h5b4e0ec_0 conda-forge freetype 2.12.1 h267a509_2 conda-forge gettext 0.22.5 h59595ed_2 conda-forge gettext-tools 0.22.5 h59595ed_2 conda-forge giflib 5.2.2 hd590300_0 conda-forge glib 2.80.3 h8a4344b_1 conda-forge glib-tools 2.80.3 h73ef956_1 conda-forge gmx-mmpbsa 0+untagged.2143.g27929e0 pypi_0 pypi graphite2 1.3.14 h295c915_1 gst-plugins-base 1.24.4 h9ad1361_0 conda-forge gstreamer 1.24.4 haf2f30d_0 conda-forge harfbuzz 8.5.0 hfac3d4d_0 conda-forge hdf4 4.2.15 h2a13503_7 conda-forge hdf5 1.14.3 nompi_hdf9ad27_105 conda-forge icu 73.2 h59595ed_0 conda-forge joblib 1.4.2 pyhd8ed1ab_0 conda-forge keyutils 1.6.1 h166bdaf_0 conda-forge kiwisolver 1.4.5 py310hd41b1e2_1 conda-forge krb5 1.21.3 h659f571_0 conda-forge lame 3.100 h7b6447c_0 lcms2 2.16 hb7c19ff_0 conda-forge ld_impl_linux-64 2.38 h1181459_1 lerc 4.0.0 h27087fc_0 conda-forge libaec 1.1.3 h59595ed_0 conda-forge 
libasprintf 0.22.5 h661eb56_2 conda-forge libasprintf-devel 0.22.5 h661eb56_2 conda-forge libblas 3.9.0 23_linux64_openblas conda-forge libboost 1.82.0 h6fcfa73_6 conda-forge libbrotlicommon 1.1.0 hd590300_1 conda-forge libbrotlidec 1.1.0 hd590300_1 conda-forge libbrotlienc 1.1.0 hd590300_1 conda-forge libcap 2.69 h0f662aa_0 conda-forge libcblas 3.9.0 23_linux64_openblas conda-forge libclang-cpp15 15.0.7 default_h127d8a8_5 conda-forge libclang13 18.1.8 default_h9def88c_1 conda-forge libcups 2.3.3 h4637d8d_4 conda-forge libcurl 8.9.1 hdb1bdb2_0 conda-forge libdeflate 1.21 h4bc722e_0 conda-forge libedit 3.1.20191231 he28a2e2_2 conda-forge libev 4.33 hd590300_2 conda-forge libevent 2.1.12 hdbd6064_1 libexpat 2.6.2 h59595ed_0 conda-forge libffi 3.4.4 h6a678d5_1 libflac 1.4.3 h59595ed_0 conda-forge libgcc-ng 14.1.0 h77fa898_0 conda-forge libgcrypt 1.11.0 h4ab18f5_1 conda-forge libgettextpo 0.22.5 h59595ed_2 conda-forge libgettextpo-devel 0.22.5 h59595ed_2 conda-forge libgfortran-ng 14.1.0 h69a702a_0 conda-forge libgfortran5 14.1.0 hc5f4f2c_0 conda-forge libglib 2.80.3 h8a4344b_1 conda-forge libgomp 14.1.0 h77fa898_0 conda-forge libgpg-error 1.50 h4f305b6_0 conda-forge libiconv 1.17 hd590300_2 conda-forge libjpeg-turbo 3.0.0 hd590300_1 conda-forge liblapack 3.9.0 23_linux64_openblas conda-forge libllvm15 15.0.7 hb3ce162_4 conda-forge libllvm18 18.1.8 h8b73ec9_1 conda-forge libnetcdf 4.9.2 nompi_h135f659_114 conda-forge libnghttp2 1.58.0 h47da74e_1 conda-forge libnsl 2.0.1 hd590300_0 conda-forge libogg 1.3.5 h27cfd23_1 libopenblas 0.3.27 pthreads_hac2b453_1 conda-forge libopus 1.3.1 h7b6447c_0 libpng 1.6.43 h2797004_0 conda-forge libpq 16.3 ha72fbe1_0 conda-forge libsndfile 1.2.2 hc60ed4a_1 conda-forge libsqlite 3.45.2 h2797004_0 conda-forge libssh2 1.11.0 h0841786_0 conda-forge libstdcxx-ng 14.1.0 hc0a3c3a_0 conda-forge libsystemd0 255 h3516f8a_1 conda-forge libtiff 4.6.0 h46a8edc_4 conda-forge libuuid 2.38.1 h0b41bf4_0 conda-forge libvorbis 1.3.7 h7b6447c_0 libwebp 
Partial conda list output (continued from above; the first entry's name was cut off):
1.4.0 h2c329e2_0 conda-forge
libwebp-base 1.4.0 hd590300_0 conda-forge
libxcb 1.15 h7f8727e_0
libxcrypt 4.4.36 hd590300_1 conda-forge
libxkbcommon 1.7.0 h662e7e4_0 conda-forge
libxml2 2.12.7 h4c95cb1_3 conda-forge
libzip 1.10.1 h2629f0a_3 conda-forge
libzlib 1.3.1 h4ab18f5_1 conda-forge
lz4-c 1.9.4 hcb278e6_0 conda-forge
matplotlib 3.5.2 pypi_0 pypi
mmpbsa-py 16.0 pypi_0 pypi
mpg123 1.32.6 h59595ed_0 conda-forge
mpi 1.0 openmpi conda-forge
mpi4py 3.1.5 py310h2a790f2_1 conda-forge
munkres 1.1.4 pyh9f0ad1d_0 conda-forge
mysql-common 8.3.0 h70512c7_5 conda-forge
mysql-libs 8.3.0 ha479ceb_5 conda-forge
ncurses 6.4 h6a678d5_0
netcdf-fortran 4.6.1 nompi_h228c76a_104 conda-forge
nspr 4.35 h6a678d5_0
nss 3.98 h1d7d5a4_0 conda-forge
numexpr 2.8.7 py310h286c3b5_0
numpy 1.26.4 py310hb13e2d6_0 conda-forge
openjpeg 2.5.2 h488ebb8_0 conda-forge
openmpi 4.1.6 hc5af2df_101 conda-forge
openssl 3.3.1 h4bc722e_2 conda-forge
packaging 24.1 pyhd8ed1ab_0 conda-forge
packmol 20.15.0 hc8b2c43_0 conda-forge
packmol-memgen 2023.2.24 pypi_0 pypi
pandas 1.3.3 pypi_0 pypi
parmed 4.2.2 py310hc6cd4ac_1 conda-forge
pcre2 10.44 h0f59acf_0 conda-forge
pdb4amber 22.0 pypi_0 pypi
perl 5.32.1 7_hd590300_perl5 conda-forge
pillow 10.3.0 py310hf73ecf8_0 conda-forge
pip 24.2 pypi_0 pypi
pixman 0.43.2 h59595ed_0 conda-forge
ply 3.11 py310h06a4308_0
pthread-stubs 0.4 h36c2ea0_1001 conda-forge
pulseaudio-client 17.0 hb77b528_0 conda-forge
pymsmt 22.0 pypi_0 pypi
pyparsing 3.1.2 pyhd8ed1ab_0 conda-forge
pyqt5 5.15.11 pypi_0 pypi
pyqt5-qt5 5.15.14 pypi_0 pypi
pyqt5-sip 12.15.0 pypi_0 pypi
pyqt6-qt6 6.7.2 pypi_0 pypi
pyqt6-sip 13.8.0 pypi_0 pypi
python 3.10.13 hd12c33a_1_cpython conda-forge
python-dateutil 2.9.0 pyhd8ed1ab_0 conda-forge
python-tzdata 2024.1 pyhd8ed1ab_0 conda-forge
python_abi 3.10 4_cp310 conda-forge
pytraj 2.0.6 pypi_0 pypi
pytz 2024.1 pyhd8ed1ab_0 conda-forge
qhull 2020.2 h434a139_5 conda-forge
qt 5.15.8 hf11cfaa_0 conda-forge
qt-main 5.15.8 hc9dc06e_21 conda-forge
qt-webengine 5.15.8 h3e791b3_6 conda-forge
qtpy 2.4.1 py310h06a4308_0
readline 8.2 h5eee18b_0
sander 22.0 pypi_0 pypi
scipy 1.14.0 py310h93e2701_1 conda-forge
seaborn 0.11.2 pypi_0 pypi
setuptools 72.1.0 py310h06a4308_0
sip 6.7.12 py310h6a678d5_0
six 1.16.0 pyh6c4a22f_0 conda-forge
snappy 1.2.1 ha2e4443_0 conda-forge
sqlite 3.45.2 h2c6b66d_0 conda-forge
tk 8.6.13 noxft_h4845f30_101 conda-forge
tomli 2.0.1 py310h06a4308_0
tqdm 4.66.5 pypi_0 pypi
tzdata 2024a h04d1e81_0
unicodedata2 15.1.0 py310h2372a71_0 conda-forge
wheel 0.44.0 pypi_0 pypi
xcb-util 0.4.0 hd590300_1 conda-forge
xcb-util-image 0.4.0 h8ee46fc_1 conda-forge
xcb-util-keysyms 0.4.0 h8ee46fc_1 conda-forge
xcb-util-renderutil 0.3.9 hd590300_1 conda-forge
xcb-util-wm 0.4.1 h8ee46fc_1 conda-forge
xkeyboard-config 2.42 h4ab18f5_0 conda-forge
xorg-compositeproto 0.4.2 h7f98852_1001 conda-forge
xorg-damageproto 1.2.1 h7f98852_1002 conda-forge
xorg-fixesproto 5.0 h7f98852_1002 conda-forge
xorg-inputproto 2.3.2 h7f98852_1002 conda-forge
xorg-kbproto 1.0.7 h7f98852_1002 conda-forge
xorg-libice 1.1.1 hd590300_0 conda-forge
xorg-libsm 1.2.4 h7391055_0 conda-forge
xorg-libx11 1.8.9 h8ee46fc_0 conda-forge
xorg-libxau 1.0.11 hd590300_0 conda-forge
xorg-libxcomposite 0.4.6 h0b41bf4_1 conda-forge
xorg-libxdamage 1.1.5 h7f98852_1 conda-forge
xorg-libxdmcp 1.1.3 h7f98852_0 conda-forge
xorg-libxext 1.3.4 h0b41bf4_2 conda-forge
xorg-libxfixes 5.0.3 h7f98852_1004 conda-forge
xorg-libxi 1.7.10 h4bc722e_1 conda-forge
xorg-libxrandr 1.5.2 h7f98852_1 conda-forge
xorg-libxrender 0.9.11 hd590300_0 conda-forge
xorg-libxt 1.3.0 hd590300_1 conda-forge
xorg-libxtst 1.2.5 h4bc722e_0 conda-forge
xorg-randrproto 1.5.0 h7f98852_1001 conda-forge
xorg-recordproto 1.14.2 h7f98852_1002 conda-forge
xorg-renderproto 0.11.1 h7f98852_1002 conda-forge
xorg-util-macros 1.19.0 h27cfd23_2
xorg-xextproto 7.3.0 h0b41bf4_1003 conda-forge
xorg-xf86vidmodeproto 2.3.1 h7f98852_1002 conda-forge
xorg-xproto 7.0.31 h7f98852_1007 conda-forge
xz 5.4.6 h5eee18b_1
zlib 1.3.1 h4ab18f5_1 conda-forge
zstd 1.5.6 ha6fb4c9_0 conda-forge
Thank you
Mahesh
Mahesh Koirala, PhD Computational Biophysics Physics & Astronomy Clemson University, 9152622465 http://compbio.clemson.edu
On Tue, Aug 6, 2024 at 2:14 PM Mario Sergio Valdés Tresanco < @.***> wrote:
Please, share the conda list output
For gmx_MMPBSA_ana, try installing PyQt6 only; here you have installed both PyQt5 and PyQt6. If the error persists, try PyQt5 instead.
I have never used srun before, so I don't know its dependencies and compatibility. However, according to the SLURM forum (https://groups.google.com/g/slurm-users/c/TK7UHqjTxM8), this problem requires an admin, so it is not a gmx_MMPBSA problem. According to other forums, you can use mpirun under SLURM with no problem.
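As a quick way to confirm the Qt-binding conflict described above, here is a minimal sketch (the helper name is mine, not from this thread); run it with the same Python that launches gmx_MMPBSA_ana:

```python
import importlib.util

# Report which Qt bindings are importable.  gmx_MMPBSA_ana can misbehave when
# PyQt5 and PyQt6 are both installed in the same environment, as in the
# conda list shared earlier (pyqt5 5.15.11 and pyqt6-qt6 6.7.2 side by side).
def qt_bindings_present():
    return [name for name in ("PyQt5", "PyQt6")
            if importlib.util.find_spec(name) is not None]

bindings = qt_bindings_present()
if len(bindings) > 1:
    print("Both Qt bindings are installed; consider uninstalling one of:", bindings)
```

If it reports both bindings, uninstalling one of them (keeping PyQt6, or PyQt5 as the fallback) matches the advice above.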
Thanks, I'll check it out.
Hi,
Can we get the entropy from gmx_MMPBSA? I got a very unusual number from GROMACS, so I want to cross-check it.
Thank you
Mahesh
Hi,
To use mpirun on my HPC, I had to reinstall gmx_MMPBSA using AmberTools. I did, and it says gmx_MMPBSA-1.6.1 was installed successfully, but I am having difficulty executing or testing it. None of the following commands work:
- amber.python -m pip show gmx_MMPBSA
- gmx_MMPBSA --help
- amber.python -m pip show gmx_MMPBSA-1.6.1
In my directory, GMXMMPBSA and bin were generated.
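A hedged sanity-check sketch (the script and the "GMXMMPBSA" module-name guess are mine, not confirmed in this thread). Running it with AmberTools' interpreter, e.g. `amber.python check_install.py`, shows whether the package is importable from that Python; if it is, but the `gmx_MMPBSA` command is still not found in the shell, the install's bin/ directory is likely missing from PATH:

```python
import importlib.util

# Return the file that a package would be loaded from, or None if the
# package is not importable from this interpreter.  (For namespace
# packages, spec.origin can itself be None; good enough for a quick check.)
def locate_package(name):
    spec = importlib.util.find_spec(name)
    return spec.origin if spec is not None else None

# The gmx_MMPBSA Python package is assumed here to be named "GMXMMPBSA",
# matching the GMXMMPBSA directory mentioned above.
print(locate_package("GMXMMPBSA") or "GMXMMPBSA is not importable from this Python")
```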
Thank you
Mahesh Koirala, PhD Computational Biophysics Physics & Astronomy Clemson University, 9152622465 http://compbio.clemson.edu
On Sat, Aug 31, 2024 at 7:50 AM Mario Sergio Valdés Tresanco < @.***> wrote:
Yes. Please check this section https://valdes-tresanco-ms.github.io/gmx_MMPBSA/dev/input_file/#entropy-options for the Quasi-harmonic, Interaction, and C2 entropies, and this section https://valdes-tresanco-ms.github.io/gmx_MMPBSA/dev/input_file/#nmode-namelist-variables for Normal modes. Examples: here https://valdes-tresanco-ms.github.io/gmx_MMPBSA/dev/examples/Entropy_calculations/Interaction_Entropy/?h=entrop, here https://valdes-tresanco-ms.github.io/gmx_MMPBSA/dev/examples/Entropy_calculations/C2_Entropy/ and here https://valdes-tresanco-ms.github.io/gmx_MMPBSA/dev/examples/Entropy_calculations/nmode/
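For reference, a minimal input-file sketch enabling Interaction Entropy, assembled from the linked documentation (frame range, segment, and GB settings are illustrative values, not recommendations from this thread):

```
# Hypothetical mmpbsa.in sketch: MM/GBSA with Interaction Entropy
&general
  sys_name="IE_test",
  startframe=1, endframe=100,
  interaction_entropy=1, ie_segment=25,
  temperature=298.15,
/
&gb
  igb=5, saltcon=0.15,
/
```

The Normal-mode (nmode) and C2 entropy variants use their own namelist variables, described in the sections linked above.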
Bug summary
I was calculating MM/PBSA using my GROMACS results and ended up with an error.
Terminal output
gmx_MMPBSA.log
Operating system
Linux
gmx_MMPBSA Version
No response
Python version
No response
Installation
conda / AmberTools + conda