Hello Nina, this seems to be closely related to the specifics of your system. My suggestion is to look at the initial configurations of the jobs that crash. If the restraint force is too high at the beginning for certain windows, you may want to change the starting coordinates, or gradually turn on the harmonic restraints with targetForceConstant.
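For example, a harmonic block along these lines could be used at the start of each window (a minimal sketch: targetForceConstant is the keyword mentioned above, while the initial forceConstant, the number of ramp-up steps, and the companion keyword targetNumSteps are placeholders to be adapted to your setup):

harmonic {
  name spring
  colvars rc
  centers ctre                # this window's restraint center (placeholder, as in your template)
  forceConstant 2.5           # weak initial restraint (placeholder value)
  targetForceConstant 25.0    # final force constant used for production sampling
  targetNumSteps 50000        # steps over which the force constant is increased (placeholder)
}

With such a block the restraint is switched on gradually instead of kicking in at full strength on the first step, which can help windows whose starting coordinates are far from the window center.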
As for the lack of correspondence between the PMFs, this could be due to insufficient sampling or to other differences in the protocol. You should compare your protocol with that of the person who did the previous study.
Hello Giacomo,
Thank you for your prompt reply. I appreciate your suggestions, but I'm fairly certain that the problem is not with the system I'm trying to model. Perhaps by "system" you mean the computing cluster?
As far as I understand the protocol, I submitted the same set of calculations as the person who gave me the reference data...
Moreover, I would think that if exactly the same job crashes because I chose a wrong potential, the starting geometry is bad, etc., it would crash every time it is submitted... but that is not the case: calculations that had crashed completed upon resubmission, with the only modification being the "dump custom" command added to obtain the trajectories (all other parameters were the same).
I'll keep looking for a possible reason for this odd behavior.
Again thank you for your help,
Nina
Hi Nina, again, if you are trying to reproduce results and you are in contact with the person who generated the original data, the best thing is to get in touch with that person. Keep in mind also that, depending on how far you are from the center of the restraint, you may be applying a relatively large harmonic force to a single atom (the rest of the atoms are dummies), and may be seeing instabilities due to the large forces.
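For a rough sense of the forces involved (an illustrative estimate only, using the forceConstant of 25.0 from the configuration below and a hypothetical offset of 2 distance units from the window center): the harmonic restraint force is $|F| = k\,|\xi - \xi_0| = 25.0 \times 2 = 50$ in the energy/distance units of the chosen LAMMPS unit style, and all of it is applied to the single real atom 2209, since ref and ref2 are dummy atoms.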
Thank you for your reply. I'll contact the person again. Perhaps you are right and I'm missing some key information...
Again, thanks for your help.
Nina
Hello,
I have a couple of issues running MD with the fix colvars command in LAMMPS (13 Apr 2017) and the collective variables module, version 2017-03-09, for a flexible zeolite-like crystal with a single gas molecule diffusing through it; the system is simulated with periodic boundary conditions.
1) It seems that for certain values of the colvar centers the calculations crash randomly. By this I mean that for the same calculation, with exactly the same files (input, data, FF and colvars), the job either crashes at some step with the error "bond atoms %d %d missing on proc %d at step %ld" or finishes properly. This behavior seems to be independent of the number of processors/nodes used, as well as of the initial position of the atom whose position is measured.
2) Even when a set of calculations finishes properly, my PMF along the set of colvars obtained via umbrella sampling is shifted (by about 7.44 Angstrom) compared to the data I am trying to reproduce. Note that I'm using WHAM to remove the bias. I don't understand the origin of this shift, since the position of atom #2209 used as main, and the ref/ref2 I have, are the same as in the data I want to reproduce.
As mentioned above, the LAMMPS input and data files are the same as for the set of calculations run successfully with LAMMPS (15 Sep 2016-ICMS) and the collective variables module version 2016-09-14 by someone else on a different computer cluster. The colvars file looks as shown below; the only thing that changes between windows is the ctre value, i.e., for a given umbrella window it is shifted by 0.5 to sample the reaction coordinate (rc) of the gas molecule (represented by the united-atom model, atom #2209) along the 111 axis (an example of two consecutive windows is shown after the template). ref corresponds to the center of pore 1 of the material and ref2 to the center of pore 2.
TEMPLATE

colvarsTrajFrequency 1
colvarsRestartFrequency 10000

colvar {
  name rc
  distanceZ {
    main { atomNumbers 2209 }
    scalable off
    ref  { dummyAtom (8.453, 8.453, 8.453) }
    ref2 { dummyAtom (16.907, 16.907, 16.907) }
    axis { (1, 1, 1) }
  }
}

harmonic {
  name spring
  colvars rc
  forceConstant 25.0
  centers ctre
}
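For example, two consecutive windows differ only in the center value (the numerical centers below are purely illustrative of the 0.5 spacing; each window has its own colvars file):

harmonic {
  name spring
  colvars rc
  forceConstant 25.0
  centers 3.0       # window i (illustrative value)
}

harmonic {
  name spring
  colvars rc
  forceConstant 25.0
  centers 3.5       # window i+1, shifted by 0.5 along rc
}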
I would appreciate help with solving these issues.
Thank you, Nina