molmod / openmm-tutorial-msbs

OpenMM tutorial for the MSBS course

About heating step in OpenMM #32

Open shaoqx opened 1 year ago

shaoqx commented 1 year ago

Thanks for the amazing tutorial! It is very helpful!

I wonder why, in most OpenMM protocols, the temperature is set directly to the production temperature. In comparison, most Amber tutorials include a separate heating step that gradually raises the temperature from 0 K to the production temperature. Is there some implicit mechanism in OpenMM that handles the heating, or is the heating itself not really necessary?

All the Best, QZ

tovrstra commented 1 year ago

Thanks for the question!

Before commenting on it, I should say that there are simulation protocols in which slow heating is meaningful, e.g. to estimate a melting temperature. I'm assuming this is not related to your question and that you are primarily interested in reaching an equilibrium configuration at a single temperature of interest.

I have never found a convincing justification for slowly raising the temperature, when just trying to equilibrate the system. (If you could provide more insight to change my mind, please do. Never too old to learn something new.)

When starting an MD simulation from an optimized geometry, the atomic positions are obviously "too cold". During the MD, they need to "warm up" to reach thermodynamic equilibrium. Proper thermostats (i.e. not Berendsen) are designed to handle this and do so effectively on the timescale of the thermostat. If you want to speed up this process, it may help to sample the initial velocities of the atoms from a Boltzmann distribution at double the temperature of the thermostat, to compensate for the fact that the atoms are effectively at 0 K after a geometry optimization. (It is only a small win, but it's a win.)
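To make the doubled-temperature trick concrete, here is a minimal NumPy sketch (not taken from the tutorial; the atom masses and system size are purely illustrative) that draws velocities at twice the target temperature and checks the resulting kinetic temperature. In OpenMM itself this amounts to a single call, `simulation.context.setVelocitiesToTemperature(2 * temperature)`.

```python
import numpy as np

k_B = 0.008314462618  # Boltzmann constant in kJ/(mol*K)

def sample_velocities(masses, temperature, rng):
    """Draw Cartesian velocities (nm/ps) from a Maxwell-Boltzmann distribution."""
    # Per-atom standard deviation of each velocity component: sqrt(kB*T/m).
    sigma = np.sqrt(k_B * temperature / masses)[:, None]
    return sigma * rng.standard_normal((len(masses), 3))

def kinetic_temperature(masses, velocities):
    """Instantaneous kinetic temperature from sum(m*v^2/2) = (3/2)*N*kB*T."""
    kinetic_energy = 0.5 * np.sum(masses[:, None] * velocities**2)
    return 2.0 * kinetic_energy / (3.0 * len(masses) * k_B)

rng = np.random.default_rng(42)
masses = np.full(10000, 12.0)  # 10000 carbon-like atoms (g/mol), illustrative
target_temperature = 300.0     # K
# Draw at double the thermostat temperature: roughly half of the kinetic
# energy will flow into the (initially cold) potential degrees of freedom.
velocities = sample_velocities(masses, 2 * target_temperature, rng)
print(kinetic_temperature(masses, velocities))  # ~600 K
```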

Let me try to think of a potential good reason for slowly heating the system. (Still, no guarantee that it is worth it.) Let's assume the initial state is an unfavorable (unrealistic) structure, far from and high above the free-energy minimum of the equilibrium state. When the trajectory escapes this metastable initial state, a significant amount of potential energy is released. This may temporarily overheat the system, pushing it into a configuration that is not of interest (e.g. a denatured state). This is possible, but it still seems unlikely: a Langevin thermostat, for example, is very effective at damping out such local excess energy. Even if this were a problem, I'm not sure a slow increase of the temperature would be an effective solution. What matters is how efficiently the thermostat can absorb the extra energy, and that is not necessarily related to its temperature.
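To illustrate the damping argument, here is a toy sketch (purely illustrative, not OpenMM code) of Langevin dynamics applied to the velocities of independent particles: starting from an overheated kinetic temperature, the excess decays on a timescale of order 1/(2*gamma), regardless of how far above the thermostat temperature it starts.

```python
import numpy as np

k_B = 0.008314462618  # Boltzmann constant in kJ/(mol*K)
rng = np.random.default_rng(0)

n_particles, mass = 20000, 12.0   # illustrative numbers (g/mol)
T_target, T_hot = 300.0, 900.0    # thermostat vs. initial "overheated" T (K)
gamma, dt, n_steps = 5.0, 0.002, 1000  # friction (1/ps), timestep (ps), 2 ps total

# Start overheated, as if a burst of potential energy had just been released.
velocities = np.sqrt(k_B * T_hot / mass) * rng.standard_normal(n_particles)

for _ in range(n_steps):
    # Euler-Maruyama step of the Langevin velocity equation:
    # dv = -gamma*v*dt + sqrt(2*gamma*kB*T/m)*dW
    velocities += (-gamma * velocities * dt
                   + np.sqrt(2.0 * gamma * k_B * T_target / mass * dt)
                   * rng.standard_normal(n_particles))

# One velocity component per particle, so m*<v^2> = kB*T.
T_kinetic = mass * np.mean(velocities**2) / k_B
print(T_kinetic)  # relaxed from 900 K to ~300 K; 1/(2*gamma) is only 0.1 ps
```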

Another concern with the slow heating is that it introduces extra hyperparameters that need to be fixed somehow, probably ad hoc, e.g. at which temperature should one start, how slowly should one increase the temperature, etc. It can be difficult to make such choices objectively.
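As a concrete (hypothetical) illustration of those extra hyperparameters: even the simplest linear heating schedule already requires a start temperature, a number of stages, and a number of steps per stage, none of which has an obvious objective value. The OpenMM-style calls in the comment (`setTemperature`, `simulation.step`) indicate how such a ramp would typically be driven; `steps_per_stage` is yet another free choice.

```python
def heating_schedule(t_start, t_final, n_stages):
    """Stage temperatures (K) for a linear ramp; every argument is an ad hoc choice."""
    step = (t_final - t_start) / (n_stages - 1)
    return [t_start + i * step for i in range(n_stages)]

# In an OpenMM-style protocol, each stage would reset the thermostat
# temperature and run a fixed (also ad hoc) number of steps, roughly:
#   for temperature in heating_schedule(50.0, 300.0, 10):
#       integrator.setTemperature(temperature * unit.kelvin)
#       simulation.step(steps_per_stage)
print(heating_schedule(50.0, 300.0, 6))  # [50.0, 100.0, 150.0, 200.0, 250.0, 300.0]
```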

I hope this helps, and I'd be happy to hear more thoughts on this. It is a common question, so it would be nice to have a more definitive answer.

shaoqx commented 11 months ago

Thank you so much for the answer! I agree that there is no hard evidence backing the "heating" step; it just shows up in most Amber tutorials.

I like your way of trying to think of a potential good reason. My thought on this is that it could influence the variance across replica simulations introduced by the initial velocity assignment. If I understand thermostats correctly, directly assigning velocities at the target temperature gives larger initial velocities than starting from 0 K and heating gradually. However, it is not clear which approach samples better, or whether the former could lead the system into an unwanted region of conformational space at the target temperature.

I will update if I have more insights/data about this.