Some common debugging steps could be partially automated, potentially making it faster to diagnose problems in specific systems. I'm collecting here some of the things we've thought to check during debugging sessions with @proteneer and @schmolly, with an eye towards adding `.debug()`, `.validate()`, or `.report()` methods to some of our simulation classes.
- [ ] Assert that the system contains no force field parameters that would cause instability, regardless of starting configuration
  - [ ] Nonbonded terms
    - [ ] vdW
      - [ ] Assert all epsilons > 0
    - [ ] Electrostatics
      - [ ] ?
    - [ ] vdW + electrostatics jointly
      - [ ] Assert no pairs where energy(distance) -> -inf as distance -> 0
      - [ ] Assert no "naked charges" (charged atoms with no vdW repulsion to prevent collapse)
  - [ ] Bonded terms
    - [ ] Harmonic{Bond|Angle} terms
      - [ ] Assert all force constants >= 0
    - [ ] {Proper|Improper}Torsion terms
      - [ ] Warn if equilibrium angle is in a numerically unstable range? (only relevant for certain methods of computing torsion angles)
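A rough sketch of what the parameter assertions above could look like — the array layouts here are assumptions, not our actual class interfaces: `lj_params` as an (N, 2) array of (sigma, epsilon), `charges` as an (N,) array, and bond/angle parameters as (M, 2) arrays of (force_constant, equilibrium_value).

```python
import numpy as np

def validate_ff_params(lj_params, charges, bond_params, angle_params):
    """Assert that force field parameters can't cause instability on their own.

    Hypothetical layouts: lj_params is (N, 2) of (sigma, epsilon),
    charges is (N,), bond/angle params are (M, 2) of (k, equilibrium).
    """
    sigmas, epsilons = lj_params[:, 0], lj_params[:, 1]

    # vdW: epsilon <= 0 removes (or inverts) the repulsive wall
    assert np.all(epsilons > 0), f"non-positive epsilons at atoms {np.where(epsilons <= 0)[0]}"

    # "naked charges": a charged atom with no vdW repulsion can collapse onto
    # an opposite charge, so energy -> -inf as distance -> 0
    naked = (np.abs(charges) > 0) & ((epsilons <= 0) | (sigmas <= 0))
    assert not naked.any(), f"naked charges at atoms {np.where(naked)[0]}"

    # bonded terms: negative force constants push atoms away from equilibrium
    for name, params in [("bond", bond_params), ("angle", angle_params)]:
        ks = params[:, 0]
        assert np.all(ks >= 0), f"negative {name} force constants at {np.where(ks < 0)[0]}"
```

These checks are configuration-independent, so they could run once at system construction time rather than per-frame.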
- [ ] Assert that the starting configuration contains no clashes, given the system parameters
  - [ ] Assert that force magnitudes on all atoms are < threshold (and save / visualize atoms where this assertion fails)
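The force-magnitude check might look roughly like this, assuming we already have a `forces` array evaluated at the starting configuration (the function name, `out_path` parameter, and default threshold are all placeholders):

```python
import numpy as np

def check_force_magnitudes(forces, coords, threshold=1e4, out_path=None):
    """Report atoms whose force magnitude exceeds a threshold, returning
    their indices so they can be saved / visualized.

    forces, coords: (N, 3) arrays. The 1e4 default is a placeholder,
    not a recommendation; it depends on the code's force units.
    """
    magnitudes = np.linalg.norm(forces, axis=1)
    bad = np.where(magnitudes > threshold)[0]
    if len(bad) > 0:
        print(f"{len(bad)} atom(s) exceed force threshold {threshold:.1e}:")
        for i in bad:
            print(f"  atom {i}: |F| = {magnitudes[i]:.3e} at {coords[i]}")
        if out_path is not None:
            # save offending atoms for later visual inspection
            np.savez(out_path, indices=bad, coords=coords[bad], forces=forces[bad])
    return bad
```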
- [ ] Assert that du_dl trajectories are reliable for TI
  - [ ] Assert that du_dl appears stable for all lambda (e.g. assert that stddev(du_dl) < 1e+5 or similar for all lambda)
    - [ ] If unstable, determine whether the problem arises during minimization, equilibration, or production
  - [ ] Warn under other conditions
    - [ ] Warning sign: du_dl time series drifts during production
    - [ ] Warning sign: autocorrelation time for the du_dl time series appears to be a large fraction of the production simulation length
    - [ ] Warning sign: stddev(du_dl) / sqrt(n_effective_samples) > threshold for any lambda window
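The du_dl checks above could be sketched as follows. The input layout (`dict` of lambda -> samples), both thresholds, and the crude integrated-autocorrelation-time estimate are all assumptions for illustration; a real implementation would likely use a proper statistical inefficiency estimator instead.

```python
import numpy as np

def check_du_dl(du_dl_by_lambda, stddev_threshold=1e5, sem_threshold=10.0):
    """Collect warnings about unreliable du/dl time series for TI.

    du_dl_by_lambda: dict mapping lambda -> 1D array of du/dl samples.
    Thresholds are placeholders; tune to the system's units.
    """
    warnings = []
    for lam, ts in du_dl_by_lambda.items():
        ts = np.asarray(ts, dtype=float)
        std = ts.std()
        if std > stddev_threshold:
            warnings.append(f"lambda={lam}: stddev(du_dl)={std:.3e} exceeds {stddev_threshold:.1e}")

        # crude integrated autocorrelation time from the normalized
        # autocovariance, summing positive lags until the first negative value
        x = ts - ts.mean()
        acf = np.correlate(x, x, mode="full")[len(x) - 1:]
        acf = acf / acf[0] if acf[0] > 0 else np.zeros_like(acf)
        tau = 1.0
        for t in range(1, len(acf)):
            if acf[t] < 0:
                break
            tau += 2.0 * acf[t]
        if tau > 0.1 * len(ts):
            warnings.append(f"lambda={lam}: autocorrelation time ~{tau:.0f} steps "
                            f"is a large fraction of {len(ts)} samples")

        # stderr of the mean, discounted by effective sample size
        n_eff = len(ts) / tau
        if std / np.sqrt(n_eff) > sem_threshold:
            warnings.append(f"lambda={lam}: stderr of mean(du_dl) exceeds {sem_threshold}")
    return warnings
```

Returning a list of warning strings (rather than asserting) keeps this usable inside a `.report()`-style method that aggregates issues across all lambda windows.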
- [ ] Assert atom mapping is reasonable (for single-topology relative free energy calculations)
  - [ ] ??? Many considerations, difficult for me to wrap my head around
  - [ ] Warn under geometric conditions (e.g. warn if any mapped atoms are far apart in their respective conformers)
  - [ ] In all cases, save atom mapping indices and a depiction for visual inspection
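The geometric warning, at least, seems straightforward to sketch. This assumes the two conformers are already aligned in a common frame; the names and the distance threshold are hypothetical.

```python
import numpy as np

def check_mapping_distances(core_pairs, conf_a, conf_b, max_dist=0.5):
    """Warn if any mapped atom pair is far apart across the two conformers.

    core_pairs: (K, 2) array of (index_in_A, index_in_B).
    conf_a, conf_b: (N, 3) coordinate arrays, assumed pre-aligned.
    max_dist: placeholder threshold in the coordinates' length units.
    """
    core_pairs = np.asarray(core_pairs)
    deltas = conf_a[core_pairs[:, 0]] - conf_b[core_pairs[:, 1]]
    dists = np.linalg.norm(deltas, axis=1)
    far = np.where(dists > max_dist)[0]
    for k in far:
        i, j = core_pairs[k]
        print(f"mapped pair ({i}, {j}) is {dists[k]:.3f} apart")
    return far
```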
- [ ] Other informative checks:
  - [ ] Compare results across different machines / environments
    - [ ] Warn if any of the above assertions / warnings are triggered on one machine but not another
    - [ ] Warn if the final estimate is significantly different across machines
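"Significantly different" for the final estimates could mean something as simple as a combined-standard-error comparison — a rough consistency check, with the function name and sigma cutoff as placeholders:

```python
import math

def compare_estimates(est_a, err_a, est_b, err_b, n_sigma=3.0):
    """Warn if free energy estimates from two machines disagree by more
    than n_sigma combined standard errors. Returns True if consistent."""
    combined = math.hypot(err_a, err_b)
    z = abs(est_a - est_b) / combined if combined > 0 else float("inf")
    if z > n_sigma:
        print(f"estimates differ by {z:.1f} sigma: {est_a} vs {est_b}")
        return False
    return True
```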