As discussed in our roadmap, we want to have units associated with entry points.
Two key spots in the code that require this:
loading hdf5 files:
The use of the curated datasets should already ensure proper units, but checking the units against expected ones would provide an additional consistency check.
The expected units should be defined in the context of the 'available_properties' for a given dataset (i.e. convert the list of properties to a dict).
If checking units for every record is prohibitively slow, the default behavior can be to check only the first entry (full checking could be enabled by an optional flag).
For clarity, we should print the specific units being used to both the log and the screen.
Since we are using curated datasets, I do not think we need to implement unit conversions (although this would be trivial); this is mostly to ensure we do not have inconsistencies.
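A minimal sketch of what such a check could look like, purely for illustration: it assumes (hypothetically) that each property in the curated hdf5 file stores its unit string in a "units" attribute, and that the expected-units dict has been built from 'available_properties'. The function name, attribute name, property names, and units below are assumptions, not the actual file layout or API.

```python
# Sketch only: the "units" attribute name, property names, and expected units
# are illustrative assumptions, not the actual curated-file layout.
import logging

import h5py
from openff.units import unit

logger = logging.getLogger(__name__)

# hypothetical dict built from 'available_properties': property name -> expected unit
EXPECTED_UNITS = {
    "geometry": unit.nanometer,
    "energies": unit.kilojoule / unit.mole,
    "forces": unit.kilojoule / (unit.mole * unit.nanometer),
}


def check_hdf5_units(path: str, expected: dict, check_all: bool = False) -> None:
    """Verify stored units against expected ones; check only the first record by default."""
    with h5py.File(path, "r") as f:
        record_names = list(f.keys())
        if not check_all:
            # default: only check the first entry to keep loading fast
            record_names = record_names[:1]
        for name in record_names:
            for prop, expected_unit in expected.items():
                stored = f[name][prop].attrs["units"]  # assumed per-property attribute
                if unit.Unit(stored) != expected_unit:
                    raise ValueError(
                        f"Property '{prop}' in record '{name}' has units "
                        f"'{stored}', expected '{expected_unit}'."
                    )
                # report the units actually in use to both the log and the screen
                logger.info("Property '%s' uses units '%s'.", prop, stored)
                print(f"Property '{prop}' uses units '{stored}'.")
```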
passing cutoffs (e.g., distance cutoffs, force cutoffs):
Rather than just encoding the expected unit in the docstring, we should require the user to assign units to the variable as an openff-units Quantity (e.g., 0.5 * unit.nanometer).
In this case I think we should absolutely support conversion to the appropriate unit (with a warning/note to the user that we are converting to the expected unit).
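A sketch of the kind of validation/conversion this implies for cutoffs, assuming nanometer as the expected unit; the function name and expected unit are hypothetical choices for illustration.

```python
# Sketch only: the function name and expected unit are illustrative assumptions.
import warnings

from openff.units import unit


def validate_cutoff(cutoff, expected_unit=unit.nanometer):
    """Require an openff.units Quantity and convert it to the expected unit."""
    if not isinstance(cutoff, unit.Quantity):
        raise TypeError(
            "cutoff must carry units, e.g. cutoff=0.5 * unit.nanometer"
        )
    if not cutoff.is_compatible_with(expected_unit):
        raise ValueError(
            f"cutoff has units '{cutoff.units}', which are incompatible with "
            f"the expected unit '{expected_unit}'."
        )
    if cutoff.units != expected_unit:
        # warn the user that we are converting to the expected unit
        warnings.warn(
            f"Converting cutoff from '{cutoff.units}' to '{expected_unit}'."
        )
        cutoff = cutoff.to(expected_unit)
    return cutoff


# example: a cutoff given in angstrom is converted (with a warning) to 0.5 nanometer
cutoff = validate_cutoff(5.0 * unit.angstrom)
```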