Closed wiederm closed 7 months ago
Merging #48 (0d5e0a1) into main (09cbfb6) will increase coverage by 0.17%. The diff coverage is 100.00%.
Description

The input `inputs["atomic_numbers"]` has shape `(nr_of_atoms_in_batch, 1)`, which was previously flattened in `BaseNNP`. Now we keep the dimensions and let the NNP decide what to do with them.

Question

Should we change the name of the input property `atomic_number` (https://github.com/choderalab/modelforge/blob/6dab8a1a376a8b0cf83d84ac927793154239e272/modelforge/potential/models.py#L172) to `atomic_properties`? The way we approach the embedding in this PR makes it flexible to hold other properties in this tensor, with the now flexible shape `(nr_of_atoms_in_batch, *, *)`. What do you think @ArnNag?

Status
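To illustrate the shape change, here is a minimal sketch using numpy as a stand-in for the torch-based code (the array names and the toy embedding table are hypothetical, not the actual modelforge implementation):

```python
import numpy as np

nr_of_atoms_in_batch = 5

# Before: atomic numbers were flattened to shape (nr_of_atoms_in_batch,)
atomic_numbers_flat = np.array([1, 6, 8, 1, 6])

# Now: the (nr_of_atoms_in_batch, 1) shape is preserved; the trailing axis
# can later hold additional per-atom properties, generalizing toward
# (nr_of_atoms_in_batch, *, *).
atomic_numbers = atomic_numbers_flat[:, None]
assert atomic_numbers.shape == (nr_of_atoms_in_batch, 1)

# Toy embedding lookup (stand-in for torch.nn.Embedding): each atomic
# number indexes a row of an embedding table. Whether to squeeze the
# trailing dimension first is now the NNP's decision.
embedding_table = np.random.rand(100, 8)
embedded = embedding_table[atomic_numbers[:, 0]]
assert embedded.shape == (nr_of_atoms_in_batch, 8)
```

The point of keeping the extra axis is that a future `atomic_properties` tensor could stack more per-atom columns without changing the input contract.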