grimme-lab / dxtb

Efficient And Fully Differentiable Extended Tight-Binding
https://dxtb.readthedocs.io
Apache License 2.0

IndexHelper reduce/spread not working in batched mode for two dimensions #106

Closed marvinfriede closed 1 year ago

marvinfriede commented 1 year ago

Reducing/spreading a tensor of shape (b, n, n) over two dimensions does not work: something like reduce_orbital_to_atom(x, dim=(-2, -1)) fails.
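The failing two-dimensional reduction can be emulated in plain PyTorch by reducing one orbital dimension at a time with index_add_. A minimal sketch, not the dxtb implementation; the orbital-to-atom index map and the batch size below are illustrative assumptions:

```python
import torch

# Hypothetical orbital->atom map: 4 orbitals on atom 0, 2 each on atoms 1 and 2.
index = torch.tensor([0, 0, 0, 0, 1, 1, 2, 2])
b, nat, norb = 5, 3, index.numel()

x = torch.rand(b, norb, norb)  # batched orbital-resolved matrix

# Step 1: reduce dim -2, (b, norb, norb) -> (b, nat, norb).
atomwise = x.new_zeros(b, nat, norb).index_add_(1, index, x)
# Step 2: reduce dim -1, (b, nat, norb) -> (b, nat, nat).
atomwise = atomwise.new_zeros(b, nat, nat).index_add_(2, index, atomwise)

print(atomwise.shape)  # torch.Size([5, 3, 3])
```

Reducing the two dimensions sequentially is equivalent to a joint reduction because the scatter-sum over one axis commutes with the scatter-sum over the other.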

hoelzerC commented 1 year ago

This occurs when calculating the overlap gradient as an orbital-wise tensor and applying the IndexHelper to obtain an atom-wise representation.

# example H2O
import torch

from dxtb import Calculator  # import path may differ by version
from dxtb.param import GFN1_XTB as par  # GFN1 parametrization

numbers = torch.tensor([8, 1, 1])  # O, H, H
norb = 8

calc = Calculator(numbers, par)
doverlap = torch.zeros([norb, norb, 3])
# doverlap = torch.unsqueeze(doverlap, dim=0)  # doesn't help

new = calc.ihelp.reduce_orbital_to_atom(doverlap, dim=(-2, -1))

This results in src/dxtb/utils/scattergather.py:78: RuntimeError: The expanded size of the tensor (-1) isn't allowed in a leading, non-existing dimension 0

hoelzerC commented 1 year ago

It would be nice if the IndexHelper could also operate on cases like [norb, norb, X], where X is a trailing part of arbitrary shape. However, this is currently not crucial and can be deprioritized.
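One generic way to support an arbitrary trailing shape X is to flatten the extra dimensions, scatter along the orbital axis, and reshape back. A sketch in plain PyTorch; the function name and the orbital-to-atom map are hypothetical, not the dxtb API:

```python
import torch


def reduce_leading_orbitals(x: torch.Tensor, index: torch.Tensor, nat: int) -> torch.Tensor:
    """Sum the leading orbital dimension of x, shaped (norb, *extra),
    into atoms, giving (nat, *extra), for any trailing shape."""
    norb, *extra = x.shape
    flat = x.reshape(norb, -1)  # (norb, prod(extra)); -1 infers 1 if extra is empty
    out = flat.new_zeros(nat, flat.shape[1]).index_add_(0, index, flat)
    return out.reshape(nat, *extra)


# Hypothetical orbital->atom map: 4 orbitals on atom 0, 2 each on atoms 1 and 2.
index = torch.tensor([0, 0, 0, 0, 1, 1, 2, 2])
x = torch.rand(8, 8, 3, 2)
print(reduce_leading_orbitals(x, index, 3).shape)  # torch.Size([3, 8, 3, 2])
```

Because only the leading dimension is scattered, the same helper works whether the trailing part is a Cartesian axis of size 3, a batch of such axes, or nothing at all.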

marvinfriede commented 1 year ago

This now works, but only for these specific cases; there are currently no other applications.

# example H2O
norb = 8

calc = Calculator(numbers, par)
doverlap = torch.zeros([norb, norb, 3])
print(doverlap.shape)
# torch.Size([8, 8, 3])

d = calc.ihelp.reduce_orbital_to_atom(doverlap, dim=-3, extra=True)
print(d.shape)
# torch.Size([3, 8, 3])
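To obtain a fully atom-wise gradient of shape (nat, nat, 3), the same idea can be applied once per orbital dimension. A plain-PyTorch sketch using index_add_ rather than the dxtb API; the orbital-to-atom map below is an illustrative assumption for H2O:

```python
import torch

# Hypothetical orbital->atom map for H2O: 4 orbitals on O, 2 on each H.
index = torch.tensor([0, 0, 0, 0, 1, 1, 2, 2])
nat = 3
doverlap = torch.rand(8, 8, 3)  # (norb, norb, 3) orbital-wise overlap gradient

# Reduce the first orbital dimension: (norb, norb, 3) -> (nat, norb, 3).
d = doverlap.new_zeros(nat, 8, 3).index_add_(0, index, doverlap)
# Reduce the second orbital dimension: (nat, norb, 3) -> (nat, nat, 3).
d = d.new_zeros(nat, nat, 3).index_add_(1, index, d)

print(d.shape)  # torch.Size([3, 3, 3])
```

The trailing Cartesian axis of size 3 is never touched by either scatter, which matches the "extra dimension" behavior described above.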