Recently we've discussed implementing automatic differentiation (AD) in FTorch using the autograd tool built into Torch. The first step to achieve this would be to overload elementary operations for the Fortran `torch_tensor` class: assignment, addition, multiplication of two tensors, multiplication of a tensor by a scalar, etc.
Related to #111.
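As a rough sketch of what these overloads might look like, using Fortran's generic `interface` blocks (note: the module, procedure, and binding names below are hypothetical illustrations, not FTorch's actual API):

```fortran
! Hypothetical sketch -- names are illustrative, not FTorch's actual API.
module torch_tensor_overloads
  use ftorch, only : torch_tensor   ! assumed module/type names
  implicit none

  interface assignment(=)
    module procedure torch_tensor_assign
  end interface

  interface operator(+)
    module procedure torch_tensor_add
  end interface

  interface operator(*)
    module procedure torch_tensor_multiply        ! tensor * tensor
    module procedure torch_tensor_scalar_multiply ! scalar * tensor
  end interface

contains

  subroutine torch_tensor_assign(lhs, rhs)
    type(torch_tensor), intent(out) :: lhs
    type(torch_tensor), intent(in)  :: rhs
    ! Delegate to a C binding wrapping the corresponding Torch
    ! operation, so autograd can record it on the tape.
  end subroutine torch_tensor_assign

  function torch_tensor_add(a, b) result(c)
    type(torch_tensor), intent(in) :: a, b
    type(torch_tensor) :: c
    ! Delegate to Torch's addition via a C binding.
  end function torch_tensor_add

  function torch_tensor_multiply(a, b) result(c)
    type(torch_tensor), intent(in) :: a, b
    type(torch_tensor) :: c
    ! Delegate to Torch's elementwise multiplication via a C binding.
  end function torch_tensor_multiply

  function torch_tensor_scalar_multiply(scalar, a) result(c)
    real, intent(in) :: scalar
    type(torch_tensor), intent(in) :: a
    type(torch_tensor) :: c
    ! Delegate to Torch's scalar multiplication via a C binding.
  end function torch_tensor_scalar_multiply

end module torch_tensor_overloads
```

Because each overload would delegate to the corresponding Torch operation rather than operating on raw arrays, autograd should be able to track these operations and compute gradients through Fortran-side tensor arithmetic.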