Description of the Change: MLIR changes needed for custom gradients. Not yet integrated into the frontend.
Adds:
- ForwardOp
- ReverseOp
- ReturnOp
- CustomGradOp
The ForwardOp and ReverseOp are intended to be emitted by the frontend and to follow Enzyme's calling convention. E.g.:

```mlir
// The return is the tape type
gradient.forward @foo(%in : tensor<f64>, %diff : tensor<f64>, %out : tensor<f64>, %cotangent: tensor<f64>) -> (tensor<f64>, ... tensor<..>)
gradient.reverse @bar(%in : tensor<f64>, %diff : tensor<f64>, %out : tensor<f64>, %cotangent: tensor<f64>, %tape0: tensor<f64>, ... %tapeN: tensor<f64>)
```
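The forward/reverse split above can be illustrated with a minimal sketch in plain Python (a conceptual analogy only, not the actual MLIR or Enzyme API): the forward function computes the primal output and returns the residuals as a tape, and the reverse function consumes that tape together with the cotangent.

```python
def forward(x):
    # Primal computation: out = x**2. The tape stores the residuals
    # (here just x) that the reverse pass will need later.
    out = x * x
    tape = (x,)
    return out, tape

def reverse(cotangent, tape):
    # Reverse pass: pull the cotangent back through the primal
    # computation using the taped residuals.
    (x,) = tape
    return 2.0 * x * cotangent

out, tape = forward(3.0)
grad = reverse(1.0, tape)  # d(x*x)/dx at x=3 is 6
```

In the MLIR signatures above, the same pattern appears as the forward op returning the tape values and the reverse op taking `%tape0 ... %tapeN` as trailing arguments.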
The ReturnOp is necessary because func.return is only valid inside func.func. If there is no tape, the tape is a null pointer: returning an empty struct resulted in an error (which might be fixable), and there is no real disadvantage to using a null pointer since it is never dereferenced.
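The tape-less case can be sketched the same way (again a hedged Python analogy, with `None` standing in for the null tape pointer at the MLIR level): when the reverse pass needs no residuals, the tape placeholder is never inspected.

```python
def forward_no_tape(x):
    # No residuals are needed for this primal, so return None in
    # place of a tape (analogous to the null tape pointer).
    return x + 1.0, None

def reverse_no_tape(cotangent, tape):
    # The tape argument is never touched ("never dereferenced"),
    # so passing None is safe here.
    return cotangent  # d(x + 1)/dx == 1

out, tape = forward_no_tape(2.0)
grad = reverse_no_tape(1.0, tape)
```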
[sc-60512] [sc-60498]