PennyLaneAI / catalyst

A JIT compiler for hybrid quantum programs in PennyLane
https://docs.pennylane.ai/projects/catalyst
Apache License 2.0

[MLIR] Register custom gradients with Enzyme #822

Closed erick-xanadu closed 4 weeks ago

erick-xanadu commented 1 month ago

Description of the Change: MLIR changes needed for custom gradients; not yet integrated into the frontend. Adds new ForwardOp, ReverseOp, and ReturnOp operations.

The ForwardOp and ReverseOp are intended to be emitted by the frontend and to follow Enzyme's calling convention for custom gradients. E.g.,

// The return is the tape type
gradient.forward @foo(%in : tensor<f64>, %diff : tensor<f64>, %out : tensor<f64>, %cotangent : tensor<f64>) -> (tensor<f64>, ... tensor<..>)

gradient.reverse @bar(%in : tensor<f64>, %diff : tensor<f64>, %out : tensor<f64>, %cotangent : tensor<f64>, %tape0 : tensor<f64>, ... %tapeN : tensor<f64>)

The custom ReturnOp is necessary because func.return is only valid inside func.func. If there is no tape, a null pointer is passed in its place: returning an empty struct resulted in an error (which might be fixable), but there is no real disadvantage to the null pointer since it is never dereferenced.
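For intuition, the forward/reverse split above mirrors the usual custom-VJP pattern: the forward function produces the primal output plus a tape of residuals, and the reverse function consumes the cotangent together with that tape. A rough JAX sketch (not Catalyst/Enzyme code; the function `f` and its rules are hypothetical):

```python
import jax
import jax.numpy as jnp

@jax.custom_vjp
def f(x):
    return jnp.sin(x)

def f_fwd(x):
    # Forward pass: return (primal output, tape).
    # The tape holds whatever residuals the reverse pass will need.
    return jnp.sin(x), (x,)

def f_rev(tape, cotangent):
    # Reverse pass: consume the tape and the incoming cotangent,
    # return the gradient with respect to each input.
    (x,) = tape
    return (cotangent * jnp.cos(x),)

f.defvjp(f_fwd, f_rev)
```

Enzyme's registration interface expects the same division of labour, which is why the ForwardOp's results carry the tape type and the ReverseOp takes the tape values as trailing arguments.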

[sc-60512] [sc-60498]

erick-xanadu commented 1 month ago

@dime10 please approve :)