PennyLaneAI / catalyst

A JIT compiler for hybrid quantum programs in PennyLane
https://docs.pennylane.ai/projects/catalyst
Apache License 2.0

Add automatic differentiation for accelerated functions #920

Closed erick-xanadu closed 1 month ago

erick-xanadu commented 1 month ago

Context: Functions that have been accelerated are compatible with JAX, which means JAX can also auto-differentiate them. So, let's leverage JAX to auto-differentiate them.
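For context, here is a minimal sketch of the kind of usage this enables. The function names are illustrative, and the exact composition of `accelerate` with `grad` may differ from the shipped examples; the point is that an `accelerate`d classical function called inside a `qjit`ted workflow can now sit on the differentiation path, with its derivative delegated to JAX.

```python
import jax.numpy as jnp
from catalyst import qjit, grad, accelerate

# Classical JAX code executed through the accelerate callback mechanism.
# (Hypothetical example function.)
@accelerate
def classical_fn(x):
    return jnp.sin(x) ** 2

# Differentiating the compiled workflow should now work even though
# classical_fn runs via a callback: its derivative is computed by JAX.
@qjit
@grad
def workflow(x: float):
    return classical_fn(x) + x

workflow(0.5)
```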

Description of the Change:

Benefits: Better UI

Possible Drawbacks: The callback code is becoming harder to reason about with all the special cases: accelerate, pure_callback, and with/without gradient support.

Related GitHub Issues:

[sc-60775]

codecov[bot] commented 1 month ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 97.98%. Comparing base (f1c846d) to head (792b0f1). Report is 1 commit behind head on main.

Additional details and impacted files

```diff
@@           Coverage Diff            @@
##             main     #920    +/-   ##
========================================
  Coverage   97.98%   97.98%
========================================
  Files          71       71
  Lines       10497    10546     +49
  Branches      956      960      +4
========================================
+ Hits        10285    10334     +49
  Misses        169      169
  Partials       43       43
```

:umbrella: View full report in Codecov by Sentry.