Closed: sritchie closed this 9 months ago
Attention: 18 lines in your changes are missing coverage. Please review.
Comparison is base (`e674ac5`) 87.57% compared to head (`c1c4a86`) 87.73%.

:exclamation: Current head `c1c4a86` differs from pull request most recent head `eb3905b`. Consider uploading reports for the commit `eb3905b` to get more accurate results.
| Files | Patch % | Lines |
|---|---|---|
| `src/emmy/tape.cljc` | 94.91% | 6 Missing and 9 partials :warning: |
| `src/emmy/differential.cljc` | 94.82% | 0 Missing and 3 partials :warning: |
Having read this twice, I think it's a beautifully compact implementation of the idea. I'm eager to try to understand how it works by printing out the tape structure before the reverse pass.
This implementation of reverse-mode AD is modeled after the approach described in the dysvunctional language docs. The current version supports `gradient` calls.

From the CHANGELOG:
#154:

Adds `emmy.tape` with an implementation of reverse-mode automatic differentiation. The implementation is based on Alexey Radul's implementation in dvl, and seems to be higher-performance by quite a bit and capable of replacing our forward-mode implementation.

The centerpiece of the implementation is `emmy.tape/gradient`, which can handle $R^n \to R^m$ functions, as well as nested derivatives.

All operations supported by [[emmy.differential/Differential]] are supported by the backing [[emmy.tape/TapeCell]] instance.
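
As a rough illustration of that $R^n \to R^m$ claim, here is a minimal sketch of calling `emmy.tape/gradient` on a two-variable function. The calling convention (a vector argument, the shape of the returned gradient) is an assumption based on the description above, not the confirmed API:

```clojure
;; Minimal sketch; the exact signature of tape/gradient is assumed.
(require '[emmy.tape :as tape]
         '[emmy.generic :as g])

;; f : R^2 -> R, f([x y]) = x^2 * y
(defn f [[x y]]
  (g/* (g/square x) y))

;; By hand: df/dx = 2xy = 12 and df/dy = x^2 = 9 at [3 2].
((tape/gradient f) [3 2])
;; => [12 9]   (result shape assumed)
```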
What we're missing:

- a `Grad` operator?
- extending `lift-2` to be able to handle tapes interacting with differential instances (see the sketch after this list)
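
To make the `lift-2` item concrete, here is a purely illustrative, self-contained toy, not emmy's actual code: `ToyTape`, `ToyDiff`, and `tag-of` are hypothetical stand-ins for the real machinery. It shows the dispatch rule a binary lift needs when reverse-mode and forward-mode values meet in one computation: compare perturbation tags, and treat the argument with the older tag as a constant for the innermost derivative layer.

```clojure
;; Toy sketch, NOT emmy's implementation. Each perturbed value carries a
;; numeric tag identifying the derivative layer that introduced it; a
;; larger tag means a more recently introduced (innermost) perturbation.
(defrecord ToyTape [tag primal])
(defrecord ToyDiff [tag primal tangent])

(defn tag-of
  "Returns the perturbation tag of x, or -1 for plain constants."
  [x]
  (cond (instance? ToyTape x) (:tag x)
        (instance? ToyDiff x) (:tag x)
        :else -1))

(defn dispatch-lift-2
  "Decides which argument is 'active' for the innermost derivative layer.
  A real lift-2 would then recur, treating the other argument as a
  constant with respect to that layer's tag."
  [x y]
  (let [tx (tag-of x)
        ty (tag-of y)]
    (cond (and (neg? tx) (neg? ty)) :plain-arithmetic
          (> tx ty)                 :x-active
          :else                     :y-active)))

;; A tape from an outer reverse pass meeting a differential from an
;; inner forward pass: the differential's (newer) tag wins.
(dispatch-lift-2 (->ToyTape 0 3) (->ToyDiff 1 2 1))
;; => :y-active
```

The point is only the tag comparison; the real extension would also need the recursive cases that push the computation through a `TapeCell`'s partials or a `Differential`'s tangent.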