JuliaDiff / ReverseDiff.jl

Reverse Mode Automatic Differentiation for Julia

Document the interaction behaviour of `@grad` and `compile`. #250

Open jacobusmmsmit opened 7 months ago

jacobusmmsmit commented 7 months ago

TLDR:

This PR adds documentation to `@grad` that specifies what a `@grad`-defined function should return, and documents that any variables defined outside of the returned adjoint function have their values frozen at whatever they held when the tape was compiled.
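For concreteness, here is a minimal sketch of the return convention under discussion (the function name `cubesum` is a made-up illustration, not part of the PR): a method defined with `@grad` should return the primal value paired with an adjoint closure that maps the output cotangent to a tuple of input cotangents, one per argument.

```julia
using ReverseDiff
using ReverseDiff: @grad, TrackedArray, track, value

# Hypothetical example: f(x) = sum(x.^3) with a hand-written gradient.
cubesum(x) = sum(x .^ 3)
cubesum(x::TrackedArray) = track(cubesum, x)  # route tracked inputs onto the tape

@grad function cubesum(x)
    xv = value(x)  # untracked input values
    # Return (primal value, adjoint): the adjoint takes the output
    # cotangent Δ and returns one cotangent per argument.
    return sum(xv .^ 3), Δ -> (Δ .* 3 .* xv .^ 2,)
end
```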

Context:

In this discussion from issue #243 I fumbled my way through working out how `@grad` interacts with `compile`. Unsurprisingly to anyone who understands adjoints, I found that any intermediate variables defined in the body of the function can be used, but not updated, once the gradient is compiled with `compile`.
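To make the frozen-intermediate behaviour concrete, here is a sketch under the semantics described above (the function name `stale3` is made up for illustration): the adjoint closure captures `x2` from the forward body, so a compiled tape keeps reusing the captured value on later runs.

```julia
using ReverseDiff
using ReverseDiff: @grad, TrackedArray, track, value

stale3(x) = sum(x .^ 3)
stale3(x::TrackedArray) = track(stale3, x)

@grad function stale3(x)
    xv = value(x)
    x2 = xv .^ 2  # intermediate defined OUTSIDE the returned adjoint...
    return sum(xv .* x2), Δ -> (Δ .* 3 .* x2,)  # ...but captured by it
end

x0 = [1.0, 2.0, 3.0]
ctape = ReverseDiff.compile(ReverseDiff.GradientTape(stale3, x0))

x1 = [2.0, 2.0, 2.0]
g = ReverseDiff.gradient!(similar(x1), ctape, x1)
# Per the behaviour documented here, g reflects the frozen x2 = x0 .^ 2
# (i.e. 3 .* x0 .^ 2), not the correct 3 .* x1 .^ 2. Computing x2 inside
# the adjoint closure instead would keep the compiled tape correct.
```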

This PR is just a bit of documentation added to the unexported macro `@grad` so that anyone else who dives into ReverseDiff has an easier time understanding the concrete effects of `compile`.

Any suggestions on changes to language or wording are more than welcome.

codecov[bot] commented 7 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Comparison is base (`c982cde`) 81.53% compared to head (`2571a16`) 81.53%.

Additional details and impacted files

```diff
@@           Coverage Diff           @@
##           master     #250   +/-   ##
=======================================
  Coverage   81.53%   81.53%
=======================================
  Files          18       18
  Lines        1587     1587
=======================================
  Hits         1294     1294
  Misses        293      293
```

:umbrella: View full report in Codecov by Sentry.