dfdx / Yota.jl

Reverse-mode automatic differentiation in Julia

Revert single-pass tracing/rrule transformation #124

Closed · dfdx closed this 1 year ago

dfdx commented 1 year ago

Use the two-step approach again. Although this makes the first call to grad() slower, tracing still doesn't seem to be a bottleneck.
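
For context, a minimal sketch of how that cost shows up from the user's side (the function, sizes, and timings are illustrative and not taken from this PR): the first grad() call traces the function and builds the gradient tape, while later calls reuse the cached tape.

```julia
using Yota

# Illustrative function, not from the PR.
f(W, x) = sum(tanh.(W * x))

W, x = rand(3, 4), rand(4)

@time grad(f, W, x)   # first call: traces f and compiles the gradient tape (slower)
@time grad(f, W, x)   # subsequent calls: reuse the cached tape (fast)

# grad returns the value and a tuple of gradients; here g[2] and g[3]
# correspond to the inputs W and x.
val, g = grad(f, W, x)
```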

codecov-commenter commented 1 year ago

Codecov Report

Merging #124 (4b9d669) into main (57347d2) will decrease coverage by 1.23%. The diff coverage is 97.95%.

@@            Coverage Diff             @@
##             main     #124      +/-   ##
==========================================
- Coverage   79.41%   78.18%   -1.24%     
==========================================
  Files           8        8              
  Lines         447      440       -7     
==========================================
- Hits          355      344      -11     
- Misses         92       96       +4     
| Impacted Files | Coverage Δ |
|---|---|
| src/grad.jl | 87.74% <97.72%> (+0.98%) ↑ |
| src/cr_api.jl | 91.07% <100.00%> (+0.40%) ↑ |
| src/rulesets.jl | 81.90% <100.00%> (-5.48%) ↓ |
| src/helpers.jl | 28.78% <0.00%> (-0.45%) ↓ |
| src/gradcheck.jl | 92.85% <0.00%> (+0.54%) ↑ |
| src/update.jl | 95.00% <0.00%> (+0.55%) ↑ |
