mlcolab / Pathfinder.jl

Preheat your MCMC
https://mlcolab.github.io/Pathfinder.jl/
MIT License

Use LogDensityProblemsAD with ADTypes #198

Closed sethaxen closed 2 months ago

sethaxen commented 2 months ago

LogDensityProblemsAD has ADTypes integration for defining gradients and hessians of the log-density. DynamicPPL.jl then customizes this integration for DynamicPPL.LogDensityFunction (see e.g. https://github.com/TuringLang/DynamicPPL.jl/blob/master/ext/DynamicPPLForwardDiffExt.jl).
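The integration described above can be sketched as follows. This is a minimal illustrative example, not code from this PR; the problem type `NormalProblem` is hypothetical, but `ADgradient` with an ADTypes backend and `logdensity_and_gradient` are the real LogDensityProblemsAD/LogDensityProblems API.

```julia
using LogDensityProblems, LogDensityProblemsAD, ADTypes, ForwardDiff

# Hypothetical minimal LogDensityProblem: an isotropic standard normal.
struct NormalProblem
    d::Int
end
LogDensityProblems.logdensity(::NormalProblem, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(p::NormalProblem) = p.d
LogDensityProblems.capabilities(::Type{NormalProblem}) =
    LogDensityProblems.LogDensityOrder{0}()

prob = NormalProblem(2)
# ADgradient takes an ADTypes backend; packages like DynamicPPL.jl can
# specialize this call for their own problem types, which is what the
# switch proposed here lets Pathfinder benefit from automatically.
∇prob = ADgradient(AutoForwardDiff(), prob)
LogDensityProblems.logdensity_and_gradient(∇prob, [1.0, 2.0])
# → (-2.5, [-1.0, -2.0])
```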

Switching to LogDensityProblemsAD's gradient and Hessian functions instead of Optimization's ensures that when such customizations are made, whether to improve performance or to work around an issue unique to that LogDensityProblem, we then do the right thing. A potential downside is that Optimization.jl may support more AD backends than LogDensityProblemsAD.jl, and we would not support those extra backends. However, a simple workaround exists: a user could pass Base.Fix1(LogDensityProblems.logdensity, prob) to pathfinder or multipathfinder. The wrapper would not be identifiable as a LogDensityProblem, so Optimization's ADTypes integration would be used instead.
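The workaround might look like the sketch below. The tiny problem type `Flat` is a hypothetical stand-in, and the exact `pathfinder` keyword arguments are an assumption; only the `Base.Fix1` wrapping trick is the point being illustrated.

```julia
using Pathfinder, LogDensityProblems

# Hypothetical stand-in problem: standard normal log-density in 2 dimensions.
struct Flat end
LogDensityProblems.logdensity(::Flat, x) = -sum(abs2, x) / 2
LogDensityProblems.dimension(::Flat) = 2
LogDensityProblems.capabilities(::Type{Flat}) =
    LogDensityProblems.LogDensityOrder{0}()

# Base.Fix1 partially applies logdensity, yielding a plain callable.
# Pathfinder then cannot identify it as a LogDensityProblem, so it falls
# back to Optimization.jl's ADTypes integration for the extra AD backends.
logp = Base.Fix1(LogDensityProblems.logdensity, Flat())
result = pathfinder(logp; dim=2)  # assumed keyword signature for a callable
```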

Edit: relates #93

codecov[bot] commented 2 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 80.83%. Comparing base (3b29881) to head (b9ee54a).

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #198      +/-   ##
==========================================
+ Coverage   80.61%   80.83%   +0.21%
==========================================
  Files          13       13
  Lines         614      621       +7
==========================================
+ Hits          495      502       +7
  Misses        119      119
```

:umbrella: View full report in Codecov by Sentry.