sethaxen closed this 2 months ago
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 80.83%. Comparing base (3b29881) to head (b9ee54a).
LogDensityProblemsAD has ADTypes integration for defining gradients and Hessians of a log-density, and DynamicPPL.jl customizes this integration for `DynamicPPL.LogDensityFunction` (see e.g. https://github.com/TuringLang/DynamicPPL.jl/blob/master/ext/DynamicPPLForwardDiffExt.jl). Switching to LogDensityProblemsAD's gradient and Hessian functions instead of Optimization's ensures that when such customizations are made to improve performance or to work around an issue unique to a particular LogDensityProblem, we do the right thing. A potential downside is that Optimization.jl may support more AD backends than LogDensityProblemsAD.jl, and we would not support those extra backends. However, a simple workaround is for a user to pass `Base.Fix1(LogDensityProblems.logdensity, prob)` to `pathfinder` or `multipathfinder`; the wrapped function is not identifiable as a LogDensityProblem, so Optimization's ADTypes integration is used instead.

Edit: relates #93