Closed: mhauru closed this 2 months ago
@mhauru with #178 and #198 I made it so we use ADTypes internally and LogDensityProblemsAD for LogDensityProblems inputs. Then support for Turing models only requires 1) converting the Turing model to a LogDensityProblem (using `DynamicPPL.LogDensityFunction`) and 2) converting unconstrained draws to constrained draws (using a utility function). This greatly simplified the implementations. I also removed our own machinery for getting parameter names in favor of `Turing.Inference.getparams`, which also allows us to support models with dynamic constraints.
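For illustration, those two steps might look roughly like this (a minimal sketch under the assumptions above, not the actual integration code; it assumes `pathfinder` accepts any LogDensityProblems-compatible object, which is what #178/#198 set up):

```julia
using DynamicPPL, Pathfinder, Turing

# A toy model with a constrained parameter (σ > 0).
@model function demo(x)
    σ ~ truncated(Normal(); lower=0)
    x ~ Normal(0, σ)
end
model = demo(0.5)

# 1) Wrap the model as a LogDensityProblem on the unconstrained space.
vi = DynamicPPL.link!!(DynamicPPL.VarInfo(model), model)
prob = DynamicPPL.LogDensityFunction(model, vi)

# 2) Run Pathfinder on it; the resulting draws live in the
#    unconstrained space and are then mapped back to the
#    constrained space (the "utility function" step).
result = pathfinder(prob; ndraws=100)
unconstrained_draws = result.draws
```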
Tests now check correctness (including correctness of the Jacobian adjustment) by ensuring that an IID normal transformed with a bijector to a constrained space is fitted exactly by Pathfinder. This works because Pathfinder always fits an IID normal exactly, and in such cases the log-density in the unconstrained space is exactly that of the IID normal. These tests fail for previous Pathfinder versions.
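The idea behind the test, roughly (a hypothetical sketch, not the actual test suite): for `x ~ LogNormal()`, the unconstrained variable `log(x)` is exactly standard normal, and Pathfinder's fitted normal can only recover it if the Jacobian is handled correctly.

```julia
using Pathfinder, Statistics, Test, Turing

# log(x) is exactly N(0, 1) in the unconstrained space, but only if
# the log-Jacobian of the bijector is included in the log-density.
@model lognormal() = x ~ LogNormal()

result = pathfinder(lognormal(); ndraws=10)
fit = result.fit_distribution  # normal fit in the unconstrained space

# Exact up to optimizer tolerance; 1e-6 here is an arbitrary choice.
@test mean(fit) ≈ [0.0] atol = 1e-6
@test cov(fit) ≈ [1.0;;] atol = 1e-6
```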
If you have a chance, take a look and let me know if anything looks off.
All modified and coverable lines are covered by tests :white_check_mark: Project coverage is 86.85%, comparing base (93253e7) to head (823a515). Report is 1 commit behind head on main.
As discussed in https://github.com/TuringLang/Turing.jl/issues/2268. This reimplements the relevant parts of `Turing.optim_problem` and `Turing.optim_functions` that were removed in Turing v0.33. The new Turing MAP/MLE interface is unfortunately a bit too streamlined for Pathfinder, which needs access to the internal `OptimizationFunction`s. @sethaxen, feel free to take over and edit the code to your taste. Please also check the logic of how I'm transforming any initial values provided, and generating defaults from the prior when none are provided; I think the test suite wouldn't catch it if I had done it wrong.
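For reference, the initial-value logic being described might look something like this (a hypothetical sketch of the approach, not the PR's code; `initial_unconstrained_params` is an invented name):

```julia
using DynamicPPL

# Hypothetical helper: take optional user-supplied initial values
# (constrained space, flattened) or fall back to a prior draw, then
# return the corresponding unconstrained parameter vector.
function initial_unconstrained_params(model::DynamicPPL.Model, init_params=nothing)
    # Constructing a VarInfo samples from the prior, giving defaults.
    vi = DynamicPPL.VarInfo(model)
    if init_params !== nothing
        # Overwrite the prior draw with the user-provided values.
        vi = DynamicPPL.unflatten(vi, init_params)
    end
    # Transform to the unconstrained space the optimizer works in.
    vi = DynamicPPL.link!!(vi, model)
    return vi[:]
end
```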