jlperla closed this issue 2 years ago
@wupeifan Save out `c.h_x_p` and `c.g_x_p` from the old code for the gradient with respect to only the `beta` parameter. I think all we really need is the `c.h_x_p` to see whether things changed. If they did, that gives us a target for finding the discrepancy, and we can get fancier with unit tests after we fix the problem. If these values are the same as in the current (buggy) code, that leads us down a different path.
When we have those values, add them as hardcoded references to the unit test in https://github.com/HighDimensionalEconLab/DifferentiableStateSpaceModels.jl/blob/main/test/FVGQ20.jl#L112. But like I said, I think all we really need is the `c.h_x_p` for the `beta` one for now.
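A minimal sketch of saving those values out for later hardcoding. Everything here is an assumption for illustration: `c` is the solver cache from the old code as discussed above, `c.h_x_p` and `c.g_x_p` are assumed to be plain matrices for the `beta`-only gradient, and the file names are made up.

```julia
# Sketch only: `c` is the old code's solver cache; field names and file
# names are assumptions, not the package's confirmed API.
using DelimitedFiles

# Dump the beta-gradient derivative matrices so the numbers can be pasted
# into test/FVGQ20.jl as hardcoded reference values.
writedlm("h_x_p_beta_old.csv", c.h_x_p, ',')
writedlm("g_x_p_beta_old.csv", c.g_x_p, ',')
```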
Unit tests are now passing for the gradients with `_p` on them. We can save out those values; if that comparison passes, at least we know that the underlying calculations aren't broken.
We should be very careful not to let the floating-point comparisons be loose here. The values should match very closely.
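A hedged sketch of what the tight-tolerance comparison in the test could look like. The matrix entries below are placeholders (not real FVGQ20 output), `c.h_x_p` is an assumed accessor following the thread's naming, and the `rtol` is only a suggestion for "very close":

```julia
# Sketch only: placeholder numbers; replace with values saved from the
# old code for the beta-only gradient.
using Test

h_x_p_beta_ref = [0.123456789012345  -0.234567890123456;
                  0.345678901234567   0.456789012345678]

# Value from the current code (hypothetical accessor, as in the thread):
h_x_p_beta = c.h_x_p

# Keep the tolerance tight: these should agree to near machine precision,
# not just pass under a loose atol/rtol.
@test h_x_p_beta ≈ h_x_p_beta_ref rtol = 1e-10
```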