albertomercurio opened this issue 1 week ago
This is worrisome. @DhairyaLGandhi do you know what could cause this one?
I'm taking a look
Looks like when passing in a ComponentVector, the DiffEqCallbacks machinery doesn't actually run any of the internal recursive_* methods properly, and therefore doesn't update the accumulation_cache in https://github.com/SciML/DiffEqCallbacks.jl/blob/735ea17a5dc9618e7309b30db4fc3e85c416f0dc/src/integrating_sum.jl#L52. This is likely due to all structs being assumed to be functors in Functors@0.5.
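A minimal sketch of that behaviour change, using a throwaway struct rather than the actual types involved:

```julia
# Sketch of the Functors@0.4 -> Functors@0.5 default change, on a throwaway struct.
using Functors

struct Foo
    a
    b
end

# Functors@0.4: plain structs are leaves unless opted in with `@functor Foo`,
#               so Functors.children(Foo(1, 2)) === ()
# Functors@0.5: every struct is treated as a functor by default,
#               so Functors.children(Foo(1, 2)) == (a = 1, b = 2)
Functors.children(Foo(1, 2))
```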
Also important to note that https://github.com/SciML/DiffEqCallbacks.jl/blob/735ea17a5dc9618e7309b30db4fc3e85c416f0dc/src/integrating_sum.jl#L39 returns (p1 = nothing, p2 = nothing, p3 = nothing, p4 = nothing).
Pinning Functors back to v0.4 returns the gradients as expected, but we need to check the values here:
```julia
julia> Zygote.gradient(my_f, p2)
((p1 = -8.15610781018363, p2 = -1.7862859982677435, p3 = 7.645628735586522, p4 = -12.55380170354163),)
```
This change in behaviour was introduced in Functors@0.5.
Likely also related to https://github.com/SciML/DiffEqCallbacks.jl/issues/239
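As a quick way to check those gradient values, a minimal sketch (assuming the `my_f` and `p2` from the output above, and FiniteDiff.jl as the reference):

```julia
# Hypothetical check of the Zygote gradient against central finite differences;
# `my_f` and `p2` are the function and ComponentArray from the snippet above.
using FiniteDiff, Zygote

fd_grad = FiniteDiff.finite_difference_gradient(my_f, p2)   # ComponentVector
zy_grad = Zygote.gradient(my_f, p2)[1]                      # NamedTuple (p1 = ..., ...)

# The two should agree to roughly finite-difference accuracy.
maximum(abs, collect(zy_grad) .- collect(fd_grad))
```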
@avik-pal did you end up looking into this?
Describe the bug 🐞
Computing the gradient with complex ComponentArrays as parameters returns a null (all-zero) vector.
Expected behavior
A non-null gradient result.
Minimal Reproducible Example 👇
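The MRE code itself is not included in this extract; the following is a minimal, real-valued sketch of the kind of setup being described (the actual report involves complex-valued quantities), assuming a GaussAdjoint-style sensitivity method, which relies on the integrating-sum callback discussed above:

```julia
# Hypothetical reconstruction of the reported setup (the original MRE is not shown here).
using OrdinaryDiffEq, SciMLSensitivity, ComponentArrays, Zygote

function rhs!(du, u, p, t)
    du[1] = p[1] * u[1] + p[2] * u[2]
    du[2] = p[3] * u[1] + p[4] * u[2]
    return nothing
end

u0 = [1.0, 0.0]
p = [0.1, 0.2, 0.3, 0.4]

function my_f(ps)
    prob = ODEProblem(rhs!, u0, (0.0, 1.0), ps)
    sol = solve(prob, Tsit5(); sensealg = GaussAdjoint())
    return sum(abs2, sol.u[end])
end

Zygote.gradient(my_f, p)  # plain Vector parameters: non-zero gradient
```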
But if I use

```julia
param = ComponentArray((p1 = p[1], p2 = p[2], p3 = p[3], p4 = p[4],))
```

instead, I get a null vector. The same doesn't happen for a Float64 initial vector.

Environment (please complete the following information):
- `using Pkg; Pkg.status()`
- `versioninfo()`