EnzymeAD / Enzyme.jl

Julia bindings for the Enzyme automatic differentiator
https://enzyme.mit.edu
MIT License

Gradient returns all zeros #1389

Closed · kiranshila closed this 5 months ago

kiranshila commented 5 months ago

I'm working on a relatively complex function, and I'm getting a gradient of all zeros in both Forward and Reverse mode. I think this is a bug, as ForwardDiff yields correct results. It might be related to all the in-place mutation, but I'm not sure.

Here's the code:

using Enzyme, LinearAlgebra, StaticArrays  # LinearAlgebra provides `I` and `mul!`

function input_matching_network_abcd(widths::AbstractVector{T1}, total_length::T2, freq::T3) where {T1,T2,T3}
    RT = promote_type(T1, T2, T3)
    CT = Complex{RT}
    tmp = zero(MMatrix{2,2,CT})
    acc = MMatrix{2,2,CT}(I)
    δ = total_length / length(widths)
    s = CT(2π * im * freq)
    for w in widths
        # Compute RLGC parameters (ss_rlgc is a helper defined elsewhere in my code)
        (r, l, g, c) = ss_rlgc(w, freq)
        # Compute transmission line parameters
        Z = r + s * l
        Y = g + s * c
        γ = sqrt(Z * Y)
        z0 = sqrt(Z / Y)
        # Build ABCD in-place
        θ = γ * δ
        sh = sinh(θ)
        ch = cosh(θ)
        abcd = @SMatrix [ch sh*z0; sh/z0 ch]
        # Accumulate ABCD
        acc .= mul!(tmp, acc, abcd)
    end
    # Return the ABCD
    SMatrix(acc)
end

f(ws) = abs2(input_matching_network_abcd(ws, 100.0f-3, 1.0f9)[1])

ws = fill(1.0f-3, 10)
Enzyme.gradient(Forward, f, ws)
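
For reference, the ForwardDiff comparison I mentioned above (assuming ForwardDiff.jl is installed) is just:

using ForwardDiff
ForwardDiff.gradient(f, ws)  # returns a nonzero gradient, unlike the Enzyme calls above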
kiranshila commented 5 months ago

I think this has to do with https://enzymead.github.io/Enzyme.jl/dev/faq/#Activity-of-temporary-storage, but my function allocates its temporary storage internally rather than accepting it as an argument, so I'm not sure whether it applies.
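
For anyone following along, here's a minimal sketch of the pattern that FAQ entry describes (not my actual code; poly_via_tmp and its arguments are made up for illustration): when a function mutates caller-provided scratch space, that scratch has to be passed as Duplicated so Enzyme has shadow memory to propagate derivatives through the mutation.

using Enzyme

# Hypothetical example: `tmp` is caller-provided scratch space that gets mutated.
function poly_via_tmp(x, tmp)
    tmp[1] = x * x           # intermediate result lives in the scratch buffer
    return tmp[1] * x        # overall: x^3
end

x = 3.0
tmp = zeros(1)
dtmp = zeros(1)              # shadow buffer for the scratch space

# Passing tmp as Const would drop the derivative flowing through the mutation;
# Duplicated gives Enzyme shadow memory to track it.
res = Enzyme.autodiff(Reverse, poly_via_tmp, Active, Active(x), Duplicated(tmp, dtmp))
res[1][1]                    # d(x^3)/dx at x = 3.0, i.e. 27.0

In my function, though, tmp and acc are allocated inside the body rather than passed in, which is why I'm not sure the FAQ entry applies.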

wsmoses commented 5 months ago

@kiranshila what version of Enzyme/Julia are you on, and what would be the correct/intended output here?

kiranshila commented 5 months ago

My mistake, I must have had Julia in a weird state: I get the correct output in a fresh session. Apologies!