JuliaDiff / ReverseDiff.jl

Reverse Mode Automatic Differentiation for Julia

Getting an error when calculating the gradient for an LSTM #213

Open Chengfeng-Jia opened 1 year ago

Chengfeng-Jia commented 1 year ago

Hi, I built a network. ReverseDiff.gradient works well for a Dense layer. However, when I switch to an LSTM, ReverseDiff.gradient(loss_mean, params) throws an error: MethodError: no method matching (::Flux.LSTMCell{ReverseDiff.TrackedArray{Float64, Float64, 2, Matrix{Float64}, Matrix{Float64}}, ReverseDiff.TrackedArray{Float64, Float64, 2, Matrix{Float64}, Matrix{Float64}}...
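
For reference, here is a minimal sketch of the kind of setup that can hit this, assuming the model's parameters are flattened with Flux.destructure so that ReverseDiff.gradient can differentiate with respect to a plain vector. The layer sizes, the input, and loss_mean are placeholders, not the original code:

```julia
using Flux, ReverseDiff

# Hypothetical reproduction: flatten the model's parameters into one vector
# so ReverseDiff.gradient can differentiate with respect to them.
model = Flux.LSTM(3 => 5)            # swapping in Dense(3 => 5) reportedly works
ps, re = Flux.destructure(model)

x = rand(Float32, 3, 7)              # placeholder input: (features, batch)

# Placeholder loss; re(p) rebuilds the model from the (tracked) parameter vector.
loss_mean(p) = sum(abs2, re(p)(x)) / length(x)

ReverseDiff.gradient(loss_mean, ps)  # Dense: fine; LSTM: MethodError on LSTMCell
```

Judging from the MethodError itself, the rebuilt LSTMCell ends up parameterized with ReverseDiff.TrackedArray fields, and no call method exists for that combination of field types, which is consistent with the error the Dense layer does not trigger.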