ggebbie / TMI.jl

Total Matrix Intercomparison for Julia
MIT License

ForwardDiff.gradient fails for global map #17

Open ggebbie opened 2 years ago

ggebbie commented 2 years ago

The global map relies upon an optimization that uses an explicitly supplied gradient. It would be nice to check that the gradient is accurate using automatic differentiation, such as ForwardDiff.jl or Zygote.jl. The TMI.jl code is not sufficiently generic to be used with ForwardDiff: it appears that the TMI.jl routine cannot handle a ForwardDiff container (a vector of Dual numbers) for the control vector. Other issues, related to initialization with a generic real number type, appear to be fixed.
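
For reference, a minimal sketch of the genericity problem (the function names and bodies below are placeholders, not the actual TMI.jl routines): ForwardDiff calls the function with a vector of Dual numbers, so any hard-coded `Vector{Float64}` annotation or `zeros(Float64, ...)` allocation breaks it.

```julia
using ForwardDiff

# Hypothetical stand-ins for the TMI cost function, not the real routines.
misfit_restrictive(uvec::Vector{Float64}) = sum(abs2, uvec)   # rejects Dual vectors

function misfit_generic(uvec::AbstractVector{T}) where T <: Real
    dtmp = zeros(T, length(uvec))    # allocate with the input's element type, not Float64
    dtmp .= uvec .^ 2                # placeholder for the real computation
    return sum(dtmp)
end

ForwardDiff.gradient(misfit_generic, rand(5))          # works
# ForwardDiff.gradient(misfit_restrictive, rand(5))    # MethodError: no method matching misfit_restrictive(::Vector{Dual...})
```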

ggebbie commented 2 years ago

After eliminating unnecessary type annotations in the function and separating the constant parameters in the argument list, we are left with an error related to the matrix left divide and the sparse LU decomposition.
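
A minimal reproducer of that error (a hypothetical 5×5 system, not the TMI matrix): even with the constants `A` and `y` kept out of the differentiated argument via a closure, the sparse left divide still routes through the UMFPACK LU factorization, which has no method for Dual-valued right-hand sides.

```julia
using ForwardDiff, SparseArrays, LinearAlgebra

n = 5
A = spdiagm(0 => fill(2.0, n), 1 => fill(-1.0, n - 1), -1 => fill(-0.5, n - 1))  # nonsymmetric, so `\` uses lu(A)
y = rand(n)

# Constant parameters stay in the closure; only u is differentiated.
J(u, A, y) = sum(abs2, (A \ u) .- y)
fg = u -> J(u, A, y)

ForwardDiff.gradient(fg, rand(n))   # fails with the ldiv!/UmfpackLU MethodError reported below
```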

ggebbie commented 2 years ago

Note that ReverseDiff.jl, which is the better choice because of the large number of inputs (~2806) and small number of outputs (1), also has trouble with the sparse matrix left divide.
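
For a scalar cost of ~2806 controls, the reverse-mode call would look something like the sketch below (the cost function is a placeholder). ReverseDiff needs one reverse sweep per output, so a single output is cheap regardless of the number of inputs, but the same sparse left-divide problem appears once the real cost function is substituted.

```julia
using ReverseDiff

J(u) = sum(abs2, u .- 1.0)          # placeholder for the TMI misfit
u0 = zeros(2806)
g = ReverseDiff.gradient(J, u0)     # one output => one reverse sweep
```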

One workaround is the ILUZero.jl package. When differentiating a function with a sparse matrix left divide given by ILUZero.jl, I get the following error:

"""
ERROR: MethodError: no method matching adjoint(::ILU0Precon{Float64, Int64})
Closest candidates are:
  adjoint(::FillArrays.Fill{T, 2, Axes} where Axes) where T at /home/gebbie/.julia/packages/FillArrays/Vzxer/src/fillalgebra.jl:16
  adjoint(::LinearAlgebra.Transpose{var"#s46", var"#s45"} where {var"#s46"<:Real, var"#s45"<:Union{StaticArrays.StaticVector{N, T} where {N, T}, StaticArrays.StaticMatrix{N, M, T} where {N, M, T}}}) at /home/gebbie/.julia/packages/StaticArrays/OWJK7/src/linalg.jl:78
  adjoint(::LinearAlgebra.Transpose{var"#s814", S} where {var"#s814"<:Real, S}) at /buildworker/worker/package_linux64/build/usr/share/julia/stdlib/v1.6/LinearAlgebra/src/adjtrans.jl:166
  ...
Stacktrace:
 [1] misfit_gridded_data(uvec::Vector{Float64}, Alu::ILU0Precon{Float64, Int64}, y::Array{Float64, 3}, d::Array{Float64, 3}, Wⁱ::LinearAlgebra.Diagonal{Float64, Vector{Float64}}, wet::BitArray{3})
   @ TMI ~/projects/TMI.jl/src/TMI.jl:988
 [2] fg(x::Vector{Float64})
   @ Main ./REPL[41]:1
 [3] top-level scope
   @ REPL[43]:1
"""
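
For context, a sketch of how the ILUZero.jl workaround enters (assuming ILUZero's exported `ilu0` and the three-argument `ldiv!` solve it provides; the matrix here is hypothetical). Note that ILU(0) is an incomplete factorization, so the solve is only approximate, and the adjoint solve is the piece that is missing.

```julia
using ILUZero, SparseArrays, LinearAlgebra

n = 5
A = spdiagm(0 => fill(2.0, n), 1 => fill(-1.0, n - 1), -1 => fill(-0.5, n - 1))
b = rand(n)

Alu = ilu0(A)            # ILU0Precon{Float64, Int64}, used in place of lu(A)
x = similar(b)
ldiv!(x, Alu, b)         # forward (approximate) solve is defined ...
# Alu' \ b               # ... but adjoint(::ILU0Precon) is not, hence the MethodError above
```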

ggebbie commented 2 years ago

Got rid of the adjoint call by defining a function that computes just the cost function value, but ForwardDiff.gradient now fails with:

"""
ERROR: MethodError: no method matching ldiv!(::SuiteSparse.UMFPACK.UmfpackLU{Float64, Int64}, ::Vector{ForwardDiff.Dual{ForwardDiff.Tag{typeof(f_diffable), Float64}, Float64, 12}})
"""
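
One possible way around the missing `ldiv!` method (a sketch only, not tested against TMI itself): the solve is linear in its right-hand side, so a Dual-valued right-hand side can be split into its value and partials, each solved with the existing Float64 factorization, and reassembled into Duals. The helper name `dualsolve` and the small test system are hypothetical; inside the cost function it would stand in for `Alu \ rhs`.

```julia
using ForwardDiff, SparseArrays, LinearAlgebra

# Hypothetical helper: push Dual partials through a Float64 factorization
# by exploiting linearity of the solve in the right-hand side.
function dualsolve(Alu, b::Vector{ForwardDiff.Dual{T,V,N}}) where {T,V,N}
    bval  = ForwardDiff.value.(b)                        # values of the rhs
    xval  = Alu \ bval                                   # one solve for the values
    bpart = reduce(hcat, [ForwardDiff.partials.(b, i) for i in 1:N])
    xpart = Alu \ bpart                                  # one solve per partial direction
    return [ForwardDiff.Dual{T}(xval[i],
                ForwardDiff.Partials(ntuple(j -> xpart[i, j], N)))
            for i in eachindex(xval)]
end
dualsolve(Alu, b::Vector{<:Real}) = Alu \ b              # pass-through for plain numbers

# Tiny check on a hypothetical 5×5 system (not the TMI matrix):
n = 5
A = spdiagm(0 => fill(2.0, n), 1 => fill(-1.0, n - 1), -1 => fill(-0.5, n - 1))
Alu = lu(A)
J(u) = sum(abs2, dualsolve(Alu, 2 .* u))     # rhs depends on the controls
ForwardDiff.gradient(J, rand(n))
```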