tmigot closed this pull request 1 year ago.
Base: 99.44% // Head: 99.45% // Increases project coverage by +0.01% :tada:

Coverage data is based on head (9e87cd7) compared to base (6e45688). Patch coverage: 100.00% of modified lines in the pull request are covered.

:umbrella: View full report at Codecov.
I also added complementary tests with problems from OptimizationProblems.jl:
```julia
using ADNLPModels, LinearAlgebra, NLPModels, NLPModelsJuMP, OptimizationProblems

for prob in intersect(names(ADNLPProblems), names(PureJuMP))
  if prob == :hs61
    continue
  end
  if occursin("tetra", string(prob)) || occursin("triangle", string(prob))
    continue # too slow
  end
  nlp = eval(Meta.parse("ADNLPProblems.$(prob)()"))
  model = MathOptNLPModel(eval(Meta.parse("PureJuMP.$(prob)()")))
  if nlp.meta.ncon > 0
    x = nlp.meta.x0
    test = Matrix(jac(nlp, x)) ≈ Matrix(jac(model, x))
    if !test
      println("$prob : $(nlp.meta.nvar), $(nlp.meta.ncon)")
      @show norm(Matrix(jac(nlp, x)) - Matrix(jac(model, x)))
    end
  end
end
```
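For reference, the same cross-check can be extended beyond Jacobians. The sketch below is an assumption on my part (not part of this PR's test suite): it compares dense objective Hessians between the ADNLPModels and JuMP backends on unconstrained problems, using `hess` from the NLPModels API.

```julia
# Hedged sketch: extend the comparison above to objective Hessians.
# Assumes the same ADNLPProblems/PureJuMP problem pairs; restricted to
# unconstrained problems so that `hess(nlp, x)` needs no multipliers.
using ADNLPModels, LinearAlgebra, NLPModels, NLPModelsJuMP, OptimizationProblems

for prob in intersect(names(ADNLPProblems), names(PureJuMP))
  nlp = eval(Meta.parse("ADNLPProblems.$(prob)()"))
  model = MathOptNLPModel(eval(Meta.parse("PureJuMP.$(prob)()")))
  if nlp.meta.ncon == 0  # unconstrained only
    x = nlp.meta.x0
    if !(Matrix(hess(nlp, x)) ≈ Matrix(hess(model, x)))
      println("$prob : $(nlp.meta.nvar) variables")
      @show norm(Matrix(hess(nlp, x)) - Matrix(hess(model, x)))
    end
  end
end
```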
and all the tests passed (after https://github.com/JuliaSmoothOptimizers/OptimizationProblems.jl/pull/223).
We really have to update the config of that dratted probot. I didn't even have a chance to look at this PR!
Sorry about that. If you have any feedback on this, I can open a new PR to address it.
https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl/pull/91 should help with the bot.
I will split this into several PRs as it is a large change:
- `c` in backend constructors
- `get_nln_nnzj` and `get_nln_nnzh` functions