Closed: tmigot closed this 1 year ago
Patch coverage: 97.11% and project coverage change: +0.14% 🎉
Comparison is base (3d8e3ea) 95.32% compared to head (a5b2d9d) 95.47%.
Connected to #165
When it works, the results are promising: https://github.com/JuliaSmoothOptimizers/ADNLPModels.jl/blob/benchmark/benchmark/2023-07-07_adnlpmodels_benchmark_hprod_optimized_nscal_1000_mono.png
For instance, the following fails:
using ADNLPModels, NLPModels, OptimizationProblems
nlp = OptimizationProblems.ADNLPProblems.clnlbeam(n = 1000, hprod_backend = ADNLPModels.ReverseDiffADHvprod);
n = nlp.meta.nvar
T = eltype(nlp.meta.x0)  # element type of the variables (Float64 by default)
v = [sin(T(i) / 10) for i = 1:n]
hprod(nlp, get_x0(nlp), v)  # Hessian-vector product at the initial point
This can straightforwardly be adapted to compute the sparse objective Hessian.
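As a minimal sketch of that adaptation (assuming the hessian_backend keyword and the SparseADHessian backend name; check both against the ADNLPModels release you use):
using ADNLPModels, NLPModels, OptimizationProblems
# Same problem, but with a sparse backend for the objective Hessian.
nlp = OptimizationProblems.ADNLPProblems.clnlbeam(n = 1000, hessian_backend = ADNLPModels.SparseADHessian)
H = hess(nlp, get_x0(nlp))  # sparse objective Hessian (lower triangle, wrapped as Symmetric)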
@jbcaillau @ocots @amontoison @BaptisteCbl
I added hprod for the Lagrangian Hessian and an optimized sparse Hessian. This is a clear improvement; see the benchmarks:
Test with this file: https://github.com/ocots/JumpVSADNLP/blob/master/bench.jl
Hessian of Lagrangian
ADNLP: 1.191 ms (157 allocations: 141.55 KiB)
JuMP: 60.402 μs (137 allocations: 24.08 KiB)
Jacobian of constraints
ADNLP: 10.509 μs (22 allocations: 20.70 KiB)
JuMP: 13.329 μs (15 allocations: 19.80 KiB)
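For reference, a minimal sketch of how such timings can be reproduced with BenchmarkTools (the problem and size here are illustrative, not necessarily those of bench.jl):
using ADNLPModels, NLPModels, OptimizationProblems, BenchmarkTools
nlp = OptimizationProblems.ADNLPProblems.clnlbeam(n = 1000)
x = get_x0(nlp)
y = ones(nlp.meta.ncon)  # multipliers for the Lagrangian Hessian
@btime hess_coord($nlp, $x, $y)  # nonzeros of the sparse Lagrangian Hessian
@btime jac_coord($nlp, $x)       # nonzeros of the sparse constraint Jacobian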
Seems to rock!
Great job!
New release please 🙏🙏🙏 (JuliaCon next Tuesday)
Let me add some documentation on the new features and improvements, and then I'll tag the new release (likely tomorrow).
Many thanks @tmigot for the timely update. Doing some more tests with @PierreMartinon (Goddard test case).
Amazing results @tmigot! Congratulations!
Thanks @jbcaillau! Our computation of the sparse Hessian is still 20x-30x slower than JuMP's. We should be able to improve it by exploiting the symmetry of the Hessian to limit the number of directional derivatives of the gradient.
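To illustrate the idea (a toy sketch, not the ADNLPModels implementation): each Hessian column is a directional derivative of the gradient, so without compression one gradient sweep per coordinate is needed, while symmetry lets compatible columns share a seed vector:
using ForwardDiff
f(x) = x[1]^2 + x[1] * x[2] + sin(x[3])
x = [1.0, 2.0, 0.5]
n = length(x)
# Column j of the Hessian: derivative of the gradient along the basis vector e_j.
hess_col(j) = ForwardDiff.derivative(t -> ForwardDiff.gradient(f, x .+ t .* (1:n .== j)), 0.0)
H = reduce(hcat, hess_col(j) for j in 1:n)  # n sweeps, no compression
@assert H ≈ H'  # symmetry: a coloring of compatible columns can cut the number of sweeps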
@amontoison thanks for the feedback. Probably some profiling to do on our side, as our objective and constraint evaluations might be far less efficient than the pure Julia / JuMP code written for the comparison.
PS: greetings from JuliaCon2023 with @ocots @gergaud ... and @tmigot!
Note: this is only for the objective-function hprod.