Closed pkofod closed 5 months ago
Attention: 5 lines in your changes are missing coverage. Please review.
Comparison is base (`1a649e8`) 84.73% compared to head (`ac1a8a2`) 84.90%.
| Files | Patch % | Lines |
|---|---|---|
| src/multivariate/solvers/first_order/adam.jl | 91.89% | 3 Missing :warning: |
| src/multivariate/solvers/first_order/adamax.jl | 94.28% | 2 Missing :warning: |
Of course, the results of Adam and AdaMax will also depend on the hyperparameters (learning rate, momentum coefficients, etc.), so:
```jl
julia> result = optimize(rosenbrock, zeros(2), Optim.Adam(alpha=0.1), Optim.Options(iterations=100000))
 * Status: success

 * Candidate solution
    Final objective value:     4.408001e-16

 * Found with
    Algorithm:     Adam

 * Convergence measures
    |x - x'|               = 1.00e+00 ≰ 0.0e+00
    |x - x'|/|x'|          = 1.00e+00 ≰ 0.0e+00
    |f(x) - f(x')|         = NaN ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = NaN ≰ 0.0e+00
    |g(x)|                 = 9.91e-09 ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    2818
    f(x) calls:    2819
    ∇f(x) calls:   2819
```
Fixes #1012
I don't use Adam and AdaMax myself, but I suppose the slow convergence of Adam from `zeros(2)` is sort of expected sometimes? Otherwise it may be good to compare against another implementation.
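One cheap way to do that comparison, assuming `Optim.Adam` implements the standard Adam update rule, is a hand-rolled reference implementation on the same problem. The sketch below is plain Python, not Optim.jl's code; `beta1=0.9`, `beta2=0.999`, and `eps=1e-8` are the usual Adam defaults and are assumptions here, with `alpha=0.1` and the `zeros(2)` start matching the Julia call above.

```python
import math

def rosenbrock_grad(x, y):
    # Gradient of f(x, y) = (1 - x)^2 + 100 (y - x^2)^2
    dfdx = -2.0 * (1.0 - x) - 400.0 * x * (y - x * x)
    dfdy = 200.0 * (y - x * x)
    return dfdx, dfdy

def adam(alpha=0.1, beta1=0.9, beta2=0.999, eps=1e-8, iters=5000):
    x = [0.0, 0.0]   # same starting point as zeros(2)
    m = [0.0, 0.0]   # first-moment (mean) estimate
    v = [0.0, 0.0]   # second-moment (uncentered variance) estimate
    for t in range(1, iters + 1):
        g = rosenbrock_grad(x[0], x[1])
        for i in range(2):
            m[i] = beta1 * m[i] + (1 - beta1) * g[i]
            v[i] = beta2 * v[i] + (1 - beta2) * g[i] ** 2
            mhat = m[i] / (1 - beta1 ** t)   # bias correction
            vhat = v[i] / (1 - beta2 ** t)
            x[i] -= alpha * mhat / (math.sqrt(vhat) + eps)
    return x

x = adam()
print(x)  # if Adam behaves as in the Julia run above, this lands near [1, 1]
```

If this reference run also needs a few thousand iterations to get close to `(1, 1)` from the origin, the slow convergence would look like a property of Adam on Rosenbrock rather than of the Optim.jl implementation.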