anriseth opened 6 years ago
Also for approximately 0? I find that odd. It tells you that the step takes you to something that is basically a local minimum, so should you just take that exact step and report success?
I think we have two different understandings of the values phi_0 and dphi_0.
dphi_0 is the value of vecdot(g(x), s), where g(x) is the gradient at the previous iteration x and s is the step direction.
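For concreteness, here is a minimal Julia sketch of that quantity (the values of x, g, and s are illustrative, and dot plays the role of vecdot here):

```julia
using LinearAlgebra

x = [1.0, 2.0]
g = [2.0, -1.0]     # gradient of the objective at the iterate x
s = -g              # candidate step direction (here: steepest descent)
dphi_0 = dot(g, s)  # directional derivative of phi(alpha) = f(x + alpha*s) at alpha = 0
dphi_0 < 0          # true: s is a descent direction at x
```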
I think all the line search algorithms except Static are built with the assumption that dphi_0 < 0, which means that they assume:
- x is not an extremizer.
- s is a descent direction.
Oh, I may have misunderstood you. When you say "basically a local minimum", do you mean of phi or of the objective f?
Sorry, it was a misunderstanding of what phi_0 and dphi_0 were!
Due to computational error, a theoretical descent direction can have dphi_0 equal to 0 or even positive. Maybe check whether dphi_0 is a small positive value, and if so just terminate the line search with step 0?
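Something like the following sketch of that proposal, where the tolerance and function name are illustrative assumptions rather than the package's API:

```julia
function maybe_zero_step(dphi_0::T) where {T<:AbstractFloat}
    # If dphi_0 is nonnegative but below a small tolerance, treat the
    # point as numerically stationary and stop with a zero step length.
    if 0 <= dphi_0 <= sqrt(eps(T))
        return zero(T)
    end
    return nothing  # otherwise let the line search proceed as usual
end
```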
What do you mean by theoretical?
As in using exact arithmetic.
In HagerZhang and MoreThuente we throw errors if the step direction is not a descent direction (that is, dphi(0) >= 0).
No such checks are made in BackTracking and StrongWolfe, and it seems like they just return the given step length. I think the algorithms assume a descent direction, so we should probably be consistent here and throw an error.
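A guard along these lines could be shared by BackTracking and StrongWolfe; this is an illustrative sketch, not the package's actual implementation:

```julia
function check_descent_direction(dphi_0)
    # Mirror the behaviour of HagerZhang and MoreThuente: refuse to run
    # the line search if the step direction is not a descent direction.
    dphi_0 < 0 || throw(ArgumentError(
        "the step direction is not a descent direction (dphi_0 = $dphi_0 >= 0)"))
    return nothing
end
```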
I think we should leave Static alone, as my intention with it is for more "advanced" optimizers to decide exactly what the step should be (as long as it produces finite function values).
Ref: #91