Closed: PerformanceCoder closed this issue 1 year ago
Hi!

I'm currently solving a sequence of optimization problems, where the solution of each problem is used as the initial guess for the next one. In some cases the initial guess is already accurate enough for the next problem, so zero optimization iterations are performed and the initial vector is returned unchanged. This causes ladder-like patterns when the solutions are plotted.

I understand that this is expected behaviour: if the approximation is already good, there is no need to improve it. But is there a way to force the optimizer to perform at least one iteration without checking the convergence criteria? Currently I can run with a limit of 1 iteration and a tolerance of 0, but maybe there is a better way to do it?

I would say that is probably the best way. It is stopping because it has converged according to your metric; the only way to avoid that is to tighten the tolerances. So it seems like you found the solution yourself :)
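For anyone landing here later, here is a minimal sketch of the workaround discussed above. The thread does not name the optimizer, so SciPy's `minimize` with BFGS is used purely as a stand-in, and the objective is hypothetical; the same two-call pattern should transfer to other libraries with equivalent options. Setting `gtol` to 0 makes the gradient-norm convergence test unsatisfiable at any non-stationary point, so with `maxiter=1` the solver is forced to take exactly one step; a second call with normal tolerances then continues from that iterate.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Hypothetical objective standing in for the "next problem"
    # in the sequence; not from the original thread.
    return np.sum((x - 1.0) ** 2)

# Solution of the previous problem, already very close to this
# problem's optimum, so default tolerances would stop at 0 iterations.
x_prev = np.array([1.0001, 0.9999])

# Step 1: force at least one iteration by making the convergence
# test unsatisfiable (gtol=0) and capping the iteration count at 1.
forced = minimize(objective, x_prev, method="BFGS",
                  options={"maxiter": 1, "gtol": 0.0})

# Step 2: resume from the forced iterate with normal tolerances.
result = minimize(objective, forced.x, method="BFGS",
                  options={"gtol": 1e-8})

print(result.x, forced.nit + result.nit)
```

The first call will report that it stopped on the iteration limit rather than on convergence; that is expected and harmless here, since the second call applies the real stopping criteria.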