optimagic is a Python package for numerical optimization. It provides a unified interface to optimizers from SciPy, NlOpt, and other packages. optimagic's `minimize` function works just like SciPy's, so you don't have to adjust your code; you simply get more optimizers for free. On top of that, you get diagnostic tools, parallel numerical derivatives, and more.
`tao_pounders` has three gradient-based convergence criteria, all of which have default values. They are processed in the following code snippet:
```python
# Set user defined convergence tests. Beware that specifying multiple tests could
# overwrite others or lead to unclear behavior.
if stopping_maxiter is not None:
    tao.setConvergenceTest(functools.partial(_max_iters, stopping_maxiter))
elif convergence_gtol_scaled is False and convergence_gtol_abs is False:
    tao.setConvergenceTest(functools.partial(_grtol_conv, convergence_gtol_rel))
elif convergence_gtol_rel is False and convergence_gtol_scaled is False:
    tao.setConvergenceTest(functools.partial(_gatol_conv, convergence_gtol_abs))
elif convergence_gtol_scaled is False:
    tao.setConvergenceTest(
        functools.partial(
            _grtol_gatol_conv,
            convergence_gtol_rel,
            convergence_gtol_abs,
        )
    )
```
There are two potential problems:

1. The `if` conditions don't use `is True` but just a check for truthiness. This could lead to unexpected behavior.
2. It seems that by default all of the convergence criteria are active, even though the comment warns that specifying multiple tests could overwrite others or lead to unclear behavior.