Set of optimization algorithms.
General usage pattern:

```python
from optalg.<subpackage> import <algo-name>

def f(x):
    return <your function value>  # objective; argument x_i is accessed as x[i]

optimizer = <algo-name>(params...)
res = optimizer.optimize(f, <initial state>)  # optimization result
```
For methods that require gradient and Hessian calculations, use autograd.numpy instead of numpy to define the objective function.
For example:

```python
import numpy as np
from autograd.numpy import sin  # autograd-aware sin, so gradients can be computed

from optalg.line_search import ArmijoBacktracking
from optalg.unconstrained.descent import GradientDescent
from optalg.stop_criteria import GradientNormCriterion

def f(x):
    return x[0]**2 + sin(x[1]**2)

gnCriterion = GradientNormCriterion(10**-3)  # stop when the gradient norm drops below 1e-3
step_opt = ArmijoBacktracking(1, 0.5)  # backtracking line search for the step size
optimizer = GradientDescent(gnCriterion, step_opt)

res = optimizer.optimize(f, np.array([-3.0, 1.0]))  # float start point (autograd cannot differentiate integer arrays)
res.x  # optimum
```
Methods that do not require differentiability of the objective function; the search direction is found by other, derivative-free search methods.
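As an illustration of the idea (not this package's API), a minimal derivative-free coordinate search probes each axis with a trial step and shrinks the step when no probe improves the objective:

```python
import numpy as np

def coordinate_search(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    """Illustrative zero-order method: probe +/- step along each axis,
    move to any improving point, shrink the step when a full sweep fails."""
    x = np.asarray(x0, dtype=float)
    while step > tol:
        improved = False
        for i in range(x.size):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                if f(trial) < f(x):
                    x, improved = trial, True
        if not improved:
            step *= shrink
    return x
```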
Methods that descend toward a minimum along a gradient-like direction.
Gradient descent - plain descent along the negative gradient
Conjugate gradients descent - the descent direction is the sum of the gradient at the current point and the weighted direction from the previous iteration (see the sketch below). Several variations of the weight calculation are available.
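For reference, the update has the form d_k = -g_k + beta_k * d_{k-1}; the variations differ only in how the weight beta_k is computed. The Fletcher-Reeves rule below is one standard choice, shown for illustration (the package's exact list of variations is not reproduced here):

```python
import numpy as np

def conjugate_direction(g_new, g_old, d_old):
    """d_k = -g_k + beta_k * d_{k-1}, with the Fletcher-Reeves weight
    beta_k = (g_k . g_k) / (g_{k-1} . g_{k-1})."""
    beta = np.dot(g_new, g_new) / np.dot(g_old, g_old)
    return -g_new + beta * d_old
```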
Second-order descent algorithms
Newton - the descent direction is the product of the inverse Hessian and the gradient.
Quasi-Newton - the inverse Hessian is replaced with an approximation that is cheaper to compute.
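For intuition, a bare Newton direction solves the linear system H d = -g rather than inverting H explicitly; quasi-Newton methods substitute an approximation of the inverse Hessian for this solve (a sketch, not the package's internals):

```python
import numpy as np

def newton_direction(hessian, gradient):
    """Newton descent direction d, obtained by solving H d = -g.
    Quasi-Newton methods (e.g. BFGS-style updates) replace H^-1
    with an approximation maintained across iterations."""
    return np.linalg.solve(hessian, -gradient)
```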
Methods based on natural evolution. On each iteration these methods select the "best" individuals from the population, reproduce a new generation, and replace the previous individuals (a generic sketch follows below).
Artificial immune system
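A generic loop of this kind, sketched in plain Python (names and parameters here are illustrative, not the package's API; `pop` is an (n, dim) array of individuals):

```python
import numpy as np

def evolve(f, pop, n_best=10, sigma=0.1, generations=100):
    """Generic evolutionary minimization: rank the population by fitness,
    keep the best individuals, refill with mutated copies of them."""
    rng = np.random.default_rng()
    for _ in range(generations):
        ranked = pop[np.argsort([f(p) for p in pop])]
        best = ranked[:n_best]                                    # selection
        parents = best[rng.integers(0, n_best, len(pop) - n_best)]
        children = parents + rng.normal(0, sigma, parents.shape)  # mutation
        pop = np.vstack([best, children])                         # replacement
    return pop[np.argmin([f(p) for p in pop])]
```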
A set of methods for constrained optimization that use a penalty function to represent the constraints.
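The standard trick, sketched under the usual quadratic-penalty assumption (illustrative, not the package's API): replace the constrained problem with an unconstrained one by adding mu * max(0, g_i(x))^2 for each inequality constraint g_i(x) <= 0, then minimize the wrapped objective with any unconstrained method:

```python
def quadratic_penalty(f, constraints, mu):
    """Wrap f so that each violated constraint g_i(x) <= 0 adds
    mu * max(0, g_i(x))**2 to the objective; mu is typically
    increased over successive outer iterations."""
    def penalized(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return f(x) + mu * violation
    return penalized
```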
On each step, the descent direction is multiplied by a step size. Several step size calculation (line search) methods are available; one example is sketched below.
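For example, Armijo backtracking (used as `ArmijoBacktracking(1, 0.5)` in the example above, where the arguments are presumably the initial step and the reduction factor) shrinks the step until a sufficient-decrease condition holds. A minimal sketch, assuming the standard Armijo rule:

```python
import numpy as np

def armijo_step(f, x, g, d, alpha0=1.0, factor=0.5, c=1e-4):
    """Backtracking line search: shrink alpha until
    f(x + alpha*d) <= f(x) + c * alpha * (g . d).
    Assumes d is a descent direction, i.e. g . d < 0."""
    alpha = alpha0
    while f(x + alpha * d) > f(x) + c * alpha * np.dot(g, d):
        alpha *= factor
    return alpha
```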