
OptAlg

Set of optimization algorithms.

Usage

from optalg.<subpackage> import <algo-name>

def f(x):
  return <your function value>  # where argument x_i = x[i]

optimizer = <algo-name>(params...)
res = optimizer.optimize(f, [init_state])  # optimization result

For methods that require gradient and Hessian calculations, use autograd.numpy instead of numpy to define the objective function. For example:

import numpy as np
from autograd.numpy import sin
from optalg.line_search import ArmijoBacktracking
from optalg.unconstrained.descent import GradientDescent
from optalg.stop_criteria import GradientNormCriterion

def f(x):
  return x[0]**2 + sin(x[1]**2)

gnCriterion = GradientNormCriterion(10**-3)  # stop when the gradient norm drops below 1e-3
step_opt = ArmijoBacktracking(1, 0.5)  # Armijo backtracking line search for the step size
optimizer = GradientDescent(gnCriterion, step_opt)

res = optimizer.optimize(f, np.array([-3, 1]))
res.x  # optimum

Unconstrained algorithms

Gradient free

Methods that do not require differentiability of the objective function. Search directions are computed with auxiliary direct-search procedures rather than derivatives.
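As an illustration of the idea only (not this package's API), below is a minimal pattern-search sketch in the spirit of Hooke-Jeeves that relies purely on objective evaluations; all names and parameters are made up for the example:

import numpy as np

def pattern_search(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
    # Exploratory moves along each coordinate; shrink the step when nothing improves f
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                f_trial = f(trial)
                if f_trial < fx:
                    x, fx = trial, f_trial
                    improved = True
                    break
        if not improved:
            step *= shrink  # no improvement: refine the mesh
            if step < tol:
                break
    return x

x_min = pattern_search(lambda x: x[0]**2 + (x[1] - 1)**2, [3.0, -2.0])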

Gradient

Methods that descend towards a minimum along a gradient-like direction.
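The core update is x_{k+1} = x_k - α_k ∇f(x_k). A minimal fixed-step sketch using autograd for the gradient (illustrative only, not how this package implements it):

import autograd.numpy as np
from autograd import grad

def gradient_descent(f, x0, alpha=0.1, tol=1e-6, max_iter=1000):
    g = grad(f)  # autograd builds the gradient function
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        gx = g(x)
        if np.linalg.norm(gx) < tol:  # gradient-norm stop criterion
            break
        x = x - alpha * gx  # step against the gradient
    return x

x_min = gradient_descent(lambda x: x[0]**2 + np.sin(x[1]**2), np.array([-3.0, 1.0]))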

Newton

Second-order descent algorithms that use curvature (Hessian) information.
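For example, the pure Newton step solves H(x_k) d = -∇f(x_k) and sets x_{k+1} = x_k + d. A small sketch with autograd's grad and hessian (again illustrative, not the package's implementation):

import autograd.numpy as np
from autograd import grad, hessian

def newton(f, x0, tol=1e-8, max_iter=100):
    g, H = grad(f), hessian(f)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        gx = g(x)
        if np.linalg.norm(gx) < tol:
            break
        d = np.linalg.solve(H(x), -gx)  # Newton direction
        x = x + d
    return x

x_min = newton(lambda x: (x[0] - 1)**2 + 10 * (x[1] + 2)**2, np.array([5.0, 5.0]))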

Evolutionary

Methods inspired by natural evolution. On each iteration the algorithm selects the "best" individuals from the population, reproduces a new generation and replaces the previous individuals.
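A simple evolution-strategy style sketch of this select/reproduce/replace loop (the function and parameters are illustrative, not the package's classes):

import numpy as np

def evolve(f, dim, pop_size=20, n_parents=5, sigma=0.3, generations=200, rng=None):
    rng = rng or np.random.default_rng(0)
    pop = rng.normal(size=(pop_size, dim))  # initial random population
    for _ in range(generations):
        fitness = np.apply_along_axis(f, 1, pop)
        parents = pop[np.argsort(fitness)[:n_parents]]  # select the best individuals
        # reproduce: copy random parents and add Gaussian mutation
        children = parents[rng.integers(n_parents, size=pop_size)] \
                   + sigma * rng.normal(size=(pop_size, dim))
        pop = children  # replace the previous generation
    return pop[np.argmin(np.apply_along_axis(f, 1, pop))]

best = evolve(lambda x: np.sum(x**2), dim=2)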

Immune

Artificial immune system algorithms, such as clonal selection (CLONALG).
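A rough sketch of the clonal selection idea (clone the best candidates, mutate clones more strongly when their affinity is worse, keep the best); all names and parameters below are illustrative, not the package's API:

import numpy as np

def clonalg(f, dim, pop_size=20, n_select=5, n_clones=5, beta=0.5,
            generations=200, rng=None):
    rng = rng or np.random.default_rng(0)
    pop = rng.uniform(-5, 5, size=(pop_size, dim))
    for _ in range(generations):
        fitness = np.apply_along_axis(f, 1, pop)
        order = np.argsort(fitness)  # lower objective value -> higher affinity
        selected = pop[order[:n_select]]
        clones = np.repeat(selected, n_clones, axis=0)
        # hypermutation: mutation strength grows with the rank (worse affinity -> larger mutation)
        ranks = np.repeat(np.arange(1, n_select + 1), n_clones)[:, None]
        clones = clones + beta * ranks * rng.normal(size=clones.shape) / n_select
        candidates = np.vstack([pop, clones])
        cand_fit = np.apply_along_axis(f, 1, candidates)
        pop = candidates[np.argsort(cand_fit)[:pop_size]]  # keep the best
    return pop[0]

best = clonalg(lambda x: np.sum(x**2), dim=2)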

Constrained algorithms

Penalty

A set of methods for constrained optimization that represent the constraints with a penalty function added to the objective.
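For example, an inequality constraint g(x) <= 0 can be folded into the objective as f(x) + mu * max(0, g(x))^2 with a growing penalty weight mu. A hedged, self-contained sketch (illustrative only, not the package's classes):

import autograd.numpy as np
from autograd import grad

# Constrained problem: minimize f(x) subject to g(x) <= 0
f = lambda x: (x[0] - 2)**2 + (x[1] - 2)**2
g = lambda x: x[0] + x[1] - 2  # feasible set: x0 + x1 <= 2

def penalty_method(f, g, x0, mu=1.0, growth=10.0, outer=5, inner=2000):
    x = np.asarray(x0, dtype=float)
    for _ in range(outer):
        penalized = lambda x, mu=mu: f(x) + mu * np.maximum(0.0, g(x))**2
        step = grad(penalized)
        alpha = 0.1 / mu  # smaller steps as the penalty term becomes stiffer
        for _ in range(inner):  # crude unconstrained inner solve
            x = x - alpha * step(x)
        mu *= growth  # tighten the penalty on each outer iteration
    return x

x_opt = penalty_method(f, g, np.array([0.0, 0.0]))  # approaches the constrained optimum (1, 1)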

Line search

On each step the descent direction is multiplied by a step size. Available step size calculation methods: