Closed — yenicelik closed this issue 1 year ago
Hi. No it is not missing but definitely not very well documented.
There is an example on this page where the objective and the constraint are defined together:

```rust
use argmin::core::{CostFunction, Error};

struct ParaboloidProblem;

impl CostFunction for ParaboloidProblem {
    type Param = Vec<f64>;
    type Output = Vec<f64>;

    // Minimize 10*(x0+1)^2 + x1^2 subject to x0 >= 0
    fn cost(&self, x: &Self::Param) -> Result<Self::Output, Error> {
        Ok(vec![10. * (x[0] + 1.).powf(2.) + x[1].powf(2.), x[0]])
    }
}
```
As you may know, the argmin framework does not handle constrained optimization, but fortunately I can implement it by defining the output of the cost function as a vector: the first component is the actual cost value, and the remaining components are constraint values expected to be positive at the end of the optimization. In this example we minimize subject to one constraint (x0 >= 0), hence the presence of x[0] as the second component of the cost function's output vector.
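To make the encoding concrete, here is a minimal dependency-free sketch of the same convention (the function name `paraboloid` and the standalone form are illustrative, not the crate's API): the first element of the returned vector is the objective, and each remaining element is a constraint value that should end up non-negative.

```rust
// Same objective/constraint encoding as the argmin CostFunction above,
// written as a plain function for illustration:
//   out[0] = objective 10*(x0+1)^2 + x1^2
//   out[1] = constraint value, feasible when x0 >= 0
fn paraboloid(x: &[f64]) -> Vec<f64> {
    vec![10. * (x[0] + 1.).powi(2) + x[1].powi(2), x[0]]
}

fn main() {
    // At x = [0, 0]: objective = 10*(0+1)^2 + 0^2 = 10,
    // constraint value = 0 (on the feasibility boundary).
    let out = paraboloid(&[0.0, 0.0]);
    println!("objective = {}, constraint = {}", out[0], out[1]);
    assert!((out[0] - 10.0).abs() < 1e-12);
    assert!(out[1] >= 0.0);
}
```

A solver following this convention treats `out[1..]` as inequality constraints `g_i(x) >= 0`; only `out[0]` is minimized.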
Oh nice, I see. So the argmin package supports multi-variable objectives? Also, to what extent is the solution guaranteed to satisfy the constraints? I suppose the traditional cobyla package supports constraints "natively" (by traversing the gradient projection)?
You would have to check the argmin solvers, but I am not sure any of them handles multi-objective optimization. In the case of this cobyla crate, I use the fact that the CostFunction trait allows defining the output as a vector to also pass the constraint values.
Also, to what extent is the solution guaranteed to satisfy the constraints?
Well... The software is provided as is without any warranty 😉
I suppose the traditional cobyla package supports constraints "natively" (by traversing the gradient projection)?
The COBYLA method (Constrained Optimization BY Linear Approximations) is a gradient-free algorithm; it builds linear approximations of the objective and constraints rather than using gradient projections.
@relf thank you for the amazing crate! I just wanted to ask how to add constraints in the latest 0.3 version? I am not sure whether this feature is missing or just unclear in the documentation. Thank you!