This is an interesting example. The issue is that the cost function $f$ does not satisfy the requirements that OpEn expects, namely that $f$ be continuously differentiable with a Lipschitz gradient. As discussed in this paper (Algorithm 2, Step 5), a point $u^{\nu+1}$ is determined at which we then compute the gradient. In your example the first coordinate of this point can lie in $(-\infty, 0]$, where the gradient is not defined.
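To make this concrete with the reported starting point, consider a plain gradient step (only an illustration of how intermediate points can leave the domain; the actual iterate also involves the L-BFGS direction and the forward-backward step):

$$ \nabla f(0.1,\ 1.1) = \left[\frac{1}{2\sqrt{0.1}},\ 2(1.1 - 1)\right] \approx [1.58,\ 0.2], \qquad u - \gamma \nabla f(u) = [0.1 - 1.58\,\gamma,\ \ 1.1 - 0.2\,\gamma], $$

so the first coordinate turns negative for any step size larger than roughly $0.063$, and since $\frac{1}{2\sqrt{x_0}} \to \infty$ as $x_0 \to 0^+$, no fixed step size avoids this near the minimizer: the gradient of $f$ is unbounded, hence not Lipschitz, on any neighbourhood of the solution.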
Describe the bug
I've encountered an issue where the optimizer enters an infinite loop when the starting point is close to the minimum. This seems to occur because the optimizer tries to evaluate points outside the domain of the cost function (in this case described by the constraint set `Rectangle`). Here is the objective function with its gradient:

$$ f(\mathbf{x}) = \sqrt{x_0} + (x_1 - 1)^2 $$

$$ \nabla f(\mathbf{x}) = \left[\frac{1}{2\sqrt{x_0}},\ 2(x_1 - 1)\right] $$
It is immediate to see that the minimum is attained at the point $(0, 1)$. However, even when starting from a point very close to the minimum, such as $(0.1, 1.1)$, the optimizer tries to evaluate the function at infeasible points and ends up looping through NaN values.
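A minimal sketch of such a setup with OpEn's Rust interface is given below; the solver parameters (tolerance, L-BFGS memory, iteration cap), the exact bounds of the `Rectangle`, and the `eprintln!` diagnostic are illustrative assumptions, not taken from the original code.

```rust
use optimization_engine::{constraints::*, panoc::*, *};

fn main() {
    // Illustrative solver parameters (assumed values)
    let problem_size = 2;
    let tolerance = 1e-6;
    let lbfgs_memory = 10;
    let max_iters = 500;

    // f(x) = sqrt(x0) + (x1 - 1)^2
    let f = |u: &[f64], c: &mut f64| -> Result<(), SolverError> {
        *c = u[0].sqrt() + (u[1] - 1.0).powi(2);
        Ok(())
    };

    // grad f(x) = [1 / (2 sqrt(x0)), 2 (x1 - 1)]
    let df = |u: &[f64], grad: &mut [f64]| -> Result<(), SolverError> {
        if u[0] <= 0.0 {
            // Diagnostic: the solver asks for the gradient at an infeasible point
            eprintln!("gradient requested at infeasible point {:?}", u);
        }
        grad[0] = 1.0 / (2.0 * u[0].sqrt());
        grad[1] = 2.0 * (u[1] - 1.0);
        Ok(())
    };

    // Feasible set: x0 >= 0, x1 free (the Rectangle constraint; bounds assumed)
    let xmin = vec![0.0, f64::NEG_INFINITY];
    let xmax = vec![f64::INFINITY, f64::INFINITY];
    let bounds = Rectangle::new(Some(&xmin), Some(&xmax));

    let problem = Problem::new(&bounds, df, f);
    let mut cache = PANOCCache::new(problem_size, tolerance, lbfgs_memory);
    let mut panoc = PANOCOptimizer::new(problem, &mut cache).with_max_iter(max_iters);

    // Starting point close to the minimizer (0, 1)
    let mut u = vec![0.1, 1.1];
    let status = panoc.solve(&mut u);
    println!("status: {:?}", status);
    println!("solution: {:?}", u);
}
```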
To Reproduce
Setting the starting point to `vec![0.1, 1.1]`, the optimizer does not converge: it ends up with NaN attempts after evaluating points that do not belong to the maximal domain of the function (here represented by the constraint set `Rectangle`). This is the sequence of attempts:

Expected behavior
The optimizer should converge to the global minimum $(0, 1)$. In particular, I was expecting the optimizer to evaluate only feasible points.
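For reference, in this specific instance the cost is separable and $\sqrt{\cdot}$ is increasing on $[0, \infty)$, so the smooth cost $g(\mathbf{x}) = x_0 + (x_1 - 1)^2$ has the same constrained minimizer $(0, 1)$ and a globally Lipschitz gradient. A hypothetical sketch of such closures (same imports as the reproduction sketch above):

```rust
// Smooth surrogate: same minimizer (0, 1) over the rectangle x0 >= 0,
// but continuously differentiable with a Lipschitz gradient everywhere.
let g = |u: &[f64], c: &mut f64| -> Result<(), SolverError> {
    *c = u[0] + (u[1] - 1.0).powi(2);
    Ok(())
};
let dg = |u: &[f64], grad: &mut [f64]| -> Result<(), SolverError> {
    grad[0] = 1.0;
    grad[1] = 2.0 * (u[1] - 1.0);
    Ok(())
};
```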
System information:
- `rustup show`: stable-aarch64-apple-darwin (default)
- `rustc -V`: rustc 1.76.0 (07dca489a 2024-02-04)