The equations and documentation are written in a confusing way (it took me 20 minutes to convince myself that what was written is correct!). We should make it clearer that a variable at its upper bound is converged if the gradient is negative (below the tolerance), while a variable at its lower bound is converged if the gradient is positive (above the tolerance). The negative signs and inequalities currently make this tricky to follow.
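For reference, here is a minimal standalone sketch of the logic I have in mind for a minimization with box bounds. The names (`gradient_converged`, `value`, `grad`, `low`, `high`, `tol`) are hypothetical and not the actual relentless API; the point is just to state the sign conventions plainly:

```python
def gradient_converged(value, grad, low, high, tol):
    """Convergence test for one bounded variable in a minimization.

    Sign convention: the descent direction is -grad. At the upper
    bound, a negative gradient pushes the variable further up, so the
    bound stays active and the variable counts as converged whenever
    grad < tol. Symmetrically, at the lower bound the variable is
    converged whenever grad > -tol. A free variable must satisfy
    |grad| < tol.
    """
    if value >= high:
        # upper bound active: gradient must not pull the variable
        # back into the interior (large positive grad fails)
        return grad < tol
    elif value <= low:
        # lower bound active: gradient must not push the variable
        # up into the interior (large negative grad fails)
        return grad > -tol
    else:
        # free variable: ordinary gradient test
        return abs(grad) < tol
```

Keeping the gradient on the left of every comparison, as above, avoids the double negatives that make the current form hard to read.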
There is also a typo in one of the unit tests:
https://github.com/mphowardlab/relentless/blob/c330b4a685beae92f9fed239d0733d8a13eea2e2/tests/optimize/test_criteria.py#L88-L90
Line 88 should be

```python
x.value = 2.0
```

so that the constraint is active.