Hipparchus-Math / hipparchus

An efficient, general-purpose mathematics components library in the Java programming language
Apache License 2.0

Add constrained optimization #296

Closed maisonobe closed 7 months ago

maisonobe commented 11 months ago

As part of a thread about the Powell optimizer, one participant proposed to contribute constrained optimization features, something we have wanted in Hipparchus for years!

So this issue is opened as a reference to handle this contribution.

maisonobe commented 10 months ago

A first version of the contribution has been added to the master branch. It works but triggers many warnings in the continuous integration process (see the SonarQube code smells).

Basically, it lacks testing and javadoc. I have fixed a number of javadoc failures (initially, there were more than 500 warnings), but clearly not all.

Could someone take over at this stage and help fix things: adding the missing javadoc, replacing the TBD placeholders in the javadoc with proper descriptions, and greatly improving test coverage?

Once SonarQube is back to green, we can think about refactoring some parts and improving performance.

Serrof commented 9 months ago

I don't want this to postpone the next minor release (which we need soonish for Orekit), so I'll open an MR for coverage.

Serrof commented 9 months ago

InequalityConstraintSet is never used and implements an empty interface (OptimizationData). It has no methods, not even getters. I do not understand its purpose. @roccafrancesco

roccafrancesco commented 9 months ago

Hello @Serrof, if the optimization problem is always formulated in the standard form (g(x) < 0, h(x) = 0), it has no use. If, however, you want to leave more freedom in the construction of the problem and use more than one function per constraint type, g1(x), ..., gn(x), h1(x), ..., hm(x), perhaps because the functions provide their gradients in different ways (analytical, automatic differentiation, black box), it would serve to compose these functions into the standard form accepted by the various algorithms.
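To make the composition idea concrete, here is a minimal, self-contained Java sketch (the class and method names are illustrative only, not part of the Hipparchus API): several inequality functions g1(x), ..., gn(x) are concatenated into one vector-valued g(x) in standard form, while each constituent function keeps its own way of providing gradients.

```java
import java.util.List;
import java.util.function.Function;

/** Illustrative sketch only (hypothetical names, not the Hipparchus API):
 *  composes several vector-valued inequality constraint functions
 *  g1(x), ..., gn(x) into a single standard-form g(x). */
public class ConstraintComposition {

    /** Returns a function that concatenates the values of all constraints at x. */
    public static Function<double[], double[]> compose(final List<Function<double[], double[]>> constraints) {
        return x -> {
            // evaluate each gi(x) and record the total output dimension
            final double[][] parts = new double[constraints.size()][];
            int total = 0;
            for (int i = 0; i < constraints.size(); i++) {
                parts[i] = constraints.get(i).apply(x);
                total += parts[i].length;
            }
            // concatenate the individual results into one vector
            final double[] g = new double[total];
            int offset = 0;
            for (final double[] part : parts) {
                System.arraycopy(part, 0, g, offset, part.length);
                offset += part.length;
            }
            return g;
        };
    }

    public static void main(String[] args) {
        // g1(x) = x0 - 1 < 0 and g2(x) = -x1 < 0, composed into one g(x)
        final Function<double[], double[]> g1 = x -> new double[] { x[0] - 1.0 };
        final Function<double[], double[]> g2 = x -> new double[] { -x[1] };
        final Function<double[], double[]> g  = compose(List.of(g1, g2));
        final double[] value = g.apply(new double[] { 2.0, 3.0 });
        System.out.println(value[0] + " " + value[1]); // prints "1.0 -3.0"
    }
}
```

Only the evaluations are merged here; each wrapped function could still compute its gradient analytically, by automatic differentiation, or as a black box.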

Serrof commented 9 months ago

Thanks for the quick reply. But it's missing getters, no? The arguments passed to the constructor cannot be retrieved at the moment. And to be honest, if there is no test using it, this is "dead" code and should probably not be kept in Hipparchus.

roccafrancesco commented 9 months ago

All right, if you don't think it is useful it can be removed now and possibly implemented at a later stage. @Serrof

Serrof commented 9 months ago

Alright, coverage is almost green, just a bit short on conditions (see here). I think ADMMQPOptimizer is a good target to boost the numbers, so I'll open a pull request for that.

Serrof commented 9 months ago

Still under 90%. I'll add more tests, maybe not just on the new optimization stuff actually

Serrof commented 9 months ago

Is the code well validated? There is a lot of commented-out code, and some combinations of options (necessary to cover all the branches) lead to no convergence. For example, you can play with them in this snippet:

    @Test
    public void test() {
        // GIVEN
        final OptimizationData[] data = createOptimizationData();
        final SQPOption option = new SQPOption();
        option.setMaxLineSearchIteration(2);
        option.setConvCriteria(0);
        data[data.length - 1] = option;

        // WHEN
        final SQPOptimizerS optimizer = new SQPOptimizerS();
        optimizer.parseOptimizationData(data);
        final LagrangeSolution solution = optimizer.optimize(data);

        // THEN
        final double[] expectedSolution = new double[] { 1, 1 };
        Assert.assertEquals(0.0,
                MatrixUtils.createRealVector(expectedSolution).subtract(solution.getX()).getL1Norm(), 2.5e-5);
        Assert.assertEquals(8., solution.getValue(), 2e-4);
    }

    private OptimizationData[] createOptimizationData() {
        final QuadraticFunction q = new QuadraticFunction(new double[][] { { 4.0, -2.0 }, { -2.0, 4.0 } },
                new double[] { 6.0, 0.0 },
                0.0);
        // y = 1
        final LinearEqualityConstraint eqc = new LinearEqualityConstraint(new double[][] { { 0.0, 1.0 } },
                new double[] { 1.0 });
        // x > 0, y > 0, x + y > 2
        final LinearInequalityConstraint ineqc = new LinearInequalityConstraint(new double[][] { { 1.0, 0.0 }, { 0.0, 1.0 }, { 1.0, 1.0 } },
                new double[] { 0.0, 0.0, 2.0 });

        final OptimizationData[] constraints = new OptimizationData[] { eqc, ineqc };
        final OptimizationData[] data = new OptimizationData[constraints.length + 3];
        final ObjectiveFunction objectiveFunction = new ObjectiveFunction(q);
        data[0] = objectiveFunction;
        System.arraycopy(constraints, 0, data, 1, constraints.length);
        final double[] initialGuess = new double[] { -3.5, 3.5 };
        data[data.length - 2] = new InitialGuess(initialGuess);
        return data;
    }
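For reference, the expected values asserted in the snippet can be checked by hand. Assuming QuadraticFunction evaluates the usual f(x) = 0.5 xᵀPx + qᵀx + c (a convention consistent with the asserted value of 8), the expected solution (1, 1) gives 2 + 6 = 8:

```java
/** Hand check of the expected optimum of the quadratic objective in the
 *  snippet above, assuming f(x) = 0.5 * x^T P x + q^T x (+ c, here 0). */
public class ObjectiveCheck {

    /** Evaluates 0.5 * x^T P x + q^T x. */
    static double quadraticValue(final double[][] p, final double[] q, final double[] x) {
        double value = 0.0;
        for (int i = 0; i < x.length; i++) {
            value += q[i] * x[i];                     // linear term
            for (int j = 0; j < x.length; j++) {
                value += 0.5 * p[i][j] * x[i] * x[j]; // quadratic term
            }
        }
        return value;
    }

    public static void main(String[] args) {
        final double[][] p = { { 4.0, -2.0 }, { -2.0, 4.0 } };
        final double[] q = { 6.0, 0.0 };
        // objective value at the expected solution (1, 1)
        System.out.println(quadraticValue(p, q, new double[] { 1.0, 1.0 })); // prints "8.0"
    }
}
```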

roccafrancesco commented 9 months ago

Hi, it is not possible to use these methods with arbitrary parameters. The implemented papers, in fact, use specific parameter values and apply them to all sets of problems. Ideally, we would use the same benchmarks as optimizers developed in other programming languages.

Serrof commented 9 months ago

Hi Francesco and thanks for getting back to me. In that case maybe we can remove some options, at least for the next release. Again I don't want this issue to postpone it. It's already a solid new version in my opinion. I'll have a look at where we could trim the algorithms.

Serrof commented 8 months ago

The recent SonarQube reset means we're OK for coverage. So what are we still missing, @maisonobe? Javadoc?