stanle / madopt

Apache License 2.0

Error in Jacobian #25

Closed (dexterurbane closed this issue 2 years ago)

dexterurbane commented 4 years ago

The following short program demonstrates an apparent error in the computed Jacobian (at least on my setup: macOS, MA27). Swapping the order of the multiplication in the constraint gives the correct result, as shown in the commented-out line below.

#include <madopt/ipopt_model.hpp>

using namespace std;

int main(){
    MadOpt::IpoptModel m;
    m.show_solver = true;

    m.setStringOption("nlp_scaling_method", "none");

    // The following option runs the "derivative test":
    // It checks the Jacobian against a finite difference approximation.
    m.setStringOption("derivative_test", "first-order");

    auto x = m.addVar(0.999, 1.001, 1.0, "x");

    auto e = x * pow(pow(x, 2) + x, -1); // Fails derivative test.
    // auto e = pow(pow(x, 2) + x, -1) * x; // Passes derivative test.

    m.addEqConstr(e, 0.5);
    cout << e << endl;

    // Diagnostic: print the Jacobian at x = 1
    double x1[] = {1.0};
    double j1[1];
    m.eval_jac_g(x1, true, j1);
    cout << "Jacobian de / dx = " << j1[0] << endl;
    cout << "(Should be -0.25)" << endl;

    // m.solve(); // Look in output for result of derivative test.
}

Output:

x*(((x^2)+x)^-1)
Jacobian de / dx = -0.75
(Should be -0.25)
dexterurbane commented 4 years ago

Another (simpler) example:

#include <madopt/ipopt_model.hpp>

using namespace std;

int main(){
    MadOpt::IpoptModel m;
    m.show_solver = true;
    m.setStringOption("nlp_scaling_method", "none");
    m.setStringOption("derivative_test", "first-order");

    auto x = m.addVar(0.999, 1.001, 1.0, "x");

    // auto e = pow(x, 3) + pow(x, 2); // J(1) = 5, OK
    // auto e = (pow(x, 2) + x) * x; // J(1) = 5, OK
    auto e = x * (pow(x, 2) + x); // J(1) = 3, wrong!
    cout << e << endl;

    m.addEqConstr(e, 2.0);

    double x1[] = {1.0};
    double r1[1];

    m.eval_g(x1, true, r1);
    cout << "G " << r1[0] << endl; // Should be 2

    m.eval_jac_g(x1, true, r1);
    cout << "J " << r1[0] << endl; // Should be 5

    // m.solve(); // Look in output for result of derivative test.
}
dexterurbane commented 4 years ago

I believe this is related to the above: turning on ENABLE_ASSERTS in logger.hpp causes the following to fail:

#include <madopt/ipopt_model.hpp>

using namespace std;

int main(){
    MadOpt::IpoptModel m;
    auto x = m.addVar(0.999, 1.001, 1.0, "x");
    auto e = x * x * x;
    m.addEqConstr(e, 1.0);
    double x1[] = {1.0};
    double r1[1];
    m.eval_jac_g(x1, true, r1);
    cout << "J " << r1[0] << endl; // Should be 3 (d(x^3)/dx = 3x^2 = 3 at x = 1)
}

with the following messages:

[hess_simstack.hpp][85][setLastStackPos] :: *** ASSERT FAILED ***
[hess_simstack.hpp][85][setLastStackPos] :: conflict !<= stack.size()-1
[hess_simstack.hpp][85][setLastStackPos] :: 2 !<= 1
[hess_simstack.hpp][85][setLastStackPos] :: 
stanle commented 4 years ago

Confirming the issue; investigating. FYI, the assert fires inside addEqConstr, hence a shorter failing example is:

#include <madopt/ipopt_model.hpp>

using namespace std;

int main(){
    MadOpt::IpoptModel m;
    auto x = m.addVar(0.999, 1.001, 1.0, "x");
    auto e = x * x * x;
    m.addEqConstr(e, 1.0);
}
dexterurbane commented 4 years ago

Thanks Karsten - let me know if you find anything. For context: I've added fully unbalanced OPF to SmartGridToolbox and was having convergence failures on certain cases. Turning on the derivative checker in IPOPT revealed the issue.

stanle commented 4 years ago

I pushed a fix to master.

Sorry it took a while. Whoever wrote this was very sparing with comments and implemented a complex algorithm just to save a tiny bit of runtime, so it took me a while to figure out how everything works.

dexterurbane commented 4 years ago

Hi Karsten, that's great - I appreciate it. I'll give it a go. Hope you're well!

dexterurbane commented 4 years ago

So far so good. With the limited tests I've run, I'm not seeing any first-order derivative errors in my code. Thanks again - reckon you can close this one for now.