Closed cgavriluta closed 8 years ago
It seems that the problem is related to the Lagrange multiplier. For some reason it jumps from 0 at the first iteration to 8891397050.194614 in the second one.
After some more digging I managed to find the issue. The problem was in the way I declared the arrays that hold the sparsity structure of the Hessian and Jacobian. They have to be explicitly declared as integers, like this:
iRow = zeros(n*m, int)
Declaring them like iRow = zeros(n*m) or iRow = zeros(n * n, float_) is wrong, because the C interface ('callback.c') will cast them to long:
PyArrayObject *row = (PyArrayObject *) PyTuple_GetItem(result, 0);
long *rdata = (long *)row->data;
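For anyone hitting the same symptom, here is a minimal NumPy sketch of why the dtype matters (assuming a platform where C long is 8 bytes, so int64): zeros() defaults to float64, and when the C side reinterprets that float64 buffer as long it reads the raw bit pattern of each float, which turns small indices into huge garbage integers.

```python
import numpy as np

# Correct: index arrays explicitly declared with an integer dtype.
iRow_ok = np.zeros(6, dtype=int)
assert iRow_ok.dtype.kind == 'i'

# Wrong: zeros() defaults to float64, as does an explicit float dtype.
iRow_bad = np.zeros(6)
assert iRow_bad.dtype == np.float64

# What the C cast (long *)row->data effectively does to a float64 array:
# reinterpret the bytes in place. view() simulates this reinterpretation.
idx = np.array([0.0, 1.0, 2.0])
print(idx.view(np.int64))
# -> [0 4607182418800017408 4611686018427387904]
# i.e. only index 0 survives; 1.0 and 2.0 become enormous bogus indices.
```

So the solver silently receives a corrupted sparsity structure rather than failing with a type error, which matches the "same first iteration, then divergence" behaviour described below.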
In short, the problem came from my lack of experience with Python.
I am experiencing some very strange behaviour on the attached problem (TestProblem.py.txt). When I try to solve the problem using IPOPT's java interface it works without issues as you can see in javaOutput.txt. I double checked the result with Matlab's fmincon and it is correct.
However, when I use pyipopt it converges to an infeasible point. To me this is incredibly strange, as I checked the objective, constraint, gradient, and Hessian functions against the ones implemented in Java and they produce exactly the same results. You can compare with the values from javaOutputCheckFunctions.txt (note that the Hessian in the Java implementation requires only the lower or upper triangle, as the matrix is symmetric).
Moreover, the first iteration of the IPOPT solver is the same in both cases, i.e. Java and Python. From the second iteration onward, however, they start to differ. It is as if the Python interface doesn't pass the Jacobian/Hessian information correctly and the solver takes a different direction.
I would highly appreciate it if you have time to take a look at this.
javaOutput.txt javaOutputCheckFunctions.txt TestProblem.py.txt