Closed Simpleton123 closed 6 years ago
Duplicate of https://github.com/JuliaOpt/Ipopt.jl/issues/109. Note that if you are looking for more feedback on your issue, you will have better luck posting it on Discourse than on a GitHub issue ;)
Closing, since it's unlikely there's an issue with JuMP here. Questions about solver behavior can go to Discourse, as @blegat suggested, or to the solvers' respective mailing lists.
I need to find the solution (x, y) that minimizes a function. Using the trust-region-reflective algorithm in MATLAB, I find the optimal solution (1.45392043849310, 2.22057922774844e-14) with objective = -652.457827998786. I then solve the same problem with JuMP and Ipopt in Julia, using the same initial value (0.948683298050514, 0.500000000000000), the same constraints, and the same tolerance = 1e-9, but the algorithm stops at (0.948683638232005, 0.500) with objective = -600.58335. In MATLAB I call fmincon, which can use both the gradient and the Hessian of the function. In Julia, however, I get a message that a Hessian cannot be supplied for a multivariate user-defined function, so I only provide the gradient. That is the only difference between the two codes. Could someone help me with this problem? Thanks!
The Julia code is as follows:
The solver report is as follows:
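The original code and solver log are not reproduced here. For reference, below is a minimal sketch of how such a problem is typically set up with JuMP and Ipopt when the objective is a user-defined function registered with a gradient but no Hessian. The objective `f`, its gradient `∇f!`, and the absence of bounds and constraints are hypothetical placeholders, not the actual problem from this issue; only the starting values and tolerance come from the text above.

```julia
using JuMP, Ipopt

# Hypothetical objective and in-place gradient -- the real function,
# constraints, and bounds are not shown in the issue.
f(x, y) = (x - 1)^2 + (y - 2)^2

function ∇f!(g, x, y)
    g[1] = 2 * (x - 1)
    g[2] = 2 * (y - 2)
    return
end

model = Model(Ipopt.Optimizer)
set_optimizer_attribute(model, "tol", 1e-9)

# A multivariate user-defined function can be registered with a gradient,
# but JuMP does not accept a user-supplied Hessian for it.
register(model, :f, 2, f, ∇f!)

# Starting values taken from the issue text.
@variable(model, x, start = 0.948683298050514)
@variable(model, y, start = 0.5)
@NLobjective(model, Min, f(x, y))

optimize!(model)
println((value(x), value(y), objective_value(model)))
```

Because no Hessian is available for the registered function, Ipopt falls back to a limited-memory Hessian approximation, which, on a nonconvex problem, can contribute to stopping at a different local solution than MATLAB's fmincon with an exact Hessian.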