JuliaOpt / juliaopt-notebooks

A collection of IJulia notebooks related to optimization

JuMP derivatives example #11

Closed: rgiordan closed this issue 6 years ago

rgiordan commented 9 years ago

Hi Miles,

A while ago I mentioned maybe putting an example of JuMP gradients and Hessians into this repository. I got sidetracked with some stuff at the end of the semester, but here it is at last.

Please let me know if you'd like to make any changes or improvements.

Thanks, Ryan

mlubin commented 9 years ago

Thanks!

Accessing the .col attribute is a bit ugly, though; we don't want people doing this. @IainNZ @joehuchette, is it time to add a wrapper? We have ReverseDiffSparse.getplaceindex; maybe just re-export it from JuMP?
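Something like the following, as a sketch (the module wrapper and the getLinearIndex name are hypothetical, just to make the idea concrete; getLinearIndex is the name that comes up later in this thread):

```julia
# Hypothetical sketch of the re-export idea above; the module name and
# getLinearIndex are illustrative, not a shipped JuMP API.
module JuMPIndexShim

import ReverseDiffSparse

const getLinearIndex = ReverseDiffSparse.getplaceindex
export getLinearIndex

end
```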

IainNZ commented 9 years ago

In this bit:

```julia
this_par = m.colVal;

println("x[1] == ", this_par[x[1].col], " == ", getValue(x[1]), " == 1.0")
```

Wouldn't just using getValue work?

IainNZ commented 9 years ago

But maybe some JuMP.getIndex thing could be nice

Thanks @rgiordan !

joehuchette commented 9 years ago

JuMP.getCol(::Variable)?

IainNZ commented 9 years ago

That might be confused with getting a whole column of the constraint matrix A, or something like that.

rgiordan commented 9 years ago

Sorry for the slow response; I was backpacking. getValue works fine for the value (so that part of the example is perhaps redundant), but you need the column index to look up derivative and Hessian values.
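For concreteness, here is roughly the pattern at issue, sketched against the JuMP/MathProgBase API of this era (old @defVar/@setNLObjective macro names; the evaluator constructor and its exact signature varied across releases, so treat those lines as assumptions):

```julia
using JuMP, MathProgBase

m = Model()
@defVar(m, x[1:2])
@setNLObjective(m, Min, (x[1] - 1.0)^2 + (x[2] - x[1]^2)^2)

# Build the derivative evaluator for the model and request gradients.
d = JuMP.JuMPNLPEvaluator(m)
MathProgBase.initialize(d, [:Grad])

# Evaluate the objective gradient at a point (filled in place).
pt = [1.0, 1.0]
grad = zeros(2)
MathProgBase.eval_grad_f(d, grad, pt)

# getValue is enough for the *value* of x[1], but grad is ordered by the
# variables' linear (column) indices, which is why x[1].col is needed.
dfdx1 = grad[x[1].col]
```

The sparse Hessian path (MathProgBase.hesslag_structure plus eval_hesslag) is indexed by the same column indices.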

Let me know what you come up with, and I'll update my example!

dpo commented 9 years ago

Just thought I'd mention this (very) preliminary wrapper here: https://github.com/dpo/NLPModels.jl, which makes the user's code substantially simpler and more readable (in my opinion).
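For comparison, a minimal sketch of the same computation through NLPModels (the constructor for wrapping a JuMP model has been renamed across releases, so the wrapping line is an assumption; obj, grad, and hess are the package's generic evaluation functions):

```julia
using JuMP, NLPModels

m = Model()
@defVar(m, x[1:2])
@setNLObjective(m, Min, (x[1] - 1.0)^2 + (x[2] - x[1]^2)^2)

# Wrap the JuMP model; constructor name assumed from the early
# NLPModels README (this functionality later moved to NLPModelsJuMP).
nlp = JuMPNLPModel(m)

pt = [1.0, 1.0]
f = obj(nlp, pt)    # objective value
g = grad(nlp, pt)   # full gradient vector, no manual column indexing
H = hess(nlp, pt)   # sparse (lower-triangular) Hessian
```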

rgiordan commented 9 years ago

I agree, that looks better. I'll update this notebook to use the NLPModels library and getLinearIndex once the latter is merged into a release. Thanks!

rgiordan commented 6 years ago

I guess this is probably out of date now.