Closed: ConnorMallon closed this issue 3 years ago
Hi Connor, glad to see you are giving Nonconvex a try. Gridap-TopOpt integration is still on my mind. Let me know if you want to help with that!
Re: your question, the way to define custom adjoints is by using ChainRulesCore. Here is an example from TopOpt.jl: https://github.com/mohamed82008/TopOpt.jl/blob/ad1cde7bfff891e9dd8ee03c928ba9b1fe946b3d/src/Functions/volume.jl#L92. So you would do something like this:
```julia
using ChainRulesCore

f(x::AbstractVector) = sqrt(x[2])

function ChainRulesCore.rrule(::typeof(f), x::AbstractVector)
    val = f(x)
    grad = [0.0, 0.5 * x[2]^(-0.5)]   # gradient of sqrt(x[2]) wrt x, assuming length(x) == 2
    val, Δ -> (nothing, Δ * grad)     # (adjoint wrt f itself, adjoint wrt x)
end
```
`nothing` is the adjoint of `f` wrt the function `f` itself, because `f` is a simple function (not a closure or a callable struct). `Δ * grad` is the adjoint wrt `x`. I need to add this to the documentation.
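If you want to sanity-check that the custom `rrule` is actually being used, here is a minimal sketch calling Zygote directly (Zygote respects `rrule`s defined via ChainRulesCore; the input vector `x0` here is just an arbitrary example, not from the original post):

```julia
using Zygote

x0 = [1.0, 4.0]
f(x0)                        # 2.0
Zygote.gradient(f, x0)[1]    # hits the custom rrule and returns [0.0, 0.25]
```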
Thanks Mohamed! That's good to hear, let's get it going. I didn't realise the rrule would be picked up like that. Very cool.
If you want to join a meeting with @yijiangh and me to talk specifics, I would be happy to arrange that. There is also the #topopt channel in the Julia Slack for quick discussions.
I will close this issue for now.
What is the recommended way to use a custom adjoint, defined using ChainRulesCore.jl, for an objective and/or constraints, for use in the `value_jacobian` function? Let's say I have defined my adjoint at the driver level and I want that to be picked up rather than relying on the default autodiff. How should I tell Nonconvex that I have the derivative information? If something like what is below worked, that would be great:
Thanks for your help