odow closed this pull request 12 months ago.
All modified and coverable lines are covered by tests :white_check_mark:
Comparison is base (283a99d) 98.38% compared to head (65ac069) 98.38%. Report is 2 commits behind head on master.
@blegat is this what you wanted:
https://jump.dev/JuMP.jl/previews/PR3577/manual/nonlinear/#User-defined-operators-with-vector-inputs
It might be more helpful to show how to deal with functions that cannot be traced, and to show how to get the derivative information with AD and pass it to @operator.
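For context, a minimal sketch of what such a doc example might look like, assuming ForwardDiff is used to fill in the gradient before registering the operator (the names `f`, `∇f`, and `op_f` are illustrative, not from this PR):

```julia
using JuMP
import ForwardDiff

model = Model()
@variable(model, x[1:2])

# Stand-in for a black-box function that JuMP cannot trace (illustrative).
f(x...) = sum(xi^2 for xi in x)

# Compute the gradient with ForwardDiff and write it into `g` in-place.
function ∇f(g::AbstractVector{T}, x::T...) where {T}
    ForwardDiff.gradient!(g, y -> f(y...), collect(x))
    return
end

# Register the operator together with its gradient, then use it.
@operator(model, op_f, 2, f, ∇f)
@objective(model, Min, op_f(x...))
```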
Is this part not sufficient? https://jump.dev/JuMP.jl/stable/manual/nonlinear/#Multivariate-functions
It doesn't say that you can use AD.
I'd rather not encourage people to do this. We can already use ForwardDiff automatically. If anyone has a user-defined operator and wants to use a custom AD that isn't ForwardDiff, then I expect them to be very advanced users who can figure it out based on the signature of `grad_f(g::AbstractVector{T}, x::T...) where {T}`.
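For reference, a minimal hand-written instance of that signature (the function and its gradient below are illustrative, not from this PR): the gradient writes each partial derivative into the preallocated vector `g`, and the return value is ignored.

```julia
f(x...) = x[1]^2 + x[1] * x[2] + x[2]^2

# Matches grad_f(g::AbstractVector{T}, x::T...) where {T}:
# fill the buffer `g` in-place with the partial derivatives of f.
function grad_f(g::AbstractVector{T}, x::T...) where {T}
    g[1] = 2 * x[1] + x[2]  # ∂f/∂x₁
    g[2] = x[1] + 2 * x[2]  # ∂f/∂x₂
    return
end
```

An advanced user plugging in a different AD backend would implement this same signature, copying that backend's result into `g`.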
If we tell people to use Enzyme/Zygote/ReverseDiff.jl then we'll have to deal with the questions and quirks.
This is good enough for now. The question of whether to add a third-party AD example is best left for a different PR. I'll make a note in the issue.
Closes #3576