JuliaDiff / DiffRules.jl

A simple shared suite of common derivative definitions

Derivatives of max and min #68

Open · amrods opened 2 years ago

amrods commented 2 years ago

Should we enforce that the derivatives of max(x, y) and min(x, y) are undefined when x == y?
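
For concreteness, "undefined" could mean something like returning NaN for both partials at the tie. A rough sketch of that idea (not the current rule):

# Hypothetical sketch only: partials (∂/∂x, ∂/∂y) of max(x, y),
# returning NaN at the tie to mark the derivative as undefined there.
dmax_undefined(x, y) = x > y ? (1.0, 0.0) :
                       x < y ? (0.0, 1.0) :
                               (NaN, NaN)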

mcabbott commented 2 years ago

I think the more useful result is to pick something like a sub-gradient. The use case to imagine is some optimisation problem where equality might be the best or worst point. We'd like gradient descent to be able to end up there, or not get stuck there, both of which call for a finite gradient.

You could argue that these should pick a symmetric convention, though:

julia> ForwardDiff.gradient(x -> max(x...), [1,1])
2-element Vector{Int64}:
 0
 1

julia> ForwardDiff.derivative(x -> clamp(x, 1, 2), 1)
1

julia> ForwardDiff.derivative(x -> clamp(x, 1, 2), 2)
1
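
A symmetric convention could split the sub-gradient evenly at the tie. A rough sketch of such a rule (not what DiffRules defines today):

# Sketch of a symmetric sub-gradient convention for max(x, y):
# 1 on the winning side, 0 on the losing side, 1/2 each at the tie.
function dmax_symmetric(x, y)
    x > y && return (1.0, 0.0)
    x < y && return (0.0, 1.0)
    return (0.5, 0.5)   # tie: split the sub-gradient evenly
end
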
amrods commented 2 years ago

Yes, I ran into the problem while optimizing a function with max(x, y) inside it. I was checking the gradients with ForwardDiff and noticed the lack of symmetry. Worse, though, is FiniteDiff:

julia> FiniteDiff.finite_difference_gradient(x -> max(x...), [1.0, 1.0])
2-element Vector{Float64}:
 0.4999999999934427
 0.4999999999934427
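
Those 0.5s are what a central difference gives at the tie, since it averages the two one-sided slopes (1 and 0):

# Central difference of max at the tie:
# (max(1 + h, 1) - max(1 - h, 1)) / 2h = (h - 0) / 2h ≈ 0.5,
# i.e. the average of the one-sided slopes 1 and 0.
h = 1e-8
(max(1.0 + h, 1.0) - max(1.0 - h, 1.0)) / (2h)   # ≈ 0.5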