Closed — joehuchette closed this pull request 8 years ago
I'm not sure. There's clearly something to be said for this, but I wonder if this is the first time we're providing a subgradient rather than a gradient. Maybe we should split the differentiation rules into two sets and allow users to specify whether differentiate should error out in the presence of non-differentiable functions?
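The two-rule-set idea above could look something like the following. This is a purely hypothetical sketch, not the actual Calculus.jl internals: the names `smooth_rules`, `subgrad_rules`, `lookup_rule`, and the `allow_subgradients` flag are all invented for illustration.

```julia
# Hypothetical sketch: keep ordinary derivative rules and subgradient
# rules in separate tables, and let the caller opt in to subgradients.
smooth_rules  = Dict{Symbol,Function}(:sin => (x -> :(cos($x))))
subgrad_rules = Dict{Symbol,Function}(:abs => (x -> :(sign($x))))

function lookup_rule(f::Symbol; allow_subgradients::Bool = false)
    haskey(smooth_rules, f) && return smooth_rules[f]
    if allow_subgradients && haskey(subgrad_rules, f)
        return subgrad_rules[f]
    end
    error("no differentiation rule for $f (non-differentiable?)")
end

lookup_rule(:sin)(:y)                            # :(cos(y))
lookup_rule(:abs; allow_subgradients = true)(:y) # :(sign(y))
# lookup_rule(:abs) would error out, as proposed above.
```

With this split, the default behavior errors on non-differentiable functions, and users who are happy with a subgradient opt in explicitly.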
@johnmyleswhite would the x/abs(x) option work for you? At least then it's undefined at the origin.
That's not the right rule for JuMP though.
I would think you'd either want:
You guys can make the call on whether this is appropriate or not here. One option would be to use the x/abs(x) rule here and then override it in ReverseDiffSparse, which as best I can tell should be sufficient to get the right behavior for JuMP (we may need overloads in DualNumbers as well).
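The override suggested above could be sketched like this. Again, this is hypothetical: `abs_rules` is an invented name standing in for whatever rule table the packages actually use, not the real Calculus.jl or ReverseDiffSparse internals.

```julia
# Hypothetical sketch: Calculus ships the mathematically cautious
# x/abs(x) rule, which is undefined (0/0) at the origin.
abs_rules = Dict{Symbol,Function}(:abs => (x -> :($x / abs($x))))

# A downstream package (here standing in for ReverseDiffSparse) could
# then swap in the JuMP-friendly subgradient rule sign(x).
abs_rules[:abs] = x -> :(sign($x))

abs_rules[:abs](:y)  # :(sign(y))
```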
This was implemented in ReverseDiffSparse as a special case: https://github.com/mlubin/ReverseDiffSparse.jl/commit/2530b758bb341d3c51e8c1195134922193e1cfb2
I'm in favor of not defining the derivative of abs in Calculus.
Me too
The only potential issue is that this might be an unexpected and surprising symbolic differentiation rule to use. This definition is the right one for JuMP, but mathematically it's a bit iffy to return a value for the derivative of abs at zero. Wolfram Alpha uses the rule x/abs(x), which is properly undefined at zero.
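The distinction between the two candidate rules shows up only at the origin, and it's easy to check numerically. A small sketch (the helper names `rule_sign` and `rule_ratio` are invented for illustration):

```julia
# sign(x) picks a concrete subgradient (0) at the origin;
# x/abs(x) evaluates to 0/0 = NaN there, i.e. it is undefined.
rule_sign(x)  = sign(x)
rule_ratio(x) = x / abs(x)

rule_sign(2.0)    # 1.0
rule_ratio(2.0)   # 1.0
rule_sign(0.0)    # 0.0  (a valid subgradient of abs at 0)
rule_ratio(0.0)   # NaN  (undefined, matching Wolfram Alpha's rule)
```

Away from zero the two rules agree exactly; the whole debate is about which behavior at x = 0 is less surprising.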