JuliaDiff / ForwardDiff.jl

Forward Mode Automatic Differentiation for Julia

Incorrect computation of Hessian of squared Euclidean norm #303

Open jomi2801 opened 6 years ago

jomi2801 commented 6 years ago

The computation of the Hessian of the squared euclidean norm results in an interesting output:

using ForwardDiff, LinearAlgebra  # norm (and dot below) live in LinearAlgebra on Julia ≥ 0.7
f(x::Vector) = 0.5 * norm(x, 2)^2
x = zeros(2)
ForwardDiff.hessian(f, x)

which is

 0.0  0.0
 0.0  1.0

However, the codes

f(x::Vector) = 0.5 * x[1]^2 + 0.5 * x[2]^2
ForwardDiff.hessian(f, x)

and

f(x::Vector) = 0.5 * dot(x, x)
ForwardDiff.hessian(f, x)

provide the correct result. Why is that?
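For reference, the expected answer is the 2×2 identity matrix, since the gradient of 0.5*norm(x)^2 is x itself. A minimal check with yet another AD-friendly spelling of the squared norm, sum(abs2, x):

julia> ForwardDiff.hessian(x -> 0.5 * sum(abs2, x), zeros(2))
2×2 Matrix{Float64}:
 1.0  0.0
 0.0  1.0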

jrevels commented 6 years ago

Possibly similar to https://github.com/JuliaDiff/ForwardDiff.jl/issues/197, i.e. Base's linear algebra code might be optimized in an "AD unsafe" way.
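One way such a branch could produce the pattern above (a sketch of the suspicion, not necessarily the exact code path): the generic 2-norm first computes the largest absolute entry and returns it early when it is zero, so under AD the dual numbers effectively flow through maximum(abs, x) instead of sqrt(sum(abs2, x)):

using ForwardDiff

# Hypothetical stand-in for the suspected zero-vector shortcut in the
# generic 2-norm: differentiate the infinity norm instead of the 2-norm.
g(x) = 0.5 * maximum(abs, x)^2

ForwardDiff.hessian(g, zeros(2))
# reproduces the spurious 0/1 pattern reported above; which entry ends up
# holding the 1.0 depends on how ties are broken inside maximum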

JoshuaLampert commented 5 months ago

Is there any chance this gets fixed? Unfortunately, in my case it is not really an option to replace the norm(x)^2 by sum(x.^2), since I have one function calling norm(x) and another function later squaring the result, and this composition gives NaN:

julia> using ForwardDiff

julia> f(x) = 0.5*sqrt(sum(x.^2))^2
f (generic function with 1 method)

julia> ForwardDiff.hessian(f, [0.0, 0.0])
2×2 Matrix{Float64}:
 NaN  NaN
 NaN  NaN
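The NaN itself is the usual sqrt-at-zero problem rather than anything specific to norm: the derivative of sqrt(u) is 1/(2*sqrt(u)), which is Inf at u = 0, and the chain rule then multiplies that Inf by the zero inner derivative of sum(x.^2). A minimal illustration:

julia> ForwardDiff.derivative(sqrt, 0.0)
Inf

julia> ForwardDiff.derivative(x -> sqrt(x^2), 0.0)
NaN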

Enabling nansafe_mode also doesn't give the correct result: it defines the offending Inf * 0 perturbation products to be zero, so the NaNs disappear but the Hessian comes out all zeros instead of the identity:

julia> ForwardDiff.hessian(f, [0.0, 0.0])
2×2 Matrix{Float64}:
 0.0  0.0
 0.0  0.0

So fixing this on my side would require rewriting my code base, which I would really like to avoid.