JuliaNLSolvers / Optim.jl

Optimization functions for Julia

Type instability #820

Closed · joaogoliveira1 closed this issue 4 years ago

joaogoliveira1 commented 4 years ago

Hey, I'm getting type instability on the output of the Nelder-Mead optimization algorithm. Here is a simple example of the problem:


using Optim

function bogus(x::Array{<:Real})
    abs(x[1]) + abs(x[1])
end

@code_warntype optimize(bogus, [0.1, 0.1], NelderMead())

The output from @code_warntype is:

Body::Optim.MultivariateOptimizationResults{NelderMead{Optim.AffineSimplexer,Optim.AdaptiveParameters},_A,Array{Float64,1},Float64,Float64,Array{OptimizationState{Float64,NelderMead{Optim.AffineSimplexer,Optim.AdaptiveParameters}},1},Bool} where _A
1 ─ %1  = Base.NamedTuple()::Core.Compiler.Const(NamedTuple(), false)
│   %2  = Optim.default_options(method)::Dict{Symbol,Any}
│   %3  = Base.merge(%1, %2)::NamedTuple  UNSTABLE
│   %4  = Base.isempty(%3)::Bool
└──       goto #3 if not %4
2 ─       (@_5 = Optim.Options())
└──       goto #4
3 ─ %8  = Core.kwfunc(Optim.Options)::Core.Compiler.Const(Core.var"#Type##kw"(), false)
└──       (@_5 = (%8)(%3, Optim.Options))
4 ┄ %10 = @_5::Optim.Options{_A,_B} where _B where _A  UNSTABLE
│   %11 = (#self#)(f, initial_x, method, %10)::Optim.MultivariateOptimizationResults{NelderMead{Optim.AffineSimplexer,Optim.AdaptiveParameters},_A,Array{Float64,1},Float64,Float64,Array{OptimizationState{Float64,NelderMead{Optim.AffineSimplexer,Optim.AdaptiveParameters}},1},Bool} where _A  UNSTABLE
└──       return %11

Type instability is signaled by UNSTABLE. Is this normal? Is there a way to make it type stable?
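
For reference, a quick way to surface this without reading the lowered code is the @inferred macro from the Test standard library: it compares the statically inferred return type of a call against the type of the value actually returned and throws if they differ. A minimal check, reusing bogus from above:

using Optim, Test

# Throws ("return type ... does not match inferred return type ...")
# when the call is not inferred to a concrete type.
@inferred optimize(bogus, [0.1, 0.1], NelderMead())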

pkofod commented 4 years ago

It's been kind of hit-and-miss across Julia versions. I guess it's related to constant-propagation heuristics. I know there have been some changes related to keyword arguments lately; I could try to see if it's better on Julia master. I don't think it matters much in practice, though, depending on what you do with the result. It's unsatisfactory, I agree.
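
If you want to contain it in the meantime, here is a sketch of two common workarounds (not specific to this issue, and whether the first one actually removes the UNSTABLE flags here will depend on the Optim.jl and Julia versions): pass Optim.Options() explicitly, which skips the default-options merge that shows up as Dict{Symbol,Any} in the trace above, and put type assertions on the pieces of the result you use downstream so the instability doesn't leak into the rest of your code.

# Reusing bogus from the example above.
# Pass the options explicitly instead of relying on the keyword/default path.
res = optimize(bogus, [0.1, 0.1], NelderMead(), Optim.Options())

# Assert the types of the pieces used downstream; this acts as a barrier
# so later code is compiled against concrete types.
xmin = Optim.minimizer(res)::Vector{Float64}
fmin = Optim.minimum(res)::Float64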

joaogoliveira1 commented 4 years ago

Well, as long as there is no drag on the code, it's fine for me. Thanks for your reply!