JuliaLang / julia

The Julia Programming Language
https://julialang.org/
MIT License

"Unreachable reached" bug during `Test.@inferred` on Julia 1.6.7 #53761

Open MilesCranmer opened 8 months ago

MilesCranmer commented 8 months ago

I'm getting the bug

Unreachable reached at 0x7bcccafee4dd

signal (4): Illegal instruction

According to Discourse, this deserves a bug report.

I get this when running the SymbolicRegression.jl unit tests on Julia 1.6.7 on one of my branches. You can see the log here: https://github.com/MilesCranmer/SymbolicRegression.jl/actions/runs/8309436354/job/22740784019. I can reproduce it on my other Linux machines as well.

It specifically comes from this line: https://github.com/MilesCranmer/SymbolicRegression.jl/blob/bc3d642281e4d99ee3380cea6c21a0abae901790/test/test_deterministic.jl#L22-L30. This is not a new test; it's never had issues before. It seems like something in the type inference is breaking.

I have uploaded an rr trace of just running this test (which triggers the error), so you can walk through it: https://s3.amazonaws.com/julialang-dumps/reports/2024-03-16T21-43-57-MilesCranmer.tar.zst

You can also reproduce this by checking out commit bc3d642281e4d99ee3380cea6c21a0abae901790 of SymbolicRegression.jl and running the file test/test_deterministic.jl on Julia 1.6.7.

My guess is that it is related to type inference through Optim.optimize when the DynamicExpressions.OperatorEnum is underspecified (i.e., a UnionAll). I can get around this bug with this commit: https://github.com/MilesCranmer/SymbolicRegression.jl/pull/271/commits/d19f15b614a950b72371fbf4bcbc2994dea028c7, which specifies the full OperatorEnum. I'm not sure if that helps or not.
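For intuition, here is a minimal hypothetical sketch (the `MyEnum`/`apply` names are made up for illustration, not SymbolicRegression's actual types) of how an underspecified type parameter leads inference to `Any`, which is exactly the situation `@inferred` is meant to catch:

```julia
using Test

# Hypothetical stand-in for an underspecified OperatorEnum:
# when the parameter F is abstract, `ops.f` has no concrete type.
struct MyEnum{F}
    f::F
end

apply(ops::MyEnum, x) = ops.f(x)

concrete = MyEnum{typeof(cos)}(cos)   # fully specified parameter
loose    = MyEnum{Function}(cos)      # underspecified, analogous to a UnionAll

@inferred apply(concrete, 0.0)   # fine: inferred return type is Float64
# @inferred apply(loose, 0.0)    # would throw, since the inferred return type is Any
```

When inference only produces `Any` at a call site, the compiler can also emit code paths it believes are unreachable, which is the kind of situation where a compiler bug can surface as "Unreachable reached".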

MilesCranmer commented 8 months ago

Update: I have uploaded an rr trace (see original post)

nsajko commented 8 months ago

Can't reproduce?

(@v1.6) pkg> add https://github.com/MilesCranmer/SymbolicRegression.jl#bc3d642281e4d99ee3380cea6c21a0abae901790
[...]
julia> include("/home/nsajko/.julia/packages/SymbolicRegression/0Tn2D/test/test_deterministic.jl");
[ Info: Precompiling SymbolicRegression [8254be44-1295-4e6a-a16d-46603ac705cb]
WARNING: using StaticArrays.setindex in module FiniteDiff conflicts with an existing identifier.

julia> versioninfo()
Julia Version 1.6.7
Commit 3b76b25b64 (2022-07-19 15:11 UTC)
Platform Info:
  OS: Linux (x86_64-pc-linux-gnu)
  CPU: AMD Ryzen 3 5300U with Radeon Graphics
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-11.0.1 (ORCJIT, znver2)

MilesCranmer commented 8 months ago

I uploaded the rr trace above. You can see and reproduce the error in that, no?

According to https://discourse.julialang.org/t/unreachable-reached-at-0x13929e935/38528/6?u=milescranmer I am supposed to submit a bug report.

inkydragon commented 7 months ago

I can reproduce this with Julia v1.6.7 + SymbolicRegression v0.24.0 (#bc3d642):

add https://github.com/MilesCranmer/SymbolicRegression.jl#bc3d642281e4d99ee3380cea6c21a0abae901790
# run test code with julia v1.6
julia +1.6 .\53761.jl
Full test code:

```julia
using SymbolicRegression
using Test
using Random

X = 2 .* randn(MersenneTwister(0), Float32, 2, 1000);
y = 3 * cos.(X[2, :]) + X[1, :] .^ 2 .- 2;

options = SymbolicRegression.Options(;
    binary_operators=(+, *, /, -),
    unary_operators=(cos,),
    crossover_probability=0.0, # required for recording, as not set up to track crossovers.
    max_evals=10000,
    deterministic=true,
    seed=0,
    verbosity=0,
    progress=false,
);

all_outputs = []
for i in 1:2
    # vvvvv crash here
    hall_of_fame = @inferred equation_search(
        X,
        y;
        niterations=5,
        options=options,
        parallelism=:serial,
        v_dim_out=Val(1),
        return_state=Val(false),
    )
    dominating = calculate_pareto_frontier(hall_of_fame)
    push!(all_outputs, dominating[end].tree)
end

@test string(all_outputs[1]) == string(all_outputs[2])
```

And simplified test code:

using SymbolicRegression

X = zeros(2, 1);
Y = zeros(1);

# equation_search(X, Y)
equation_search(X, Y; parallelism=:serial)
Crash log:

```
> julia +1.6 .\53761+s.jl
┌ Warning: You are using multithreading mode, but only one thread is available. Try starting julia with `--threads=auto`.
└ @ SymbolicRegression C:\Users\cyhan\.julia\packages\SymbolicRegression\0Tn2D\src\Configure.jl:55
[ Info: Started!
Unreachable reached at 00000000568e8152

Please submit a bug report with steps to reproduce this fault, and any error messages that follow (in their entirety). Thanks.
Exception: EXCEPTION_ILLEGAL_INSTRUCTION at 0x568e8152 -- dispatch_optimize_constants at C:\Users\cyhan\.julia\packages\SymbolicRegression\0Tn2D\src\ConstantOptimization.jl:33 [inlined]
optimize_constants at C:\Users\cyhan\.julia\packages\SymbolicRegression\0Tn2D\src\ConstantOptimization.jl:19
in expression starting at C:\Users\cyhan\Desktop\cyhan\gc\julia\bugs\53761+s.jl:13
dispatch_optimize_constants at C:\Users\cyhan\.julia\packages\SymbolicRegression\0Tn2D\src\ConstantOptimization.jl:33 [inlined]
optimize_constants at C:\Users\cyhan\.julia\packages\SymbolicRegression\0Tn2D\src\ConstantOptimization.jl:19
unknown function (ip: 00000000568e8232)
unknown function (ip: 00000000568e7bc9)
Allocations: 66661981 (Pool: 66647076; Big: 14905); GC: 57
```

Full call stack (obtained by inserting a `throw` statement):

```
Stacktrace:
  [1] dispatch_optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, Node{Float64}}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, idx::Nothing)
    @ SymbolicRegression.ConstantOptimizationModule C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\ConstantOptimization.jl:25
  [2] optimize_constants(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}, member::PopMember{Float64, Float64, Node{Float64}}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
    @ SymbolicRegression.ConstantOptimizationModule C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\ConstantOptimization.jl:19
  [3] optimize_and_simplify_population(dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}, pop::Population{Float64, Float64, Node{Float64}}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, curmaxsize::Int64, record::Dict{String, Any})
    @ SymbolicRegression.SingleIterationModule C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SingleIteration.jl:122
  [4] _dispatch_s_r_cycle(in_pop::Population{Float64, Float64, Node{Float64}}, dataset::Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}; pop::Int64, out::Int64, iteration::Int64, verbosity::Int64, cur_maxsize::Int64, running_search_statistics::SymbolicRegression.AdaptiveParsimonyModule.RunningSearchStatistics)
    @ SymbolicRegression C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SymbolicRegression.jl:1122
  [5] macro expansion
    @ C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SymbolicRegression.jl:808 [inlined]
  [6] macro expansion
    @ C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SearchUtils.jl:112 [inlined]
  [7] _warmup_search!(state::SymbolicRegression.SearchUtilsModule.SearchState{Float64, Float64, Node{Float64}, Tuple{Population{Float64, Float64, Node{Float64}}, HallOfFame{Float64, Float64, Node{Float64}}, Dict{String, Any}, Float64}, Channel}, datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:serial, 1, false}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}})
    @ SymbolicRegression C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SymbolicRegression.jl:803
  [8] _equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}}, ropt::SymbolicRegression.SearchUtilsModule.RuntimeOptions{:serial, 1, false}, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, saved_state::Nothing)
    @ SymbolicRegression C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SymbolicRegression.jl:588
  [9] equation_search(datasets::Vector{Dataset{Float64, Float64, Matrix{Float64}, Vector{Float64}, Nothing, NamedTuple{(), Tuple{}}, Nothing, Nothing, Nothing, Nothing}}; niterations::Int64, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, saved_state::Nothing, return_state::Nothing, verbosity::Nothing, progress::Nothing, v_dim_out::Val{1})
    @ SymbolicRegression C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SymbolicRegression.jl:564
 [10] equation_search(X::Matrix{Float64}, y::Matrix{Float64}; niterations::Int64, weights::Nothing, options::Options{Int64, DynamicExpressions.OperatorEnumModule.OperatorEnum{Tuple{typeof(+), typeof(-), typeof(/), typeof(*)}, Tuple{}}, Node, false, false, nothing, StatsBase.Weights{Float64, Float64, Vector{Float64}}}, variable_names::Nothing, display_variable_names::Nothing, y_variable_names::Nothing, parallelism::Symbol, numprocs::Nothing, procs::Nothing, addprocs_function::Nothing, heap_size_hint_in_bytes::Nothing, runtests::Bool, saved_state::Nothing, return_state::Nothing, loss_type::Type{Nothing}, verbosity::Nothing, progress::Nothing, X_units::Nothing, y_units::Nothing, v_dim_out::Val{1}, multithreaded::Nothing, varMap::Nothing)
    @ SymbolicRegression C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SymbolicRegression.jl:405
 [11] #equation_search#28
    @ C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\src\SymbolicRegression.jl:435 [inlined]
 [12] top-level scope
    @ C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\bugs-53761.jl:7
in expression starting at C:\Users\cyhan\Desktop\cyhan\jl\SymbolicRegression.jl\bugs-53761.jl:7
```

Julia v1.6 + SymbolicRegression v0.24.2 just gives a LoadError:

> julia +1.6 .\53761.jl
ERROR: LoadError: return type HallOfFame{Float32, Float32, Node{Float32}} does not match inferred return type Any
Stacktrace:
 [1] error(s::String)
   @ Base .\error.jl:33
 [2] top-level scope
   @ C:\Users\cyhan\Desktop\cyhan\gc\julia\bugs\53761.jl:27
in expression starting at C:\Users\cyhan\Desktop\cyhan\gc\julia\bugs\53761.jl:26
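That error is the standard failure mode of `Test.@inferred`: the call returns a concrete value at runtime, but inference can only derive `Any`. A minimal sketch reproducing the same class of error (a made-up `bump` function, unrelated to SymbolicRegression):

```julia
using Test

# A non-constant global is inferred as `Any`, so a function that reads it
# has an inferred return type of `Any` even though the runtime value is an Int.
global counter = 0
bump() = counter + 1

bump()             # runs fine and returns 1
# @inferred bump() # throws: return type Int64 does not match inferred return type Any
```

So on v0.24.2 the inference regression still exists, but it now surfaces as a clean `@inferred` failure rather than a compiler crash.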