JuliaAI / MLJ.jl

A Julia machine learning framework
https://juliaai.github.io/MLJ.jl/

EvoTree Error #490

Closed azev77 closed 4 years ago

azev77 commented 4 years ago

First I will train a model that works, LGBMRegressor, then two that don't: ARDRegressor (breaks at fit() with a "PyError", which you're aware of) and EvoTreeRegressor (breaks at machine() with a "MethodError"). @jeremiedb

using MLJ
X, y =  @load_boston;
train, test = partition(eachindex(y), .7, rng=333);

@load LGBMRegressor
mdl  = LGBMRegressor()
mach = machine(mdl, X, y)
fit!(mach, rows=train) 

@load ARDRegressor
mdl  = ARDRegressor()
mach = machine(mdl, X, y)
fit!(mach, rows=train) 

PyError ($(Expr(:escape, :(ccall(#= /Users/AZevelev/.julia/packages/PyCall/zqDXB/src/pyfncall.jl:43 =# @pysym(:PyObject_Call), PyPtr, (PyPtr, PyPtr, PyPtr), o, pyargsptr, kw))))) <class 'numpy.linalg.LinAlgError'>
LinAlgError('unrecoverable internal error.')
  File "/Users/AZevelev/.julia/conda/3/lib/python3.7/site-packages/sklearn/linear_model/_bayes.py", line 577, in fit
    sigma_ = update_sigma(X, alpha_, lambda_, keep_lambda, n_samples)
  File "/Users/AZevelev/.julia/conda/3/lib/python3.7/site-packages/sklearn/linear_model/_bayes.py", line 562, in update_sigma
    X[:, keep_lambda].T))
  File "/Users/AZevelev/.julia/conda/3/lib/python3.7/site-packages/sklearn/externals/_scipy_linalg.py", line 99, in pinvh
    s, u = decomp.eigh(a, lower=lower, check_finite=False)
  File "/Users/AZevelev/.julia/conda/3/lib/python3.7/site-packages/scipy/linalg/decomp.py", line 474, in eigh
    raise LinAlgError("unrecoverable internal error.")

pyerr_check at exception.jl:60 [inlined]
pyerr_check at exception.jl:64 [inlined]
_handle_error(::String) at exception.jl:81
macro expansion at exception.jl:95 [inlined]
#110 at pyfncall.jl:43 [inlined]
disable_sigint at c.jl:446 [inlined]
__pycall! at pyfncall.jl:42 [inlined]
_pycall!(::PyCall.PyObject, ::PyCall.PyObject, ::Tuple{Array{Float64,2},Array{Float64,1}}, ::Int64, ::Ptr{Nothing}) at pyfncall.jl:29
_pycall!(::PyCall.PyObject, ::PyCall.PyObject, ::Tuple{Array{Float64,2},Array{Float64,1}}, ::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at pyfncall.jl:11
(::PyCall.PyObject)(::Array{Float64,2}, ::Vararg{Any,N} where N; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at pyfncall.jl:86
(::PyCall.PyObject)(::Array{Float64,2}, ::Vararg{Any,N} where N) at pyfncall.jl:86
fit!(::PyCall.PyObject, ::Array{Float64,2}, ::Vararg{Any,N} where N; kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at Skcore.jl:100
fit!(::PyCall.PyObject, ::Array{Float64,2}, ::Array{Float64,1}) at Skcore.jl:100
fit(::ARDRegressor, ::Int64, ::NamedTuple{(:Crim, :Zn, :Indus, :NOx, :Rm, :Age, :Dis, :Rad, :Tax, :PTRatio, :Black, :LStat),NTuple{12,Array{Float64,1}}}, ::Array{Float64,1}) at ScikitLearn.jl:157
fit!(::Machine{ARDRegressor}; rows::Array{Int64,1}, verbosity::Int64, force::Bool) at machines.jl:183
(::StatsBase.var"#fit!##kw")(::NamedTuple{(:rows,),Tuple{Array{Int64,1}}}, ::typeof(fit!), ::Machine{ARDRegressor}) at machines.jl:146
top-level scope at untitled-69d5ad375bc84d57ff345abae9b40d8f:15
@load EvoTreeRegressor
mdl  = EvoTreeRegressor()
mach = machine(mdl, X, y)

MethodError: no method matching machine(::EvoTreeRegressor{Float64,EvoTrees.Linear,Int64}, ::NamedTuple{(:Crim, :Zn, :Indus, :NOx, :Rm, :Age, :Dis, :Rad, :Tax, :PTRatio, :Black, :LStat),NTuple{12,Array{Float64,1}}}, ::Array{Float64,1})
Closest candidates are:
  machine(!Matched::MLJModelInterface.Model, ::Any...) at /Users/AZevelev/.julia/packages/MLJBase/J41c0/src/machines.jl:92
  machine(!Matched::Union{IO, String}, ::Any...; kwargs...) at /Users/AZevelev/.julia/packages/MLJBase/J41c0/src/machines.jl:296
  machine(!Matched::MLJModelInterface.Model, ::Any, !Matched::AbstractNode) at /Users/AZevelev/.julia/packages/MLJBase/J41c0/src/composition/networks.jl:456
  ...
top-level scope at untitled-69d5ad375bc84d57ff345abae9b40d8f:20
jeremiedb commented 4 years ago

Regarding EvoTrees, could you specify the versions of MLJ and EvoTrees you used? I couldn't reproduce with the latest versions, MLJ v0.10.3 and EvoTrees v0.4.6:

using MLJ
X, y =  @load_boston;
train, test = partition(eachindex(y), .7, rng=333);
@load EvoTreeRegressor
mdl  = EvoTreeRegressor()
mach = machine(mdl, X, y)
fit!(mach, rows=train, verbosity=1)

(MLJtest) pkg> st
    Status `~/Evovest/MLJtest/Project.toml`
  [f6006082] EvoTrees v0.4.6
  [add582a8] MLJ v0.10.3
azev77 commented 4 years ago

I see the issue: before I ran the code I deleted all my Julia packages and installed MLJ, EvoTrees, etc. from scratch, which gave me EvoTrees v0.3.0. I then manually installed the latest EvoTrees (add https://github.com/Evovest/EvoTrees.jl) and things now work.

I've had this type of issue before, where installing one package prevents me from installing the latest version of another.

jeremiedb commented 4 years ago

To avoid this kind of package-versioning bottleneck, I think it's good practice to create a dedicated Julia project environment for every project. For example:

shell> mkdir MyProject
shell> cd MyProject
] activate .

Doing so isolates the packages within the given project and minimizes exposure to potential dependency conflicts.
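Concretely, once the project is activated, packages are added inside it and the resolved versions can be checked (the prompt shown is illustrative; output will vary):

(MyProject) pkg> add MLJ EvoTrees
(MyProject) pkg> st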

ablaom commented 4 years ago

@azev77 Can we close this now?

azev77 commented 4 years ago

Yes. Yesterday they released EvoTrees v0.4.7, which fixed this.

@ablaom Question: what is the advantage of a mutating function, fit!(m, rows=train), as opposed to res = fit(m, rows=train)?

ablaom commented 4 years ago

From the upcoming design paper:


Model interfaces

MLJ provides a basic fit/update/predict interface to be implemented by new supervised models. For unsupervised models, predict is replaced with transform and an optional inverse_transform method. These methods operate on models, which are mutable structs storing hyper-parameters and nothing else. This model interface is purely functional, for maximum flexibility. Presently the general MLJ user is encouraged to interact through the machine interface sitting on top; see more on this below.
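To make this concrete, here is a minimal sketch of the unsupervised variant; MyStandardizer and its implementation are hypothetical, not part of MLJ:

import MLJModelInterface
const MMI = MLJModelInterface
using Statistics

# Hypothetical unsupervised model: a hyper-parameter-free standardizer.
mutable struct MyStandardizer <: MMI.Unsupervised end

# fit learns the column means and standard deviations.
function MMI.fit(::MyStandardizer, verbosity, X)
    Xmat = MMI.matrix(X)
    fitresult = (mean(Xmat, dims=1), std(Xmat, dims=1))
    return fitresult, nothing, nothing   # (fitresult, cache, report)
end

MMI.transform(::MyStandardizer, (mu, sigma), X) = (MMI.matrix(X) .- mu) ./ sigma
MMI.inverse_transform(::MyStandardizer, (mu, sigma), Z) = Z .* sigma .+ mu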

The model interface

In MLJ a model is just a struct storing the hyper-parameters associated with some learning algorithm suggested by the struct name (e.g., DecisionTreeClassifier), and that is all. In the low-level, functional-style model interface, learned parameters are not stored, only passed around. Learned parameters are stored in machines (which additionally point to the hyper-parameters stored in a model); see below. The separation of hyper-parameters and learned parameters is essential to flexible model composition.
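A minimal sketch of such a struct, assuming MLJModelInterface as the glue package (MyRidgeRegressor and its single hyper-parameter are hypothetical):

import MLJModelInterface
const MMI = MLJModelInterface

# Hypothetical model: the struct stores the hyper-parameter lambda
# and nothing else; learned coefficients live elsewhere.
mutable struct MyRidgeRegressor <: MMI.Deterministic
    lambda::Float64
end
MyRidgeRegressor(; lambda=1.0) = MyRidgeRegressor(lambda)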

For supervised models the fit method has this signature:

fit(model, verbosity, X, y)

where X is training input and y the training target. The method outputs a triple, typically denoted (fitresult, cache, report).
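Continuing the hypothetical MyRidgeRegressor above, a fit implementation might look like this (an illustrative sketch, not any package's actual code):

using LinearAlgebra

function MMI.fit(model::MyRidgeRegressor, verbosity, X, y)
    Xmat = MMI.matrix(X)                       # any table -> Float64 matrix
    XtX, Xty = Xmat'Xmat, Xmat'y
    fitresult = (XtX + model.lambda*I) \ Xty   # ridge coefficients
    cache = (XtX, Xty)                         # "state" handed on to update
    report = nothing
    return fitresult, cache, report
end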

The fitresult stores the learned parameters, which must include everything needed by predict to make predictions, apart from model and new input data:

predict(model, fitresult, X)
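For the same hypothetical model, predict simply applies the stored coefficients to new input:

MMI.predict(::MyRidgeRegressor, fitresult, Xnew) = MMI.matrix(Xnew) * fitresult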

The purpose of cache is to pass on "state" not included in the fitresult to an update method that the model implementer may optionally overload:

update(model, verbosity, fitresult, cache, X, y)

This method is to be called instead of fit (and passed the fitresult and cache returned by the fit call) when retraining using identical data. (The data X, y are included for implementer convenience.) It provides an opportunity for the model implementer to avoid unnecessary repetition of code execution. The three main use-cases are: iterative models, where increasing an iteration parameter (the number of trees in a random forest, say) requires training only the additional iterations; avoiding repetition of algorithm-specific data preprocessing; and smarter training of composite models, where only component models whose hyper-parameters have changed need retraining.
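As an illustration, the hypothetical MyRidgeRegressor above could overload update to re-solve for a new lambda while reusing the Gram matrices stashed in cache by fit:

function MMI.update(model::MyRidgeRegressor, verbosity, fitresult, cache, X, y)
    XtX, Xty = cache                            # reuse work done by fit
    fitresult = (XtX + model.lambda*I) \ Xty    # only the solve is repeated
    return fitresult, cache, nothing
end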

In the future MLJ will add an update_data method to support models that can carry out on-line learning.

The machine interface

The general MLJ user trains models through its machine interface. This makes some workflows more convenient but, more significantly, introduces a syntax closely aligned with that for model composition (see below).

A machine is a mutable struct that binds a model to data at construction:

mach = machine(model, X, y)

When the user calls fit!(mach, rows=...), the fitresult, cache, and report variables generated by lower-level calls to fit or update are stored or updated in the machine struct mach, with training optionally restricted to the specified rows of data. To retrain with new hyper-parameters, the user simply mutates model and repeats the fit! call.

Syntax for predicting using a machine is predict(mach, Xnew).
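Putting the pieces together with the EvoTreeRegressor from earlier in the thread (a sketch assuming the versions discussed above, MLJ v0.10.3 and EvoTrees v0.4.6):

using MLJ
X, y = @load_boston
train, test = partition(eachindex(y), 0.7, rng=333)

@load EvoTreeRegressor
model = EvoTreeRegressor(nrounds=50)
mach  = machine(model, X, y)   # bind model to data
fit!(mach, rows=train)         # dispatches the lower-level fit

model.nrounds = 100            # mutate a hyper-parameter ...
fit!(mach, rows=train)         # ... and retrain; update is used where implemented

yhat = predict(mach, X)        # predictions from the trained machine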