JuliaAI / MLJTuning.jl

Hyperparameter optimization algorithms for use in the MLJ machine learning framework
MIT License

add serialization #165

Closed: olivierlabayle closed this pull request 2 years ago

olivierlabayle commented 2 years ago

This PR needs:

For some reason that I can't figure out, the tests from the dev branch fail on my local machine...

Todo:

ablaom commented 2 years ago

> For some reason that I can't figure out, the tests from the dev branch fail on my local machine...

Mmm. That's weird. I just checked on my local machine and didn't run into any problems. What's your stack trace?

olivierlabayle commented 2 years ago

> Mmm. That's weird. I just checked on my local machine and didn't run into any problems. What's your stack trace?

I get the following:

learning curves (accelerated with typename(CPUThreads)): Error During Test at /Users/olivierlabayle/Dev/MLJTuning.jl/test/test_utilities.jl:36
  Got exception outside of a @test
  MethodError: no method matching _tuning_results(::Vector{MersenneTwister}, ::CPU1{Nothing}, ::Machine{MLJTuning.DeterministicTunedModel{Grid, Main.Models.DeterministicEnsembleModel{FooBarRegressor}}, true}, ::Symbol, ::Int64)
  Closest candidates are:
    _tuning_results(::AbstractVector, ::CPU1, ::Any, ::Any, ::Any, ::Any) at ~/Dev/MLJTuning.jl/src/learning_curves.jl:210
    _tuning_results(::AbstractVector, ::CPUProcesses, ::Any, ::Any, ::Any, ::Any) at ~/Dev/MLJTuning.jl/src/learning_curves.jl:246
    _tuning_results(::AbstractVector, ::CPUThreads, ::Any, ::Any, ::Any, ::Any) at ~/Dev/MLJTuning.jl/src/learning_curves.jl:293
    ...
  Stacktrace:
    [1] _tuning_results(rngs::Vector{MersenneTwister}, acceleration::CPUThreads{Int64}, tuned::Machine{MLJTuning.DeterministicTunedModel{Grid, Main.Models.DeterministicEnsembleModel{FooBarRegressor}}, true}, rows::Nothing, rng_name::Symbol, verbosity::Int64)
      @ MLJTuning ~/Dev/MLJTuning.jl/src/learning_curves.jl:302
    [2] learning_curve(::Main.Models.DeterministicEnsembleModel{FooBarRegressor}, ::Source, ::Vararg{Source}; resolution::Int64, resampling::Holdout, weights::Nothing, measures::Nothing, measure::Nothing, rows::Nothing, operation::Nothing, ranges::Nothing, range::NumericRange{Int64, MLJBase.Bounded, Symbol}, repeats::Int64, acceleration::CPUThreads{Nothing}, acceleration_grid::CPU1{Nothing}, verbosity::Int64, rngs::Vector{MersenneTwister}, rng_name::Symbol, check_measure::Bool)
      @ MLJTuning ~/Dev/MLJTuning.jl/src/learning_curves.jl:173
    [3] #learning_curve#60
      @ ~/Dev/MLJTuning.jl/src/learning_curves.jl:92 [inlined]
    [4] macro expansion
      @ ~/Dev/MLJTuning.jl/test/learning_curves.jl:47 [inlined]
    [5] macro expansion
      @ /Applications/Julia-1.7.app/Contents/Resources/julia/share/julia/stdlib/v1.7/Test/src/Test.jl:1283 [inlined]
    [6] macro expansion
      @ ~/Dev/MLJTuning.jl/test/learning_curves.jl:24 [inlined]
    [7] top-level scope
      @ ~/Dev/MLJTuning.jl/test/test_utilities.jl:36
    [8] include(fname::String)
      @ Base.MainInclude ./client.jl:451
    [9] macro expansion
      @ /Applications/Julia-1.7.app/Contents/Resources/julia/share/julia/stdlib/v1.7/Test/src/Test.jl:445 [inlined]
   [10] macro expansion
      @ ~/Dev/MLJTuning.jl/test/runtests.jl:59 [inlined]
   [11] macro expansion
      @ /Applications/Julia-1.7.app/Contents/Resources/julia/share/julia/stdlib/v1.7/Test/src/Test.jl:1283 [inlined]
   [12] top-level scope
      @ ~/Dev/MLJTuning.jl/test/runtests.jl:59
   [13] include(fname::String)
      @ Base.MainInclude ./client.jl:451
   [14] top-level scope
      @ none:6
   [15] eval
      @ ./boot.jl:373 [inlined]
   [16] exec_options(opts::Base.JLOptions)
      @ Base ./client.jl:268
   [17] _start()
      @ Base ./client.jl:495

The test that triggers it seems to be this line in the learning curves tests, which later hits the MethodError here. I don't know to what extent running on a Mac M1 may explain this.
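For context, the MethodError above is Julia's standard failure mode when a call site passes a different number of positional arguments than any defined method accepts: the trace shows a five-argument call while every listed candidate takes six (the `rows` argument appears to be dropped). A minimal, hypothetical sketch (the `_results` function below is a stand-in, not MLJTuning's actual code):

```julia
# Stand-in for a six-positional-argument function like `_tuning_results`.
_results(rngs::AbstractVector, accel, tuned, rows, name, verbosity) = :ok

# Six arguments: dispatch finds the method.
_results([1, 2], :cpu1, nothing, nothing, :rng, 1)

# Five arguments (no `rows`): no method matches, so Julia throws a
# MethodError and lists the six-argument definition under
# "Closest candidates are:".
err = try
    _results([1, 2], :cpu1, nothing, :rng, 1)
catch e
    e
end
@assert err isa MethodError
```

This matches the shape of the report: the fix is at the call site (or a missing method overload for the `CPU1` acceleration type), not in the serialization code this PR adds.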

Otherwise everything runs fine, including the new serialization part.
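For readers unfamiliar with what "the serialization part" involves, the underlying mechanism in Julia is a save/restore round trip of a fitted object to disk. A minimal sketch using only the stdlib `Serialization` module (the struct below is purely illustrative; the PR itself targets MLJ `Machine`s, which go through MLJ's own save/restore API rather than raw `serialize`):

```julia
using Serialization

# Illustrative stand-in for a fitted object; not an MLJ type.
struct FittedStub
    coefs::Vector{Float64}
end

mktempdir() do dir
    path = joinpath(dir, "machine.jls")
    serialize(path, FittedStub([1.0, 2.0]))   # write to disk
    restored = deserialize(path)              # read it back
    @assert restored.coefs == [1.0, 2.0]      # round trip preserves the data
end
```

MLJ's machine-level API additionally strips training data and handles model-specific state before writing, which is the behavior this PR extends to tuned models.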

ablaom commented 2 years ago

> For some reason that I can't figure out, the tests from the dev branch fail on my local machine...

@olivierlabayle I think https://github.com/JuliaAI/MLJTuning.jl/pull/166 will fix this.

codecov-commenter commented 2 years ago

Codecov Report

Merging #165 (8937ce9) into dev (2505b48) will decrease coverage by 0.41%. The diff coverage is 0.00%.

@@            Coverage Diff             @@
##              dev     #165      +/-   ##
==========================================
- Coverage   87.62%   87.21%   -0.41%     
==========================================
  Files          11       12       +1     
  Lines         614      618       +4     
==========================================
+ Hits          538      539       +1     
- Misses         76       79       +3     
Impacted Files         Coverage Δ
src/serialization.jl    0.00% <0.00%> (ø)
src/tuned_models.jl    93.47% <0.00%> (+0.43%) ↑

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data