JuliaAI / MLJTuning.jl

Hyperparameter optimization algorithms for use in the MLJ machine learning framework
MIT License

Allow hyper-parameter tuning for immutable models. #174

Open ablaom opened 2 years ago

ablaom commented 2 years ago

Some context: https://github.com/JuliaML/TableTransforms.jl/issues/67

I don't think this would be too bad, and it would be useful preparation for making the MLJ model interface more flexible later.

The MLJTuning API doesn't really touch on this point. A tuning strategy needs to implement a `models` method to generate the models to be evaluated, but the API doesn't say how those models are generated; they needn't be mutations of a single object. However, the MLJ model interface currently states that models must be mutable, so some tuning strategies do use mutation to generate their models.
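To illustrate the distinction, here is a minimal sketch contrasting the two generation styles. The model types and function names below are hypothetical, for illustration only; they are not MLJTuning internals:

```julia
# Hypothetical model types, for illustration only (not MLJ code).

# Mutation-based generation, as some strategies do today
# (requires the model struct to be mutable):
mutable struct MutableModel
    lambda::Float64
end

function mutating_models(prototype::MutableModel, lambdas)
    map(lambdas) do λ
        m = deepcopy(prototype)  # copy the prototype, then mutate the field
        m.lambda = λ
        m
    end
end

# Immutable-friendly alternative: reconstruct a fresh model
# instead of mutating, so plain (immutable) structs also work:
struct ImmutableModel
    lambda::Float64
end

# Return a copy of `model` with field `name` replaced by `value`,
# using only `getfield` and the default constructor; no `setproperty!`.
function with_field(model::T, name::Symbol, value) where T
    vals = map(fieldnames(T)) do f
        f === name ? value : getfield(model, f)
    end
    return T(vals...)
end

immutable_models(prototype, lambdas) =
    [with_field(prototype, :lambda, λ) for λ in lambdas]
```

A strategy written in the second style works for mutable and immutable model types alike, at the cost of assuming a positional constructor exists for the fields being rebuilt.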

TODO:

| tuning strategy | assumes model types are mutable | pkg providing strategy |
| --- | --- | --- |
| Grid | yes | MLJTuning.jl |
| RandomSearch | yes | MLJTuning.jl |
| LatinHypercube | yes | MLJTuning.jl |
| MLJTreeParzenTuning() | ? | TreeParzen.jl |
| ParticleSwarm | ? | MLJParticleSwarmOptimization.jl |
| AdaptiveParticleSwarm | ? | MLJParticleSwarmOptimization.jl |
| Explicit() | no | MLJTuning.jl |

cc @juliohm