Open ParadaCarleton opened 11 months ago

Now that we have Optimization.jl, would it be possible to support a broad class of optimizers for MLJTuning by just supporting the Optimization.jl interface?

That's possible, and I'd be happy to support such an effort. I think it will be a bit of work.

At this point, many MLJ predictors do not have differentiable output, so if you're looking to apply optimisers that require gradients, be aware that this is something yet to be sorted out (but worthwhile). Related: https://github.com/FluxML/MLJFlux.jl/issues/220 .
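To make the proposal concrete, here is a rough sketch of how an Optimization.jl problem could wrap an MLJ cross-validation loss. This is a hypothetical bridge, not an existing MLJTuning strategy: the objective name `cv_loss` and the choice of model and hyperparameter are illustrative, and a derivative-free solver (Nelder-Mead) is used since MLJ predictors are generally not differentiable.

```julia
using MLJ, Optimization, OptimizationOptimJL

Tree = @load DecisionTreeRegressor pkg=DecisionTree verbosity=0
X, y = make_regression(100, 3)

# Objective in Optimization.jl's (u, p) form: cross-validated loss
# as a function of a single continuous hyperparameter.
function cv_loss(u, p)
    model = Tree(min_purity_increase = u[1])
    e = evaluate(model, X, y;
                 resampling = CV(nfolds = 3),
                 measure = rms, verbosity = 0)
    return e.measurement[1]
end

# Derivative-free solve, so no gradient of the predictor is needed.
prob = OptimizationProblem(OptimizationFunction(cv_loss), [0.0])
sol = solve(prob, NelderMead())
```

If something along these lines worked as a generic tuning strategy, any solver implementing the Optimization.jl interface would become usable with MLJTuning; gradient-based solvers would additionally need the differentiable-output issue above to be resolved.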