arvganesh / SimpleExpSmoothing.jl

Implementing Simple Exponential Smoothing (SES) in Julia.

MLJ integration? #1

Open ablaom opened 3 years ago

ablaom commented 3 years ago

As I understand it, the smoother is a one-shot transformer, in the sense that the parameters you "learn" are not ever applied to any data except the data you "learned" from. Yes?

So, if the idea is to implement the smoother as an MLJ model, here is my suggestion:

Implement it as a Static transformer. This means there is no MLJModelInterface.fit to implement, only an MLJModelInterface.transform method that will probably combine both the local fit! and predict methods.
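For concreteness, a minimal sketch of what that could look like. The field name alpha, the keyword constructor, and the initialization with the first observation are illustrative assumptions, not the package's actual API:

```julia
using MLJModelInterface
const MMI = MLJModelInterface

# A Static model: nothing learned generalizes to new data, so there is no fit to implement.
mutable struct ExponentialSmoothing <: MMI.Static
    alpha::Float64   # smoothing parameter, assumed to satisfy 0 < alpha <= 1
end
ExponentialSmoothing(; alpha=0.5) = ExponentialSmoothing(alpha)

# `transform` plays the role of the local fit! + predict: it smooths the series it is
# handed. The second argument is the (trivial) fitresult, ignored for Static models.
function MMI.transform(model::ExponentialSmoothing, _, y::AbstractVector{<:Real})
    s = Vector{Float64}(undef, length(y))
    s[1] = y[1]                                   # initialize with the first observation
    for t in 2:length(y)
        s[t] = model.alpha * y[t] + (1 - model.alpha) * s[t-1]
    end
    return s
end
```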

Scitypes

MLJModelInterface.input_scitype(::Type{<:ExponentialSmoothing}) = AbstractVector{Continuous}
MLJModelInterface.output_scitype(::Type{<:ExponentialSmoothing}) = AbstractVector{Continuous}
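Assuming the sketch above, usage through MLJ's machine interface would look something like this; because the model is Static, the machine binds to no training data and fit! is a trivial step:

```julia
using MLJ  # assumes MLJ is installed and the sketch above is loaded

smoother = ExponentialSmoothing(alpha=0.3)
mach = machine(smoother)        # Static models bind to no data
fit!(mach)                      # no learning happens here

y = randn(100)
y_smooth = transform(mach, y)   # smoothed series, same length as y
```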

Small suggestion: "ExponentialSmoother" is probably a better name than "ExponentialSmoothing". One tends to anthropomorphise these things ("transformer" not "transformation", "classifier" not "classification", and so forth).

@vollmersj

arvganesh commented 3 years ago

Hi @ablaom, thanks for your comments.

the parameters you "learn" are not ever applied to any data except the data you "learned" from.

Yes, this is correct.

The name suggestion makes sense. I plan to migrate this code into https://github.com/ababii/Pythia.jl to live alongside the other algorithms there, and I will change the name when I do.