JuliaAI / MLJ.jl

A Julia machine learning framework
https://juliaai.github.io/MLJ.jl/

Meaning of the various methods for unsupervised models? #748

Closed sylvaticus closed 3 years ago

sylvaticus commented 3 years ago

While the documentation is thorough in describing the parameters and return values of the various functions for unsupervised methods (fit, transform, inverse_transform and predict), what I don't have clear is their meaning: what is each of them supposed to do?

Personally I have three cases of interest:

For each of them, the normal workflow is a single function call: take X and either return a modified version of X (the missing imputation) or new information for each record, with the class assigned (either a scalar or a probability).

How does this one-function workflow translate to the {fit, transform, inverse_transform, predict} paradigm in MLJ?

OkonSamuel commented 3 years ago

For cases 1 & 2, devs are allowed to implement predict functions, even for unsupervised models, that can predict categorical or Univariate objects; see here and here. For case 3 there aren't any implementations of this yet, although devs should be able to implement this feature in their transform(model, fitresult, X) function.
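
As an illustration only, here is a minimal sketch of such a predict method for a hypothetical clusterer (MyClusterer is not a real MLJ model, and the fit below just picks the first k rows as centres):

import MLJModelInterface
const MMI = MLJModelInterface

# hypothetical unsupervised model whose predict returns one categorical
# label (a cluster assignment) per observation
mutable struct MyClusterer <: MMI.Unsupervised
    k::Int
end

function MMI.fit(model::MyClusterer, verbosity::Int, X)
    Xarray = MMI.matrix(X)
    # toy "training": take the first k rows as cluster centres
    centres = Xarray[1:model.k, :]
    return centres, nothing, NamedTuple()   # fitresult, cache, report
end

# predict is allowed for unsupervised models too: assign each row of Xnew
# to its nearest centre and return the assignments as a categorical vector
function MMI.predict(model::MyClusterer, fitresult, Xnew)
    Xarray = MMI.matrix(Xnew)
    labels = [argmin([sum(abs2, row .- fitresult[j, :]) for j in 1:size(fitresult, 1)])
              for row in eachrow(Xarray)]
    return MMI.categorical(labels)
end

A probabilistic variant would instead return UnivariateFinite objects from predict.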

sylvaticus commented 3 years ago

Thank you, but what are these functions supposed to do for unsupervised models?

For example, for a supervised model we can define, in general terms:
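
(a rough sketch with placeholder names, not the actual MLJ signatures)

θ = fit(model, X, y)           # learn the parameters θ from features X and the target y
ŷ = predict(model, θ, Xnew)    # apply the learned parameters to new features Xnew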

For unsupervised models I am interested in finding some characteristics of my dataset, whether a hidden partition (clustering) or the dimensions that drive most of the variance (PCA)... but I can't understand the fit/predict/transform/inverse_transform abstraction for these cases...

OkonSamuel commented 3 years ago

I'm not sure I understand your question. Take, for example, a PCA model. Suppose I want users to have access to the PCA projection matrix, principal variances and total variance, which are characteristics of the dataset. This can be implemented as an unsupervised model as shown below.

# imports needed to make the example self-contained:
# MLJModelInterface (aliased MMI) supplies the model API,
# MultivariateStats (aliased MS) supplies the PCA implementation
import MLJModelInterface
import MultivariateStats
const MMI = MLJModelInterface
const MS = MultivariateStats
using MLJModelInterface: @mlj_model

# define the PCA model struct as an unsupervised model
@mlj_model mutable struct PCA <: MMI.Unsupervised
    maxoutdim::Int = 0::(_ ≥ 0)
    method::Symbol = :auto::(_ in (:auto, :cov, :svd))
    pratio::Float64 = 0.99::(0.0 < _ ≤ 1.0)
    mean::Union{Nothing, Real, Vector{Float64}} = nothing
end
# define a fit method
function MMI.fit(model::PCA, verbosity::Int, X)
    Xarray = MMI.matrix(X)
    mindim = minimum(size(Xarray))
    maxoutdim = model.maxoutdim == 0 ? mindim : model.maxoutdim
    # MultivariateStats expects observations as columns, hence the transpose
    fitresult = MS.fit(
        MS.PCA, transpose(Xarray);
        method=model.method,
        pratio=model.pratio,
        maxoutdim=maxoutdim,
        mean=model.mean
    )
    cache = nothing
    report = (
        indim=MS.indim(fitresult),
        outdim=MS.outdim(fitresult),
        tprincipalvar=MS.tprincipalvar(fitresult),
        tresidualvar=MS.tresidualvar(fitresult),
        tvar=MS.tvar(fitresult),
        mean=copy(MS.mean(fitresult)),
        principalvars=copy(MS.principalvars(fitresult))
    )
    return fitresult, cache, report
end

# define a fitted_params method so users can extract important learned parameters from the fitresult
function MMI.fitted_params(::PCA, fitresult)
    return (projection=copy(MS.projection(fitresult)),)
end

# define a transform method that transforms any input feature table X
function MMI.transform(::PCA, fitresult, X)
    # X has observations as rows; MS.transform expects them as columns,
    # so transpose on the way in and back on the way out
    Xarray = MMI.matrix(X)
    Xnew = transpose(MS.transform(fitresult, transpose(Xarray)))
    return MMI.table(Xnew, prototype=X)
end

# now bind the model to some data X in a machine and train it
# (machine and fit! are provided by MLJ/MLJBase, not by MLJModelInterface)
pca_mach = machine(PCA(), X)
fit!(pca_mach)

So, to extract the projection matrix projection from pca_mach one would call fitted_params(pca_mach).projection, and to get the principal variances principalvars one would call report(pca_mach).principalvars. Note that quantities that aren't extractable from the fitresult are placed in the report.
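
As a usage sketch (assuming MLJ itself is loaded on top of the code above; the data and variable names are purely illustrative):

using MLJ                       # provides machine, fit!, fitted_params, report, transform

X = MLJ.table(rand(100, 5))     # any Tables.jl-compatible table works

pca_mach = machine(PCA(), X)    # bind the model to the data
fit!(pca_mach)                  # train the machine, i.e. run MMI.fit

W = fitted_params(pca_mach).projection    # the learned projection matrix
vars = report(pca_mach).principalvars     # the principal variances, from the report
Xproj = transform(pca_mach, X)            # X projected onto the principal components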

I hope I understood your question.

sylvaticus commented 3 years ago

Yes, I think I got it for PCA... thank you for your replies...