JuliaML / LearnBase.jl

Abstractions for Julia Machine Learning Packages

Refactoring of codebase #35

Open juliohm opened 4 years ago

juliohm commented 4 years ago

Dear all,

In this issue I would like to discuss a refactoring of LearnBase.jl to accommodate more general problems under transfer learning settings. Before I can do this, I would like to get your feedback on a few minor changes. These changes should facilitate a holistic view of the interface, and should help shape the workflow that developers are expected to follow (see #28).

Below are a few suggestions for improvement that I would like to consider.

Suggestions for improvement

  1. Split the main LearnBase.jl file into smaller source files with more specific concepts. For example, I'd like to review the Cost interface in a separate file called costs.jl. Similarly, we could move the data orientation interface to a separate file orientation.jl and include these two files in LearnBase.jl.

  2. Can we get rid of all exports in the module? I understand that this module is intended for use by developers who would write import LearnBase; const LB = LearnBase in their code. Exporting all the names from LearnBase.jl can lead to problems downstream: for example, LossFunctions.jl was not exporting the abstract SupervisedLoss type, so users of LossFunctions.jl also had to import LearnBase.jl just to get access to the name. My suggestion is to define the interface without exports; each package in JuliaML can then export the relevant concepts.

  3. The interface for learning models is currently spread across several Julia ecosystems. In most cases, there are two functions that developers need to implement (e.g. fit/predict, model/update, fit/transform). I would like to do a literature review of the existing approaches and generalize them to transfer learning settings. This generalization shouldn't force users to subtype their models from some Model type. A traits-based interface is ideal both for developers who want to plug their models in after the fact and for developers interested in fitting entire pipelines (e.g. AutoMLPipeline.jl).

I would like to start addressing (1) and (2) in the following weeks. In order to address (3) I need more time to investigate and brainstorm a more general interface.
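To make (3) concrete, here is a minimal sketch of what a traits-based interface could look like. All names here (SketchAPI, islearner, MeanModel, FittedMean) are hypothetical and invented for illustration; the point is only that a third-party model opts in by defining methods on generic functions, without subtyping any Model type.

```julia
# Hypothetical sketch of a traits-based fit/predict interface.
module SketchAPI

# Trait marker: packages opt in by defining this for their own types.
islearner(::Type) = false
islearner(x) = islearner(typeof(x))

# Generic entry points; concrete packages add methods for their models.
function fit end
function predict end

end # module

# A third-party model plugs in "after the fact", with no subtyping:
struct MeanModel end                 # always predicts the training mean
struct FittedMean; μ::Float64 end    # result of fitting a MeanModel

SketchAPI.islearner(::Type{MeanModel}) = true
SketchAPI.fit(::MeanModel, y) = FittedMean(sum(y) / length(y))
SketchAPI.predict(m::FittedMean, n::Integer) = fill(m.μ, n)

fitted = SketchAPI.fit(MeanModel(), [1.0, 2.0, 3.0])
SketchAPI.predict(fitted, 2)   # [2.0, 2.0]
```

Because dispatch happens on the generic functions rather than on an abstract supertype, pipeline packages can consume any model that defines these methods.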

juliohm commented 4 years ago

Issues (1) and (2) have been addressed. The next step in the roadmap is a literature review on existing interfaces for learning models.

CarloLucibello commented 4 years ago

Here @oxinabox outlines the current problems with the JuliaML ecosystem; I'll paste them below for convenience:

CarloLucibello commented 4 years ago

Following those suggestions, I would ditch most of the types and type hierarchies in this repo in order to improve interoperability with the ecosystem. For example:

using Statistics: mean

function mse_loss(ŷ, y; agg=mean)
    return agg((ŷ .- y).^2)
end

mse_loss(ŷ, y)
mse_loss(ŷ, y, agg=sum)
mse_loss(ŷ, y, agg=x -> sum(x, dims=2))  # partial reduction
mse_loss(ŷ, y, agg=x -> mean(w .* x))    # weighted mean (w defined elsewhere)
mse_loss(ŷ, y, agg=identity)             # no aggregation; we could also support `nothing` here
juliohm commented 4 years ago

I really love this idea @CarloLucibello, we definitely need a new minor release with these improvements in the near future. I am finishing a cycle now with LearnBase.jl + LossFunctions.jl, but we should definitely come back to this issue and simplify the code even further with dims and agg arguments :100:

My only divergence from the example above is the lack of types for losses. I think we should find a design in between, where a type like L2DistLoss is still available with a set of traits attached to it, similar to the Distances.jl interface:

value(L2DistLoss(), yhat, y, agg=mean)

It is important to have explicit types for the losses because they give us the flexibility to dispatch on and manipulate the concept.
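As a sketch of that middle ground: L2DistLoss and value mirror the names used in this thread, while the trait functions below are invented for illustration and are not an actual LearnBase.jl API.

```julia
using Statistics: mean

# Hypothetical sketch: a typed loss that still supports the `agg` keyword.
struct L2DistLoss end

# Traits attached to the type let generic code reason about the loss.
isdistancebased(::L2DistLoss) = true
isconvex(::L2DistLoss) = true

# Elementwise evaluation, with aggregation layered on via a keyword;
# the default `identity` returns the unaggregated values.
value(::L2DistLoss, ŷ, y; agg=identity) = agg((ŷ .- y) .^ 2)

value(L2DistLoss(), [1.0, 2.0], [1.0, 0.0], agg=mean)  # mean of [0.0, 4.0] == 2.0
```

With the type in hand, downstream code can dispatch on L2DistLoss (e.g. to pick a specialized solver for convex losses) while users keep the flexible agg interface.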

joshday commented 4 years ago

@juliohm the latest LearnBase release breaks things downstream (PenaltyFunctions). Where did prox and similar functions go?

juliohm commented 4 years ago

@joshday, sorry, I remember that prox and maybe prox! were not documented in LearnBase.jl. After grepping the packages inside JuliaML I couldn't find any use cases. Could you please explain what they mean? I can submit a PR with an appropriate docstring reintroducing the names, and tag a patch release if you want.

joshday commented 4 years ago

It's the proximal mapping/operator (https://en.wikipedia.org/wiki/Proximal_operator)

Both prox and prox! are used in PenaltyFunctions.
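For context, a minimal sketch of the prox/prox! pair using the L1 penalty, whose proximal operator is elementwise soft-thresholding. L1Penalty and these signatures are illustrative assumptions, not necessarily the exact PenaltyFunctions.jl API.

```julia
# prox_{λg}(x) = argmin_u  g(u) + ‖u − x‖² / (2λ)
# For g(u) = ‖u‖₁ this minimizer is elementwise soft-thresholding.
struct L1Penalty end  # hypothetical penalty type for illustration

prox(::L1Penalty, x::Real, λ::Real) = sign(x) * max(abs(x) - λ, zero(x))

# In-place variant over an array, mirroring the prox/prox! pair.
function prox!(::L1Penalty, x::AbstractVector, λ::Real)
    @. x = sign(x) * max(abs(x) - λ, 0)
    return x
end

prox(L1Penalty(), 3.0, 1.0)   # 2.0
prox(L1Penalty(), -0.5, 1.0)  # shrunk to zero
```

These operators are the building block of proximal-gradient methods, which is why a penalty-function package needs them in the shared interface.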

No need for a PR, but could you add them back and make a new release?

juliohm commented 4 years ago

Of course. Will do it now :+1:

AriMKatz commented 4 years ago

Where do you see the MLJ and Flux ecosystems in all this?

juliohm commented 4 years ago

I cannot speak for everyone in JuliaML, but I can share my personal view on it. Hopefully at some point in the near future we will be able to lay out a foundational API that is free from specific academic or enterprise organizations, and that is organically evolved by the Julia community as a whole.

We have many efforts spread around Flux, MLJ, JuliaStats, and so on, and I hope they will converge at some point to a carefully designed Julian API that all these projects can share. Some projects like MLJ are great for getting things moving, and are also great as a playground to try out design ideas. However, I do feel that MLJ, for instance, overuses macros, does not take full advantage of Julia generics when it follows the simplistic fit/predict approach, and is overly complex for users with a beginner to moderate background in ML.

My personal opinion is that we need to brainstorm a simple yet expressive API that does not compromise the experience of new Julia users and that can also handle very advanced usages. It is a challenging goal, but I think that is what the Julia community is about: pushing the boundaries of things.

Stay tuned, I should come back with some updates on this effort in the following weeks hopefully.

AriMKatz commented 4 years ago

Interesting. Looking forward to hearing more.