JuliaML / LossFunctions.jl

Julia package of loss functions for machine learning.
https://juliaml.github.io/LossFunctions.jl/stable

Remove conditional dependencies #17

Closed Evizero closed 8 years ago

Evizero commented 8 years ago

Right now there is still this placeholder code I was working on: https://github.com/Evizero/LearnBase.jl/blob/master/src/optim/optim.jl#L6-L26. I think the current best way would be to utilize Optim.jl via Requires.
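For context, the Requires.jl pattern of that era looked roughly like the sketch below: a `@require` block whose body is evaluated only if the user also loads Optim.jl, so Optim never becomes a hard dependency. The helper name `optimize_loss` is purely illustrative and not part of LearnBase.

```julia
using Requires

# The body of @require runs only once the user also does `using Optim`,
# keeping Optim out of the package's hard dependency list.
@require Optim begin
    # Hypothetical helper: minimize a loss from a starting point via Optim.
    function optimize_loss(loss, x0)
        Optim.optimize(loss, x0)
    end
end
```

This is a configuration-level sketch of the conditional-loading mechanism under discussion, not a proposal for LearnBase's actual API.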

tbreloff commented 8 years ago

I certainly like the idea of optional Optim functionality. I don't have enough experience with it to say which method of interfacing will be best. The plan is that we would have our own solvers that are used by default, though... right? Or maybe the optimizer (and the Optim dep) shouldn't be part of LearnBase? Could it be part of a meta-package Learn.jl which is more like a scikit-learn lab?

Evizero commented 8 years ago

It seems it is generally encouraged not to use Requires anymore

Evizero commented 8 years ago

https://github.com/shashi/ComputeFramework.jl/pull/33#issuecomment-219270869

cstjean commented 8 years ago

> It seems it is generally encouraged not to use Requires anymore

That's a bummer. FWIW, I've been working with a few package maintainers to support the ScikitLearn.jl interface, and the dialog goes like

Hey, wanna support ScikitLearn.jl?

Maybe, but I really don't want to add dependencies

No worries, that's why I made ScikitLearnBase, which is super-minimal

Oh, OK then

So if I'm understanding LearnBase correctly, it might benefit from a similar split.
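The "super-minimal base package" pattern cstjean describes can be sketched in a few lines: a tiny module that defines only generic function stubs (no implementations, no heavy dependencies), which downstream packages then extend for their own types. The module and function names below are illustrative, not the actual ScikitLearnBase or LearnBase API.

```julia
# A minimal interface-only package: zero dependencies, zero methods.
module TinyLearnBase

function fit! end      # fit!(model, X, y) -> trained model
function predict end   # predict(model, X) -> predictions

export fit!, predict

end # module

# A downstream package implements the interface for its own types:
mutable struct MeanModel
    mu::Float64
end

function TinyLearnBase.fit!(m::MeanModel, X, y)
    m.mu = sum(y) / length(y)   # toy model: predict the mean of y
    return m
end

TinyLearnBase.predict(m::MeanModel, X) = fill(m.mu, size(X, 1))
```

The point of the split is that supporting the interface costs a package only the tiny stub module, which is why the "No worries, it's super-minimal" pitch works.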

tbreloff commented 8 years ago

Heh. LearnBaseBase?!? I think the better approach would be to split optimization routines into a separate repo (Learn.jl?) that also imports data iteration and other goodies from the JuliaML ecosystem.


Evizero commented 8 years ago

yes, well, the best way to split or accumulate functionality here is still being debated. Even the scope isn't fully decided upon and is a moving target.

I actually did come across your efforts with a julia interface to scikit learn. I think that it is very important work and a big benefit for the julia community, but the goal here is to work towards something completely in julia and also explore the design possibilities that are unfeasible in other languages (since we don't have to have any backend code in C or some other low level language). The degree to which we will succeed is unclear but I am hopeful.

> I think the better approach would be to split optimization routines into a separate repo (Learn.jl?) that also imports data iteration and other goodies from the JuliaML ecosystem.

Yes, I am also gravitating towards slimming LearnBase down a bit

Evizero commented 8 years ago

related to slimming down LearnBase: https://github.com/JuliaML/MLDataUtils.jl/issues/5

cstjean commented 8 years ago

> the goal here is to work towards something completely in julia and also explore the design possibilities that are unfeasible in other languages (since we don't have to have any backend code in C or some other low level language). The degree to which we will succeed is unclear but I am hopeful.

I hope it works out! I'm waiting for the documentation to see if I can implement the ScikitLearnBase interface on top of the LearnBase models.

> Heh. LearnBaseBase?!? I think the better approach would be to split optimization routines into a separate repo (Learn.jl?) that also imports data iteration and other goodies from the JuliaML ecosystem.

Yes, I think we're talking about the same idea.

+1 for MLDataUtils. scikit-learn actually has a dozen KFold-like ways to split the data (http://scikit-learn.org/stable/modules/classes.html#module-sklearn.cross_validation), and I've translated the two most common, but it would be nicer to share them across ML libraries.
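The core of such a shared splitting utility is small; a k-fold split over observation indices 1:n can be sketched as below. The function name `kfolds` is illustrative here, not a claim about the MLDataUtils.jl API.

```julia
# Round-robin k-fold split of indices 1:n into (train, test) index pairs.
# Each index lands in exactly one test fold.
function kfolds(n::Integer, k::Integer)
    test_folds = [collect(j:k:n) for j in 1:k]           # fold j gets j, j+k, j+2k, ...
    [(setdiff(1:n, test), test) for test in test_folds]  # complement is the train set
end
```

A library like MLDataUtils could own a family of such splitters (stratified, shuffled, leave-one-out) so that every ML package iterates over the same (train, test) pairs.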

tbreloff commented 8 years ago

I think it would be amazing if your scikit-learn wrapper used LearnBase abstractions. It would be super powerful to be able to incorporate those models alongside neural nets and other stuff in ensembles, and there's a lot of functionality that will take time to replace with pure julia implementations.


cstjean commented 8 years ago

Yes, agreed.