Closed tedkornish closed 9 years ago
Hi Ted.
Actually, HLearn used to have a Naive Bayes implementation, but I've temporarily removed it. I'm not happy with the way Haskell currently handles random variables and distribution-like objects, and one of my future goals for HLearn is a more mathematically sound interface. So thanks for the interest, but I don't think this is a good intro project.
A better intro project would be to implement a new optimization method. The optimization interface is getting pretty well fleshed out, and other people have had some success contributing in this way.
Sounds good. I'll poke around the Git history and see what other people have contributed, then do some reading and try to pick a good jumping-off point. Before I start, I'll probably shoot you an email to make sure that what I'm working on is productive and useful to the library.
Thanks!
The dbrent method for univariate optimization is what I had in mind when I said that.
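For context, dbrent refers to Brent's minimization method augmented with derivative information. I don't know HLearn's actual optimization interface, so the sketch below is a deliberately simplified stand-in, not the real dbrent algorithm: it just bisects on the sign change of the derivative f' to locate a stationary point. All names are hypothetical.

```haskell
-- Sketch only: sign-change bisection on f', a crude stand-in for the
-- real dbrent method (Brent's minimizer with derivative information).
-- Assumes f' lo and f' hi have opposite signs, so f has a stationary
-- point inside the bracket.
minimize :: (Double -> Double)  -- ^ derivative f'
         -> Double              -- ^ lower bracket
         -> Double              -- ^ upper bracket
         -> Double              -- ^ tolerance on the bracket width
         -> Double
minimize f' lo hi tol
  | hi - lo < tol       = mid
  | f' lo * f' mid <= 0 = minimize f' lo mid tol  -- sign change in left half
  | otherwise           = minimize f' mid hi tol  -- sign change in right half
  where mid = (lo + hi) / 2

main :: IO ()
main = do
  -- f x = (x - 3)^2 has derivative f' x = 2 * (x - 3), minimum at x = 3
  print (minimize (\x -> 2 * (x - 3)) 0 10 1e-9)
```

A real dbrent implementation would also use function values and fall back to golden-section steps when the derivative-guided step misbehaves; this sketch only shows the shape of a univariate optimizer that such an interface would need to accommodate.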
Solid! That sounds like a great start.
Hey Mike,
I'm interested in contributing as a fun exercise for my ML and Haskell chops. A disclaimer: I don't have a strong background in machine learning implementations, and I haven't been writing Haskell for more than a few months. However, I'm willing to work hard, read papers, and learn the ropes.
It looks like HLearn is currently lacking a Naive Bayesian Classifier implementation - do you think that would make a good initial project?
Cheers, T
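For reference, the kind of classifier being proposed here can be sketched in a few lines. This is a toy categorical Naive Bayes with add-one (Laplace) smoothing, purely illustrative and unrelated to HLearn's removed implementation; labels and feature values are plain Strings, and the two-outcome smoothing denominator is a simplifying assumption.

```haskell
import qualified Data.Map.Strict as Map
import Data.List (maximumBy)
import Data.Ord (comparing)

-- Toy categorical Naive Bayes sketch (not HLearn's API).
type Example = (String, [String])  -- (label, feature values)

classify :: [Example] -> [String] -> String
classify train feats = fst (maximumBy (comparing snd) scored)
  where
    counts = Map.fromListWith (+) [(l, 1 :: Double) | (l, _) <- train]
    n      = fromIntegral (length train)
    -- P(feature i = v | label), with add-one smoothing over an
    -- assumed two-outcome feature (hence the +2 in the denominator)
    cond l i v =
      let matches = length [() | (l', fs) <- train, l' == l, fs !! i == v]
          total   = Map.findWithDefault 0 l counts
      in (fromIntegral matches + 1) / (total + 2)
    -- log prior + sum of log conditional likelihoods
    score l = log (Map.findWithDefault 0 l counts / n)
            + sum [log (cond l i v) | (i, v) <- zip [0 ..] feats]
    scored = [(l, score l) | l <- Map.keys counts]

main :: IO ()
main = do
  let train = [("yes", ["sunny"]), ("yes", ["sunny"]), ("no", ["rainy"])]
  putStrLn (classify train ["sunny"])  -- prints "yes"
```

The "naive" part is the independence assumption in `score`: each feature's log likelihood is summed as if the features were conditionally independent given the label.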