elixir-nx / scholar

Traditional machine learning on top of Nx
Apache License 2.0

Implement regularized Bayesian linear algorithms #244

Closed JoaquinIglesiasTurina closed 5 months ago

JoaquinIglesiasTurina commented 6 months ago

In general, Bayesian methods offer greater flexibility at the cost of greater computational expense.

Bayesian regularization algorithms extend classical Ridge Regression with:

  * a more data-driven approach, estimating regularization parameters from the data in the case of Bayesian Ridge Regression.

I've put together a quick and very dirty proof of concept and concluded that these methods can be implemented within `defn`, and therefore fit the goals of Scholar.
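To illustrate why `defn` seems feasible, here is a minimal sketch (my own illustration, not the PoC code; the module and function names are hypothetical) of one MacKay-style evidence update for Bayesian ridge, using only ordinary Nx ops that compile under `defn`:

```elixir
defmodule BayesianRidgeSketch do
  import Nx.Defn

  # One MacKay-style evidence-maximization step for Bayesian ridge
  # (Bishop's notation: `alpha` is the weight precision, `beta` the
  # noise precision). Hypothetical sketch, not Scholar's actual API.
  defn em_step(x, y, alpha, beta) do
    n = Nx.axis_size(x, 0)
    p = Nx.axis_size(x, 1)

    # Posterior over the weights: sigma = (alpha*I + beta*X'X)^-1,
    # m = beta * sigma * X'y
    xtx = Nx.dot(Nx.transpose(x), x)
    sigma = Nx.LinAlg.invert(alpha * Nx.eye(p) + beta * xtx)
    m = beta * Nx.dot(sigma, Nx.dot(Nx.transpose(x), y))

    # Effective number of well-determined parameters
    gamma = p - alpha * Nx.sum(Nx.take_diagonal(sigma))

    # Re-estimate both precisions from the data
    new_alpha = gamma / Nx.dot(m, m)
    residual = y - Nx.dot(x, m)
    new_beta = (n - gamma) / Nx.dot(residual, residual)

    {m, new_alpha, new_beta}
  end
end
```

Iterating a step like this to convergence is, roughly, the fitting loop that scikit-learn's `BayesianRidge` runs.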

As such, I would like to keep working on this issue.

krstopro commented 6 months ago

Hi @JoaquinIglesiasTurina! I'd need to take a closer look at the theory and code, but from what I already see:

  1. There is a Bayesian ridge regression in scikit-learn.
  2. We don't have a Bayesian ridge regression module in Scholar. :)

In which case a PR is welcome. ^_^ (Speaking for myself, but I'm sure the others will agree on this.)

> As such, I would like to keep working on this issue.

I doubt anyone else is working on this, so please go ahead.

> a more data-driven approach, estimating regularization parameters from the data in the case of Bayesian Ridge Regression.

This usually comes at the cost of providing a prior distribution on model parameters, though. Still, it might be nice to have an implementation here.
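For concreteness, one common form of that data-driven re-estimation (MacKay's evidence approximation in Bishop's notation, where $\alpha$ is the weight precision, $\beta$ the noise precision, $\mathbf{m}$ the posterior mean, and $\lambda_i$ the eigenvalues of $\beta X^\top X$) iterates:

$$\gamma = \sum_i \frac{\lambda_i}{\alpha + \lambda_i}, \qquad \alpha \leftarrow \frac{\gamma}{\mathbf{m}^\top \mathbf{m}}, \qquad \beta \leftarrow \frac{N - \gamma}{\lVert \mathbf{y} - X\mathbf{m} \rVert^2}$$

so both the weight prior's precision and the noise precision are re-estimated from the data rather than fixed up front.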

JoaquinIglesiasTurina commented 6 months ago

> This usually comes at the cost of providing a prior distribution on model parameters, though. Still, it might be nice to have an implementation here.

A true practitioner of the Bayesian framework would say that the cost is that of providing an *explicit* prior distribution on model parameters. :)

I would recommend the MacKay paper referenced in the scikit-learn docs for a quick introduction.