Closed: JoaquinIglesiasTurina closed this issue 5 months ago
Hi @JoaquinIglesiasTurina! I'd need to take a closer look at the theory and code, but from what I already see:
In that case, a PR is welcome. ^_^ (Speaking for myself, but I'm sure the others will agree on this.)
As such, I would like to keep working on this issue.
I doubt anyone else is working on this, so please go ahead.
a more data-driven approach, estimating regularization parameters from the data, as in Bayesian Ridge Regression.
This usually comes at the cost of providing a prior distribution on model parameters, though. Still, it might be nice to have an implementation here.
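For reference, scikit-learn's `BayesianRidge` already works this way: the regularization hyperparameters are not supplied by the user but estimated from the data during fitting. A minimal sketch (the synthetic data, seed, and coefficients below are arbitrary choices of mine):

```python
import numpy as np
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(0)

# Synthetic regression problem with known coefficients and low noise.
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

# No regularization strength is passed in; it is inferred from the data.
model = BayesianRidge().fit(X, y)

print(model.alpha_)   # estimated noise precision
print(model.lambda_)  # estimated weight precision
print(model.coef_)    # posterior mean of the weights
```

Note that `alpha_` and `lambda_` come out of the evidence-maximization loop rather than cross-validation, which is the "data-driven" aspect discussed above.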
A true practitioner of the Bayesian framework would say that the cost is that of providing an explicit prior distribution on model parameters. :)
I would recommend the MacKay paper referenced in the scikit-learn docs for a quick introduction.
In general, Bayesian methods trade computational expense for greater flexibility.
Bayesian regularization algorithms extend classical Ridge Regression with:
I've conducted a quick and very dirty proof of concept and concluded that these methods could be implemented within `defn`, and therefore fit the goals of Scholar. As such, I would like to keep working on this issue.
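To illustrate why this plausibly fits in `defn`: the evidence (MacKay-style) fixed-point updates reduce to plain tensor operations, matrix products, an inverse, and a trace. Here is a hypothetical NumPy sketch of those updates (variable names, initialization, and iteration count are my own, not Scholar's), where `alpha` is the noise precision and `lam` the weight precision:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 200 samples, 5 features, known noise std of 0.5.
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + rng.normal(scale=0.5, size=n)

alpha, lam = 1.0, 1.0  # initial noise / weight precisions
XtX = X.T @ X
Xty = X.T @ y

for _ in range(100):
    # Posterior over weights given the current hyperparameters.
    Sigma = np.linalg.inv(alpha * XtX + lam * np.eye(d))
    mu = alpha * Sigma @ Xty
    # Effective number of well-determined parameters.
    gamma = d - lam * np.trace(Sigma)
    # Fixed-point updates for the hyperparameters.
    lam = gamma / (mu @ mu)
    alpha = (n - gamma) / np.sum((y - X @ mu) ** 2)

print(1.0 / np.sqrt(alpha))  # estimated noise std, near the true 0.5
print(mu)                    # posterior mean, near w_true
```

Every step here is a fixed-shape tensor operation, which is the property `defn` needs; the loop itself would map onto `while` inside `defn` with a convergence check instead of a fixed iteration count.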