Implementation wishlist for simple online probabilistic regression meta-algorithms.
[ ] fit separate copies of the regressor on incoming batches, then return a `Mixture` with weights proportional to the number of samples in each batch. Batches under a certain size (hyperparameter) are either ignored, or collected until a minimum size is reached (hyperparameter); see the first sketch after this list.
[ ] remember a smaller bootstrap sample of size `n_remember` from the `fit` data, and pool it with the data in each new batch. At each `update`, bootstrap again so the remembered sample stays at size `n_remember`. In `update`, predict a `Mixture` via `fit_predict` on the remembered plus new sample, with appropriate weights; see the second sketch after this list.
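
A minimal sketch of the first item, assuming the wrapped regressor follows a `fit`/`predict_proba` interface; `Mixture` here is a placeholder container standing in for whatever mixture distribution the library provides, and `BatchMixtureRegressor`, `make_regressor`, and `min_batch_size` are illustrative names, not an existing API. This version implements the "collect until a minimum size is reached" variant of the hyperparameter.

```python
from dataclasses import dataclass
from typing import Any, List

import numpy as np


@dataclass
class Mixture:
    """Placeholder for the library's mixture distribution (assumed interface)."""
    components: List[Any]
    weights: List[float]


class BatchMixtureRegressor:
    """Fit one clone of the regressor per batch; predict a size-weighted Mixture."""

    def __init__(self, make_regressor, min_batch_size=10):
        self.make_regressor = make_regressor   # factory returning a fresh regressor
        self.min_batch_size = min_batch_size   # hyperparameter: minimum batch size
        self._fitted = []                      # list of (fitted regressor, n_samples)
        self._buf_X, self._buf_y = [], []      # buffer for undersized batches

    def update(self, X, y):
        # collect small batches until the minimum size is reached
        self._buf_X.append(X)
        self._buf_y.append(y)
        X_b, y_b = np.vstack(self._buf_X), np.concatenate(self._buf_y)
        if len(y_b) < self.min_batch_size:
            return self                        # keep buffering
        reg = self.make_regressor().fit(X_b, y_b)
        self._fitted.append((reg, len(y_b)))
        self._buf_X, self._buf_y = [], []
        return self

    fit = update  # the first batch is handled like any later one

    def predict_proba(self, X):
        # mixture of per-batch predictive distributions, weighted by batch size
        total = sum(n for _, n in self._fitted)
        weights = [n / total for _, n in self._fitted]
        components = [reg.predict_proba(X) for reg, _ in self._fitted]
        return Mixture(components, weights)
```

A sketch of the second item under the same assumptions; the exact weighting of remembered versus new samples, and the wiring to `fit_predict`, are left open in the wishlist item, so this only shows the bootstrap-memory bookkeeping.

```python
class BootstrapMemoryRegressor:
    """Keep a bootstrap memory of size n_remember and pool it with each new batch."""

    def __init__(self, make_regressor, n_remember=500, random_state=None):
        self.make_regressor = make_regressor
        self.n_remember = n_remember           # hyperparameter: memory size
        self._rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        self._remember(X, y)
        self._regressor = self.make_regressor().fit(X, y)
        return self

    def update(self, X_new, y_new):
        # pool the remembered sample with the new batch and refit on the pool
        X_pool = np.vstack([self._X_mem, X_new])
        y_pool = np.concatenate([self._y_mem, y_new])
        self._regressor = self.make_regressor().fit(X_pool, y_pool)
        # bootstrap the pool back down to n_remember for the next update
        self._remember(X_pool, y_pool)
        return self

    def _remember(self, X, y):
        # draw a bootstrap sample of fixed size n_remember (with replacement)
        idx = self._rng.choice(len(y), size=self.n_remember, replace=True)
        self._X_mem, self._y_mem = X[idx], y[idx]

    def predict_proba(self, X):
        return self._regressor.predict_proba(X)
```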
Further, general Bayesian algorithms will support a Bayesian update; this should be included in the API design for Bayesian estimators, as in the sketch below.
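
As one concrete shape such an `update` hook could take, here is a sketch using conjugate Bayesian linear regression with known noise variance; the class name and the `sigma2` / `prior_scale` parameters are illustrative assumptions, not an existing interface. Each call to `update` turns the current posterior into the prior for the next batch, so repeated calls implement an exact online Bayesian fit.

```python
import numpy as np


class BayesianLinearRegressor:
    """Gaussian-conjugate linear regression with an exact Bayesian update."""

    def __init__(self, n_features, sigma2=1.0, prior_scale=10.0):
        self.sigma2 = sigma2                      # known noise variance
        # Gaussian prior N(mean, cov) over the weight vector
        self._mean = np.zeros(n_features)
        self._cov = prior_scale * np.eye(n_features)

    def update(self, X, y):
        # posterior after seeing (X, y); the old posterior acts as the prior
        prior_prec = np.linalg.inv(self._cov)
        post_prec = prior_prec + X.T @ X / self.sigma2
        self._cov = np.linalg.inv(post_prec)
        self._mean = self._cov @ (prior_prec @ self._mean + X.T @ y / self.sigma2)
        return self

    fit = update  # the first fit is just an update starting from the prior

    def predict_proba(self, X):
        # Gaussian predictive distribution: mean and variance per test point
        mean = X @ self._mean
        var = self.sigma2 + np.einsum("ij,jk,ik->i", X, self._cov, X)
        return mean, var
```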