TorchEnsemble-Community / Ensemble-Pytorch

A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model.
https://ensemble-pytorch.readthedocs.io
BSD 3-Clause "New" or "Revised" License

Allow Continuation of Training #148

Open · jtpdowns opened 1 year ago

jtpdowns commented 1 year ago

It appears that the `fit` method for ensembles is also where the estimators are instantiated. It would be convenient (for example, for fine-tuning pretrained ensembles) if instantiation and training happened in separate steps. Would it be possible to decouple the instantiation and training steps to allow for the continuation of training? Or is the functionality for continuation of training already available in some other way?
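
For concreteness, here is a hypothetical sketch of the requested workflow. The `initialize_estimators` call and the warm-start behavior of a repeated `fit` call do not exist in the current API; they only illustrate the proposed decoupling:

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset
    from torchensemble import VotingClassifier

    # Toy base estimator and data, just to make the sketch concrete.
    class MLP(nn.Module):
        def __init__(self):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(20, 32), nn.ReLU(), nn.Linear(32, 2)
            )

        def forward(self, x):
            return self.net(x)

    train_loader = DataLoader(
        TensorDataset(torch.randn(128, 20), torch.randint(0, 2, (128,))),
        batch_size=32,
    )

    ensemble = VotingClassifier(estimator=MLP, n_estimators=5, cuda=False)
    ensemble.set_optimizer("Adam", lr=1e-3)

    # Today, fit() both instantiates the estimators and trains them,
    # so calling it a second time restarts from scratch.
    ensemble.fit(train_loader, epochs=10)

    # Hypothetical decoupled workflow (not part of the current API):
    # ensemble.initialize_estimators()       # build the estimators once
    # ensemble.fit(train_loader, epochs=10)  # train the existing estimators
    # ensemble.fit(train_loader, epochs=5)   # later: continue training them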

jtpdowns commented 1 year ago

It seems like this might be straightforward to implement for any class where all estimators are initialized at once (i.e., I think adversarial, bagging, fusion, gradient boosting, soft gradient boosting, and voting):

    # Instantiate a pool of base estimators, optimizers, and schedulers.
    estimators = []
    for _ in range(self.n_estimators):
        estimators.append(self._make_estimator())
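
A minimal sketch of the change, assuming the fitted estimators are kept on the ensemble object (the `estimators_` attribute name here is illustrative):

    # Sketch: skip re-instantiation when the ensemble already holds
    # trained estimators, so a repeated call to fit() continues
    # training instead of starting over. The estimators_ attribute
    # name is assumed for illustration.
    if len(self.estimators_) > 0:
        estimators = list(self.estimators_)
    else:
        estimators = [
            self._make_estimator() for _ in range(self.n_estimators)
        ]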

For fast geometric and snapshot ensembles, it seems like you could still manage a list and continue training from its last element (instantiating the first estimator into an otherwise empty list when starting a new ensemble), as sketched below.
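
A corresponding sketch for the snapshot-style case, again with an assumed `estimators_` attribute:

    import copy

    # Sketch: snapshot-style ensembles train a single estimator and
    # periodically snapshot it. To continue, warm-start from the most
    # recent snapshot; otherwise create the first estimator fresh.
    if len(self.estimators_) > 0:
        estimator = copy.deepcopy(self.estimators_[-1])
    else:
        estimator = self._make_estimator()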

xuyxu commented 1 year ago

Hi @jtpdowns, I think you are right. It would be convenient if we could decouple the training part from the model initialization part. A pull request would be very much appreciated.