TorchEnsemble-Community / Ensemble-Pytorch

A unified ensemble framework for PyTorch to improve the performance and robustness of your deep learning model.
https://ensemble-pytorch.readthedocs.io
BSD 3-Clause "New" or "Revised" License

[ENH] Add heterogeneous support for Voting #40

Closed xuyxu closed 3 years ago

xuyxu commented 3 years ago

Related issue: #5

This is an experimental feature that explores how to better support training on heterogeneous base estimators in Ensemble-Pytorch. Unlike the case where all base estimators share the same type, the training protocols for different base estimators can differ substantially, so many methods in existing ensembles (e.g., set_optimizer, set_scheduler, fit) do not support heterogeneous base estimators well.

An alternative solution is to enable users to fit each base estimator incrementally. Specifically, the workflow below shows one way to support heterogeneous base estimators:
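The original code snippet for this workflow is not preserved here. As a minimal sketch of the idea, the class and method names below (`HeteroVotingEnsemble`, `fit_estimator`, `pop_estimator`) are hypothetical illustrations, not the library's actual API: each base estimator is trained on its own, with its own optimizer and loss, and then appended to the ensemble.

```python
# A hedged sketch of incremental, heterogeneous voting.
# All names here are hypothetical, not Ensemble-Pytorch's real API.
import torch
import torch.nn as nn


class HeteroVotingEnsemble(nn.Module):
    """Average the outputs of independently trained base estimators."""

    def __init__(self):
        super().__init__()
        self.estimators = nn.ModuleList()

    def fit_estimator(self, estimator, loader, optimizer, criterion, epochs=1):
        # Each estimator follows its own training protocol, then joins
        # the ensemble -- nothing is shared across base estimators.
        estimator.train()
        for _ in range(epochs):
            for x, y in loader:
                optimizer.zero_grad()
                loss = criterion(estimator(x), y)
                loss.backward()
                optimizer.step()
        self.estimators.append(estimator)

    def pop_estimator(self, index=-1):
        # Remove and return a base estimator from the ensemble.
        estimator = self.estimators[index]
        del self.estimators[index]
        return estimator

    def forward(self, x):
        # Soft voting: average the predictions of all base estimators.
        outputs = [estimator(x) for estimator in self.estimators]
        return torch.stack(outputs).mean(dim=0)


# Usage: two heterogeneous estimators on a toy regression task.
X = torch.randn(32, 4)
y = torch.randn(32, 1)
loader = [(X, y)]  # a single-batch stand-in for a DataLoader

ensemble = HeteroVotingEnsemble()
mlp = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 1))
linear = nn.Linear(4, 1)
ensemble.fit_estimator(mlp, loader, torch.optim.Adam(mlp.parameters()), nn.MSELoss())
ensemble.fit_estimator(linear, loader, torch.optim.SGD(linear.parameters(), lr=0.01), nn.MSELoss())

pred = ensemble(X)  # averaged prediction, shape (32, 1)
```

Because each call to `fit_estimator` is independent, base estimators with entirely different architectures, optimizers, and schedulers can coexist in one ensemble.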

Obviously, there is no way to support parallelization under this paradigm, but I think this is acceptable: state-of-the-art deep learning models are typically compute-bound, which makes parallelization less useful when only one GPU is available.

In addition, users can also pop base estimators from the ensemble if they want.

xuyxu commented 3 years ago

@all-contributors please add @xuyxu for example

allcontributors[bot] commented 3 years ago

@xuyxu

I've put up a pull request to add @xuyxu! :tada: