ContinualAI / avalanche

Avalanche: an End-to-End Library for Continual Learning based on PyTorch.
http://avalanche.continualai.org
MIT License

Updatable objects #1633

Closed AntonioCarta closed 5 months ago

AntonioCarta commented 6 months ago

Attempt at #1611.

Right now there is:

I'm opening the PR for comments now, just to understand whether this may be useful or not. I'm finding it convenient for some simple experiments that I'm running.

Regarding compatibility, I didn't find any major roadblocks. Everything seems easy to integrate, though some things may need to be adapted. For example, some methods like GEM could be rewritten as optimizers to make their use transparent with this API.
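To make the idea concrete, here is a minimal sketch of what an updatable-object protocol could look like. The names (`Updatable`, `update_objects`, `SeenClassesTracker`) and the dummy experience are illustrative assumptions, not the actual API introduced by this PR:

```python
from types import SimpleNamespace


class Updatable:
    """Sketch of an object that adapts its state when a new experience arrives."""

    def update(self, experience):
        # Default: no-op; subclasses adapt buffers, heads, losses, etc.
        pass


class SeenClassesTracker(Updatable):
    """Toy updatable that accumulates the classes seen so far."""

    def __init__(self):
        self.known_classes = set()

    def update(self, experience):
        self.known_classes |= set(experience.classes_in_this_experience)


def update_objects(objects, experience):
    """Update every registered object before training on the experience."""
    for obj in objects:
        obj.update(experience)


# Dummy stand-in for an Avalanche experience object.
exp = SimpleNamespace(classes_in_this_experience=[0, 1])
tracker = SeenClassesTracker()
update_objects([tracker], exp)
print(sorted(tracker.known_classes))  # → [0, 1]
```

The appeal is that strategies, buffers, and losses all expose a single `update` entry point, so a training loop only has to iterate over the registered objects.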

coveralls commented 6 months ago

Pull Request Test Coverage Report for Build 8892479082

Details


| Changes Missing Coverage | Covered Lines | Changed/Added Lines | % |
|---|---|---|---|
| avalanche/evaluation/plot_utils.py | 15 | 16 | 93.75% |
| avalanche/training/losses.py | 8 | 9 | 88.89% |
| avalanche/training/storage_policy.py | 22 | 23 | 95.65% |
| avalanche/training/regularization.py | 15 | 17 | 88.24% |
| tests/training/test_losses.py | 0 | 2 | 0.0% |
| tests/training/test_regularization.py | 1 | 3 | 33.33% |
| tests/evaluation/test_plots.py | 6 | 9 | 66.67% |
| tests/models/test_models.py | 0 | 3 | 0.0% |
| avalanche/core.py | 40 | 46 | 86.96% |
| tests/evaluation/test_functional.py | 9 | 18 | 50.0% |
| Total: | 248 | 436 | 56.88% |

| Files with Coverage Reduction | New Missed Lines | % |
|---|---|---|
| avalanche/training/regularization.py | 9 | 88.19% |
| Total: | 9 | |

| Totals | Coverage Status |
|---|---|
| Change from base Build 8453173939: | 0.05% |
| Covered Lines: | 15081 |
| Relevant Lines: | 29111 |

💛 - Coveralls
AlbinSou commented 5 months ago

As a side comment: do you think one updatable object can depend on another? I am thinking in particular about the optimizer. If we want to use the existing functions for updating the optimizer, we would need to encapsulate it in an Updatable object, but then it would need to be updated after the model. In that case we could add a dependency system to each object to make sure they are updated in the correct order.
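One way such a dependency system could be sketched is with a topological sort over declared dependencies, so the model is always updated before the optimizer. This is purely illustrative and not part of the PR; `ordered_update`, `Recorder`, and the `deps` mapping are hypothetical names:

```python
from graphlib import TopologicalSorter


class Recorder:
    """Toy updatable that records the order in which updates happen."""

    def __init__(self, name, log):
        self.name, self.log = name, log

    def update(self, experience):
        self.log.append(self.name)


def ordered_update(objects, deps, experience):
    """Update objects in dependency order.

    `deps` maps each object name to the set of names it must be
    updated after (its predecessors in the dependency graph).
    """
    for name in TopologicalSorter(deps).static_order():
        objects[name].update(experience)


log = []
objects = {n: Recorder(n, log) for n in ("model", "optimizer", "buffer")}
deps = {"optimizer": {"model"}, "model": set(), "buffer": set()}
ordered_update(objects, deps, experience=None)
print(log)  # "model" always precedes "optimizer"
```

`graphlib.TopologicalSorter` (stdlib, Python 3.9+) also raises `CycleError` on circular dependencies, which would catch mis-specified update graphs early.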

AntonioCarta commented 5 months ago

@AlbinSou I added the custom weights mode for the MetricCollector and a wrapper for the optimizers. I think the PR is ready to be merged if you agree.
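For readers following along, an optimizer wrapper in this spirit could look roughly like the naive sketch below. `UpdatableOptimizer` is a hypothetical name, not the wrapper actually added in the PR, and this version simply rebuilds the optimizer (discarding its state); a real wrapper would migrate state for surviving parameters:

```python
from torch import nn
from torch.optim import SGD


class UpdatableOptimizer:
    """Naive sketch: after the model updates (e.g. grows a classifier
    head), rebuild the inner optimizer so it covers the new parameters.
    Note: rebuilding discards optimizer state (momentum buffers, etc.)."""

    def __init__(self, model, opt_class, **opt_kwargs):
        self.model = model
        self.opt_class, self.opt_kwargs = opt_class, opt_kwargs
        self.opt = opt_class(model.parameters(), **opt_kwargs)

    def update(self, experience=None):
        # Must run *after* the model's own update, hence the ordering
        # concern raised above.
        self.opt = self.opt_class(self.model.parameters(), **self.opt_kwargs)

    def step(self):
        self.opt.step()

    def zero_grad(self):
        self.opt.zero_grad()


model = nn.Linear(2, 2)
opt = UpdatableOptimizer(model, SGD, lr=0.1)
n_before = len(opt.opt.param_groups[0]["params"])  # weight + bias
model.head = nn.Linear(2, 3)  # model grows, e.g. new classes appear
opt.update()
n_after = len(opt.opt.param_groups[0]["params"])
print(n_before, n_after)  # → 2 4
```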