Disentangled Continual Learning
https://arxiv.org/abs/2312.16731
Apache License 2.0

🪩 Disco: Disentangled Continual Learning

Infinite dSprites for Disentangled Continual Learning: Separating Memory Edits from Generalization

Published at CoLLAs 2024.

Install

Install the requirements and the package (ideally in a virtual environment):

python -m pip install -r requirements.txt
python -m pip install -e .

Getting started

Here's how to use the dataset:

from torch.utils.data import DataLoader
from disco.data import InfiniteDSprites
from disco.visualization import draw_batch  # plotting helper; check the examples notebooks if the import path differs

dataset = InfiniteDSprites()
dataloader = DataLoader(dataset, batch_size=4)

batch = next(iter(dataloader))
draw_batch(batch, show=True)

For other use cases and a more detailed introduction, see the notebooks in the examples folder.
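Unlike a fixed dataset, InfiniteDSprites generates samples procedurally, so iteration never runs out. As a rough illustration of that pattern (a toy stand-in, not the package's actual implementation), an "infinite" dataset can be written as a generator over random latent factors:

```python
import itertools
import random

class InfiniteShapes:
    """Toy stand-in for a procedurally generated dataset:
    every sample is drawn fresh from random latent factors,
    so the stream never ends."""

    def __iter__(self):
        while True:
            yield {
                "shape": random.choice(["square", "ellipse", "heart"]),
                "scale": random.uniform(0.5, 1.0),
                "orientation": random.uniform(0.0, 360.0),
            }

# Take a "batch" of 4 samples from the infinite stream.
batch = list(itertools.islice(InfiniteShapes(), 4))
print(len(batch))  # 4
```

Wrapping such an iterable in a PyTorch `IterableDataset` is what lets `DataLoader` batch it, as in the snippet above.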

Plots

To reproduce the plots from the paper, run the plots.sh script.

Rendering the figures requires TeX Live. To install it on macOS, use Homebrew:

brew install --cask mactex

Make sure the TeX Live executables are on your PATH. First, locate them:

find / -name kpsewhich 2>/dev/null

Add the directory from the output to your PATH, e.g.:

export PATH=/usr/local/texlive/2023/bin/universal-darwin:$PATH
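To confirm the directory actually ended up first on your PATH (the TeX Live path below is illustrative; substitute the one reported by find):

```shell
# Prepend the TeX Live bin directory and print the first PATH entry
TEXBIN="/usr/local/texlive/2023/bin/universal-darwin"
PATH="$TEXBIN:$PATH"
echo "${PATH%%:*}"
```

If the printed entry is the TeX Live directory, kpsewhich and friends will be found before any other installs.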

Citation

If you use this work in your research, please consider citing:

@article{dziadzio2023disentangled,
  title={Disentangled Continual Learning: Separating Memory Edits from Model Updates},
  author={Dziadzio, Sebastian and Y{\i}ld{\i}z, {\c{C}}a{\u{g}}atay and van de Ven, Gido M and Trzci{\'n}ski, Tomasz and Tuytelaars, Tinne and Bethge, Matthias},
  journal={arXiv preprint arXiv:2312.16731},
  year={2023}
}

Thanks!