ai-safety-foundation / sparse_autoencoder

Sparse Autoencoder for Mechanistic Interpretability
https://ai-safety-foundation.github.io/sparse_autoencoder/
MIT License
168 stars 38 forks

New torch version breaks optimizer code #202

Open chefexperte opened 6 months ago

chefexperte commented 6 months ago

Since the release of torch 2.2.0, `torch.optim.optimizer` no longer exposes `params_t`. Because pyproject.toml requires torch `>=2.1.2`, a fresh install pulls the latest torch and fails with the following error:

```
/usr/local/lib/python3.10/dist-packages/sparse_autoencoder/optimizer/adam_with_reset.py in <module>
     10 from torch.nn.parameter import Parameter
     11 from torch.optim import Adam
---> 12 from torch.optim.optimizer import params_t
     13 
     14 from sparse_autoencoder.optimizer.abstract_optimizer import AbstractOptimizerWithReset

ImportError: cannot import name 'params_t' from 'torch.optim.optimizer' (/usr/local/lib/python3.10/dist-packages/torch/optim/optimizer.py)
```
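A version-tolerant import shim is one way to sketch a fix. This assumes torch 2.2 renamed the alias to `ParamsT` (the new public name in `torch.optim.optimizer`); the local fallback alias is an assumption based on `params_t` being a plain typing alias, not something the source confirms:

```python
from typing import Any, Dict, Iterable, Union

try:
    # torch >= 2.2: the alias was renamed to ParamsT (assumption based on
    # the torch 2.2 API); re-export it under the old name.
    from torch.optim.optimizer import ParamsT as params_t
except ImportError:
    try:
        # torch < 2.2 still exposes the old name.
        from torch.optim.optimizer import params_t
    except ImportError:
        # Last resort (e.g. torch not importable here): params_t was only a
        # type alias, so an equivalent local definition keeps annotations valid.
        params_t = Union[Iterable[Any], Iterable[Dict[str, Any]]]

print(params_t is not None)
```

With a shim like this, `adam_with_reset.py` would import cleanly on both old and new torch versions instead of pinning torch.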
Final-Industrialist commented 4 months ago

I'm sure a patch is coming for this soon, but if you just want to run demo.ipynb you can add the line:

```
%pip install torch==2.1.2
```

to the installation cell and everything should run smoothly (at least in Colab).

ehzawad commented 3 months ago

@Final-Industrialist thanks! `%pip install torch==2.1.2` works. Looks like the new PyTorch version has broken things. Perhaps the author could pin exact package versions rather than specifying a version range.

EMZEDI commented 2 months ago

Same issue here with the new versions of torch. A workaround is to stop importing `params_t` from `torch.optim.optimizer` altogether: since it is only used in type annotations, you can simply set it to `None` and things temporarily work.
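The `None` workaround works because Python does not enforce annotations at runtime. A minimal sketch of the idea (the `step_config` helper is hypothetical, purely for illustration):

```python
# In adam_with_reset.py, the broken import can be replaced by a plain
# assignment:
#   from torch.optim.optimizer import params_t   # fails on torch >= 2.2
params_t = None  # placeholder; annotations are not enforced at runtime

def step_config(params: params_t, lr: float = 1e-3) -> dict:
    """Hypothetical helper showing that a None annotation still executes."""
    return {"params": list(params), "lr": lr}

print(step_config([1, 2, 3]))  # -> {'params': [1, 2, 3], 'lr': 0.001}
```

Static type checkers will complain about the `None` annotation, which is why this is only a temporary fix rather than a proper patch.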