ai-safety-foundation / sparse_autoencoder

Sparse Autoencoder for Mechanistic Interpretability
https://ai-safety-foundation.github.io/sparse_autoencoder/
MIT License

ImportError: cannot import name 'params_t' from 'torch.optim.optimizer' #210

Open seansica opened 3 months ago

seansica commented 3 months ago

Ran this from the demo code:

import os

# Check if we're in Colab
try:
    import google.colab  # noqa: F401 # type: ignore

    in_colab = True
except ImportError:
    in_colab = False

#  Install if in Colab
if in_colab:
    %pip install sparse_autoencoder transformer_lens transformers wandb

# Otherwise enable hot reloading in dev mode
if not in_colab:
    %load_ext autoreload
    %autoreload 2

from sparse_autoencoder import (
    ActivationResamplerHyperparameters,
    AutoencoderHyperparameters,
    Hyperparameters,
    LossHyperparameters,
    Method,
    OptimizerHyperparameters,
    Parameter,
    PipelineHyperparameters,
    SourceDataHyperparameters,
    SourceModelHyperparameters,
    SweepConfig,
    sweep,
)

Throws:

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-3-ef2a52074e18> in <cell line: 3>()
      1 import os
      2 
----> 3 from sparse_autoencoder import (
      4     ActivationResamplerHyperparameters,
      5     AutoencoderHyperparameters,

/usr/local/lib/python3.10/dist-packages/sparse_autoencoder/optimizer/adam_with_reset.py in <module>
     10 from torch.nn.parameter import Parameter
     11 from torch.optim import Adam
---> 12 from torch.optim.optimizer import params_t
     13 
     14 from sparse_autoencoder.optimizer.abstract_optimizer import AbstractOptimizerWithReset

ImportError: cannot import name 'params_t' from 'torch.optim.optimizer' (/usr/local/lib/python3.10/dist-packages/torch/optim/optimizer.py)


This occurs in Colab.

> python --version
Python 3.10.12

> pip list | grep -e torch -e sparse_autoencoder -e transformer_lens -e transformers -e wandb
sparse_autoencoder               1.10.0
torch                            2.3.0+cu121
torchaudio                       2.3.0+cu121
torchsummary                     1.5.1
torchtext                        0.18.0
torchvision                      0.18.0+cu121
transformers                     4.41.2
wandb                            0.17.4
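
For now, pinning torch below 2.2 before installing seems like a plausible workaround (a sketch, assuming the missing name was renamed in the torch 2.2 release):

```python
# Possible Colab workaround (untested sketch): keep torch below 2.2, where
# `params_t` still exists, then install the rest of the stack.
%pip install "torch<2.2" sparse_autoencoder transformer_lens transformers wandb
```

Colab will likely need a runtime restart after the torch downgrade takes effect.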

seansica commented 3 months ago

> import torch.optim.optimizer as optim
> optim.__dir__()

['__name__',
 '__doc__',
 '__package__',
 '__loader__',
 '__spec__',
 '__file__',
 '__cached__',
 '__builtins__',
 '__annotations__',
 'math',
 'functools',
 'warnings',
 'OrderedDict',
 'defaultdict',
 'deepcopy',
 'chain',
 'Any',
 'Callable',
 'DefaultDict',
 'Dict',
 'Hashable',
 'Iterable',
 'List',
 'Optional',
 'Set',
 'Tuple',
 'TypeVar',
 'Union',
 'cast',
 'overload',
 'ParamSpec',
 'Self',
 'TypeAlias',
 'torch',
 'hooks',
 'RemovableHandle',
 'Indices',
 'TensorListList',
 '_get_foreach_kernels_supported_devices',
 '_get_fused_kernels_supported_devices',
 'is_compiling',
 '_group_tensors_by_device_and_dtype',
 'Args',
 'Kwargs',
 'StateDict',
 'GlobalOptimizerPreHook',
 'GlobalOptimizerPostHook',
 '__all__',
 '_global_optimizer_pre_hooks',
 '_global_optimizer_post_hooks',
 '_foreach_supported_types',
 '_RequiredParameter',
 'required',
 '_use_grad_for_differentiable',
 '_get_value',
 '_stack_if_compiling',
 '_dispatch_sqrt',
 '_default_to_fused_or_foreach',
 '_view_as_real',
 '_get_scalar_dtype',
 '_foreach_doc',
 '_fused_doc',
 '_capturable_doc',
 '_differentiable_doc',
 '_maximize_doc',
 'register_optimizer_step_pre_hook',
 'register_optimizer_step_post_hook',
 'ParamsT',
 '_P',
 'R',
 'T',
 'Optimizer']

Could be that `params_t` needs to be changed to `ParamsT`: the old name is missing from `torch.optim.optimizer` in torch 2.3, but the new one is present in the listing above.
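
If that's the cause, a version-tolerant import in `adam_with_reset.py` might look something like this (a sketch, not tested against this package):

```python
# Sketch of a backwards-compatible import for adam_with_reset.py,
# assuming newer torch releases renamed `params_t` to `ParamsT`.
try:
    from torch.optim.optimizer import ParamsT as params_t  # torch >= 2.2
except ImportError:
    from torch.optim.optimizer import params_t  # older torch
```

Both names refer to the same type alias for an optimizer's `params` argument, so aliasing one to the other should be safe.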

seansica commented 3 months ago

Also just noted that the issue does not occur when building from source (`poetry install`). I think that suggests a discrepancy with the pip install: `poetry install` resolves torch from the lockfile, while pip follows the version range published in `[tool.poetry.dependencies]`, which may be loose enough to pull in a newer, incompatible torch.
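
If so, tightening the bound in `pyproject.toml` until the import is updated would keep pip installs on a compatible torch (hypothetical constraint; the exact supported range would need checking):

```toml
[tool.poetry.dependencies]
# Hypothetical upper bound: `params_t` appears to have been removed in torch 2.2
torch = ">=2.1.0,<2.2"
```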