facebookresearch / esm

Evolutionary Scale Modeling (esm): Pretrained language models for proteins
MIT License

'esm.pretrained' has no attribute 'esm2_t6_8M_UR50D' #531

Open. LZhang98 opened this issue 1 year ago.

LZhang98 commented 1 year ago

Bug description: While initializing my model (and loading this pretrained model), I hit an error message I had never seen in the several months I've been using ESM. I used to be able to train and evaluate the model, but now the ESM-2 pretrained models don't seem to exist at all: listing the attributes available on esm.pretrained, every ESM-2 entry appears to be gone.

Reproduction steps: import esm.pretrained and call esm.pretrained.esm2_t6_8M_UR50D(), or any other esm2_* variant.
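
A minimal script showing the failure mode (the attribute check is illustrative and not part of the original report):

import esm.pretrained

# On an affected install the esm2_* constructors are missing,
# while the esm1_* constructors still resolve.
for name in ("esm2_t6_8M_UR50D", "esm2_t33_650M_UR50D"):
    print(name, "->", hasattr(esm.pretrained, name))

# Raises AttributeError on the broken environment:
model, alphabet = esm.pretrained.esm2_t6_8M_UR50D()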

Expected behavior: The pretrained model is loaded, with the weights downloaded first if they have not already been cached.

Logs:

Traceback (most recent call last):
  File "/path/to/draw_model.py", line 13, in <module>
    my_model = Model('gpu', num_layers, model_dim, num_heads, ff_dim, dropout=0.3)
  File "/path/to/esm_pretrained.py", line 9, in __init__
    320: esm.pretrained.esm2_t6_8M_UR50D()
AttributeError: module 'esm.pretrained' has no attribute 'esm2_t6_8M_UR50D'. Did you mean: 'esm1_t6_43M_UR50S'?

Additional context: WSL2 Ubuntu, conda environment. This setup used to work and has only recently stopped.
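
The Did you mean: 'esm1_t6_43M_UR50S'? hint is consistent with an outdated or shadowed fair-esm: the ESM-2 constructors shipped with fair-esm 2.0.0, so an older version exposes only the esm1_* models. A quick diagnostic sketch (not from the original report) to check which file is actually being imported and which version is installed:

import esm
from importlib import metadata

# A path inside a cloned repo, rather than site-packages, would mean
# a local checkout is shadowing the pip-installed package.
print("imported from:", esm.__file__)

# ESM-2 models require fair-esm >= 2.0.0.
print("fair-esm version:", metadata.version("fair-esm"))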

mmpust commented 1 year ago

Hey Luke, did you figure it out? I have the same problem:

import torch
import sys
sys.path.append("esm/esm")
import pretrained

# Load ESM-2 model
model, alphabet = pretrained.esm2_t33_650M_UR50D()
which fails with:

AttributeError: module 'pretrained' has no attribute 'esm2_t33_650M_UR50D'
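
For reference, sys.path.append("esm/esm") imports pretrained.py as a top-level module outside its package, which is fragile, and running Python from inside a cloned esm/ checkout can shadow the pip-installed fair-esm. A sketch of the usual entry point on an up-to-date fair-esm install (run from outside any esm/ source directory):

import torch
import esm  # the pip-installed fair-esm package, not a local checkout

# Loading through the package keeps its internal imports intact.
model, alphabet = esm.pretrained.esm2_t33_650M_UR50D()
batch_converter = alphabet.get_batch_converter()
model.eval()  # disable dropout for deterministic inference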

Thanks!

LZhang98 commented 1 year ago

Hi,

I made a new virtual environment and did a clean torch + fair-esm installation, and everything seems to be working again. Let me know if this works for you, too.

Python 3.8.10 (default, Mar 13 2023, 10:26:41)
[GCC 9.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import torch
>>> import esm.pretrained
>>> dir(esm.pretrained)
['ESM2', 'Namespace', 'Path', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__spec__', '_download_model_and_regression_data', '_has_regression_weights', '_load_model_and_alphabet_core_v1', '_load_model_and_alphabet_core_v2', 'esm', 'esm1_t12_85M_UR50S', 'esm1_t34_670M_UR100', 'esm1_t34_670M_UR50D', 'esm1_t34_670M_UR50S', 'esm1_t6_43M_UR50S', 'esm1b_t33_650M_UR50S', 'esm1v_t33_650M_UR90S', 'esm1v_t33_650M_UR90S_1', 'esm1v_t33_650M_UR90S_2', 'esm1v_t33_650M_UR90S_3', 'esm1v_t33_650M_UR90S_4', 'esm1v_t33_650M_UR90S_5', 'esm2_t12_35M_UR50D', 'esm2_t30_150M_UR50D', 'esm2_t33_650M_UR50D', 'esm2_t36_3B_UR50D', 'esm2_t48_15B_UR50D', 'esm2_t6_8M_UR50D', 'esm_if1_gvp4_t16_142M_UR50', 'esm_msa1_t12_100M_UR50S', 'esm_msa1b_t12_100M_UR50S', 'esmfold_v0', 'esmfold_v1', 'has_emb_layer_norm_before', 'load_hub_workaround', 'load_model_and_alphabet', 'load_model_and_alphabet_core', 'load_model_and_alphabet_hub', 'load_model_and_alphabet_local', 'load_regression_hub', 're', 'torch', 'urllib', 'warnings']
>>> model, alphabet = esm.pretrained.esm2_t6_8M_UR50D()
Downloading: "https://dl.fbaipublicfiles.com/fair-esm/models/esm2_t6_8M_UR50D.pt" to /path/to/.cache/torch/hub/checkpoints/esm2_t6_8M_UR50D.pt
Downloading: "https://dl.fbaipublicfiles.com/fair-esm/regression/esm2_t6_8M_UR50D-contact-regression.pt" to /path/to/.cache/torch/hub/checkpoints/esm2_t6_8M_UR50D-contact-regression.pt