zjunlp / OntoProtein

[ICLR 2022] OntoProtein: Protein Pretraining With Gene Ontology Embedding
MIT License

No module named 'transformers.deepspeed' #3

Closed yuzhiguo07 closed 2 years ago

yuzhiguo07 commented 2 years ago

When I tried to run the sample command `sh run_main.sh ......`, I got the following error:

Traceback (most recent call last):
  File "run_downstream.py", line 8, in <module>
    from src.models import model_mapping, load_adam_optimizer_and_scheduler
  File "/mnt/SSD2/pmtnet_proj/code/github/OntoProtein/src/models.py", line 16, in <module>
    from transformers.deepspeed import is_deepspeed_zero3_enabled
ModuleNotFoundError: No module named 'transformers.deepspeed'
yuzhiguo07 commented 2 years ago

Also, it seems that `model_mapping` and `load_adam_optimizer_and_scheduler` do not exist in `src.models`. I think line 8 of run_downstream.py should be `from src.benchmark.models import model_mapping, load_adam_optimizer_and_scheduler` instead of `from src.models import model_mapping, load_adam_optimizer_and_scheduler`.

Alexzhuan commented 2 years ago

Hi, thanks for your interest in our work!

This is an oversight from merging the benchmark code into the original pre-training code. We will fix the error.

As for the error about `transformers.deepspeed`, you could update `transformers` to a newer version and install the `deepspeed` library.
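If pinning a specific `transformers` version is awkward, one option is a version-tolerant import shim (a sketch, not part of the OntoProtein repo): recent `transformers` releases moved the helper to `transformers.integrations.deepspeed`, older ones exposed it as `transformers.deepspeed`, and if neither import succeeds the shim falls back to reporting that ZeRO-3 is disabled.

```python
# Compatibility shim for is_deepspeed_zero3_enabled across transformers versions.
# Newer releases expose it under transformers.integrations.deepspeed,
# older ones under transformers.deepspeed. If neither import works (e.g.
# transformers is too old or not installed), assume ZeRO-3 is not enabled.
try:
    from transformers.integrations.deepspeed import is_deepspeed_zero3_enabled
except ImportError:
    try:
        from transformers.deepspeed import is_deepspeed_zero3_enabled
    except ImportError:
        def is_deepspeed_zero3_enabled() -> bool:
            # Fallback stub: without the transformers helper we cannot detect
            # a live DeepSpeed ZeRO-3 config, so report it as disabled.
            return False

print(is_deepspeed_zero3_enabled())
```

Dropping this at the top of `src/models.py` in place of the bare `from transformers.deepspeed import ...` line would make the module import cleanly on both old and new `transformers` versions, at the cost of silently disabling ZeRO-3-aware behavior when the helper is truly unavailable.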