AbaciNLP / InvestLM


Unable to run inference.py, there are various dependency issues #4

Open Zeeeeta opened 2 weeks ago

Zeeeeta commented 2 weeks ago

Running on macOS with Python 3.12

With the default requirements.txt, running inference.py fails:

  File "/site-packages/transformers/models/llama/modeling_llama.py", line 305, in __init__
    self.rotary_emb = LlamaRotaryEmbedding(config=self.config)
TypeError: ScaledRotaryEmbedding.__init__() got an unexpected keyword argument 'config'

When trying to lock the versions to transformers==4.28.1 and tokenizers==0.13.3 (this requires installing the Rust compiler):

  File "/site-packages/peft/peft_model.py", line 37, in <module>
    from transformers import Cache, DynamicCache, EncoderDecoderCache, PreTrainedModel
ImportError: cannot import name 'Cache' from 'transformers' (/site-packages/transformers/__init__.py)

I also tried various transformers and tokenizers versions, as well as Python 3.10, but was still unable to run it successfully:

  File "/site-packages/peft/peft_model.py", line 37, in <module>
    from transformers import Cache, DynamicCache, EncoderDecoderCache, PreTrainedModel
ImportError: cannot import name 'EncoderDecoderCache' from 'transformers' (/site-packages/transformers/__init__.py)

Please help.

adrwong commented 1 week ago

The peft version needs to be locked as well. Please refer to my PR, which fixes the issue:

https://github.com/AbaciNLP/InvestLM/pull/5
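
For anyone hitting the same errors before the PR is merged: the root cause is that an unpinned peft resolves to a recent release that imports `Cache`/`EncoderDecoderCache`, which do not exist in transformers 4.28.1. A minimal sketch of the pins in requirements.txt, assuming a peft release from the transformers 4.28 era (the peft version below is my assumption, not taken from the PR; check the PR diff for the exact pin):

```
# transformers/tokenizers pins are from this thread;
# the peft pin is an assumption -- see the PR above for the actual fix.
transformers==4.28.1
tokenizers==0.13.3
peft==0.3.0
```

Older peft releases import only `PreTrainedModel` from transformers, so the `cannot import name 'Cache'` error goes away once all three packages are pinned together.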

@yixuantt Please help review and merge.

yixuantt commented 1 week ago

Merged. Thanks for your work.