With the default requirements.txt, running inference.py fails with:
File "/site-packages/transformers/models/llama/modeling_llama.py", line 305, in __init__
self.rotary_emb = LlamaRotaryEmbedding(config=self.config)
TypeError: ScaledRotaryEmbedding.__init__() got an unexpected keyword argument 'config'
When trying to lock the versions to transformers==4.28.1 and tokenizers==0.13.3 (which requires installing the Rust compiler):
File "/site-packages/peft/peft_model.py", line 37, in <module>
from transformers import Cache, DynamicCache, EncoderDecoderCache, PreTrainedModel
ImportError: cannot import name 'Cache' from 'transformers' (/site-packages/transformers/__init__.py)
I also tried various transformers and tokenizers versions, as well as Python 3.10, but was still unable to run it successfully. For example, another combination fails with:
File "/site-packages/peft/peft_model.py", line 37, in <module>
from transformers import Cache, DynamicCache, EncoderDecoderCache, PreTrainedModel
ImportError: cannot import name 'EncoderDecoderCache' from 'transformers' (/site-packages/transformers/__init__.py)
Running on macOS with Python 3.12
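For anyone debugging the same thing: the second and third tracebacks suggest the installed peft expects a newer transformers than the one pinned (Cache, DynamicCache, and EncoderDecoderCache only exist in recent transformers releases). Below is a minimal sketch of a version-sanity check I used to narrow it down; the minimum versions listed are assumptions, not confirmed requirements, so adjust them to whatever your requirements.txt actually targets.

```python
from importlib.metadata import PackageNotFoundError, version


def parse_version(v):
    """Parse a 'X.Y.Z' version string into a tuple of ints, ignoring any suffix."""
    parts = []
    for piece in v.split("."):
        digits = ""
        for ch in piece:
            if ch.isdigit():
                digits += ch
            else:
                break
        if not digits:
            break
        parts.append(int(digits))
    return tuple(parts)


def check(pkg, minimum):
    """Report whether an installed package meets an assumed minimum version."""
    try:
        installed = version(pkg)
    except PackageNotFoundError:
        return f"{pkg}: not installed"
    ok = parse_version(installed) >= parse_version(minimum)
    return f"{pkg} {installed}: {'OK' if ok else 'needs >= ' + minimum}"


# Assumed minimums for illustration only -- replace with the versions
# your requirements.txt is actually meant to work with.
for pkg, minimum in [("transformers", "4.42.0"),
                     ("peft", "0.11.0"),
                     ("tokenizers", "0.19.0")]:
    print(check(pkg, minimum))
```

Running this before inference.py makes it obvious when one package (here, peft) was resolved against a much newer transformers than the pinned one, which matches the ImportError for Cache/EncoderDecoderCache.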
Please help.