Open mgagvani opened 2 months ago
`pip install transformers==4.42.4` can fix it
The fix didn't land in time; the transformers package updates too quickly. Versions up to 4.42.4 are supported, but 4.43.0 breaks things.
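Since the working ceiling reported in this thread is 4.42.4, a minimal sketch of a version guard one could run before importing the library looks like this (the 4.42.4 ceiling is taken from this thread, not from official compatibility docs):

```python
def parse_version(v: str) -> tuple:
    """Parse a dotted version string like '4.42.4' into a tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def is_supported(installed: str, max_supported: str = "4.42.4") -> bool:
    """True if the installed transformers version is at or below the pin."""
    return parse_version(installed) <= parse_version(max_supported)

print(is_supported("4.42.4"))  # True: the pinned version works per this thread
print(is_supported("4.43.0"))  # False: reported broken in this thread
```

Tuple comparison handles the numeric ordering correctly (4.43.0 > 4.42.4), which plain string comparison would not for versions like 4.9 vs 4.10.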
My Python version is 3.9.
When running the Llama script from the README.md with python=3.9
and `pip install transformers==4.42.4`,
I got the error below:
`ModuleNotFoundError: No module named 'transformers.modeling_rope_utils'`
Same error here: `No module named modeling_rope_utils`.
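A quick way to check whether your installed transformers actually ships the module the repo examples import (the module name comes from the error above; everything else is a generic stdlib check):

```python
import importlib.util

def has_rope_utils() -> bool:
    """True if transformers.modeling_rope_utils is importable."""
    try:
        return importlib.util.find_spec("transformers.modeling_rope_utils") is not None
    except ModuleNotFoundError:
        # transformers itself is not installed
        return False

print(has_rope_utils())
```

If this prints `False` on a pinned 4.42.4 install, the repo examples that import it cannot work with that pin.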
I tried the full example from the NPU Library documentation (https://intel.github.io/intel-npu-acceleration-library/llm.html) with `pip install transformers==4.42.4`
and python=3.12.6,
and it's working.
It seems the examples from the git repo and the code from the pip release don't work together yet; at least that's what I understood from #108.
Same issue. Any updates on this would be great! It prevents you from using newer models.
It has been fixed but not released; you need to install the library from source.
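For reference, installing from source usually looks like one of the following; the repository URL here is inferred from the docs link earlier in this thread, so treat it as an assumption:

```shell
# Option 1: install directly from the main branch (hypothetical repo URL,
# inferred from the docs link in this thread)
pip install git+https://github.com/intel/intel-npu-acceleration-library.git

# Option 2: clone and install locally, useful if you want to run the
# repo's bundled examples against the matching source
git clone https://github.com/intel/intel-npu-acceleration-library.git
cd intel-npu-acceleration-library
pip install .
```

Option 2 has the advantage that the examples in the checkout match the installed code, which is exactly the mismatch described above.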
Running the TinyLlama example shown in the README results in:
`TypeError: LlamaAttention.forward() got an unexpected keyword argument 'position_embeddings'`