intel / intel-npu-acceleration-library

Intel® NPU Acceleration Library
Apache License 2.0

Example in README throws `TypeError: LlamaAttention.forward() got an unexpected keyword argument 'position_embeddings'` when executed #117

Open mgagvani opened 2 months ago

mgagvani commented 2 months ago

Running the TinyLlama example shown in the README results in `TypeError: LlamaAttention.forward() got an unexpected keyword argument 'position_embeddings'`.

To Reproduce

  1. Run the Llama script from the README (a rough sketch of the snippet is below).
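For context, this is roughly the shape of the code I'm running. It is paraphrased, not a verbatim copy of the README; the `compile()` call and the `dtype` argument follow the library docs as I remember them and may differ from the current README.

```python
# Rough reproduction sketch (paraphrased, not the verbatim README example).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
import intel_npu_acceleration_library

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"

model = AutoModelForCausalLM.from_pretrained(model_id).eval()
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Offload the model to the NPU; the dtype kwarg is as shown in the library docs
# and may need adjusting for the installed release.
model = intel_npu_acceleration_library.compile(model, dtype=torch.int8)

inputs = tokenizer("What is the distance between Earth and the Moon?", return_tensors="pt")

# With transformers >= 4.43 the Llama model forward passes `position_embeddings`
# down to the attention module; the attention implementation used by the released
# library does not accept that kwarg, which appears to be what raises the TypeError.
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```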


hyppo6688 commented 2 months ago

`pip install transformers==4.42.4` can fix it

hyppo6688 commented 2 months ago

The library's updates are not keeping pace; the transformers package moves too quickly. Versions up to 4.42.4 are supported, but 4.43.0 and later break.
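If it helps, a small guard makes the constraint explicit before loading a model. This is just an illustration; the 4.42.4 bound comes from this thread, not from an official compatibility matrix.

```python
# Illustrative guard only: the <= 4.42.4 bound is taken from this thread,
# not from an official compatibility statement.
from packaging.version import Version

import transformers

LAST_KNOWN_GOOD = Version("4.42.4")
installed = Version(transformers.__version__)

if installed > LAST_KNOWN_GOOD:
    raise RuntimeError(
        f"transformers {installed} is newer than {LAST_KNOWN_GOOD}; the released "
        "intel-npu-acceleration-library may fail with \"LlamaAttention.forward() "
        "got an unexpected keyword argument 'position_embeddings'\". "
        "Pin transformers==4.42.4 or install the library from source."
    )
```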

hyppo6688 commented 2 months ago

My Python version is 3.9.

vunnyso commented 1 month ago

When running the Llama script from the README.md with Python 3.9 and `pip install transformers==4.42.4`, I got the error below:

`ModuleNotFoundError: No module named 'transformers.modeling_rope_utils'`
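As far as I can tell, `transformers.modeling_rope_utils` only exists in newer transformers releases (around 4.43+), so a source checkout of the library expects a newer transformers than the 4.42.4 pin. A quick diagnostic sketch, nothing library-specific:

```python
# Diagnostic sketch: report the installed transformers version and whether the
# modeling_rope_utils module (introduced around transformers 4.43) is available.
import importlib.util

import transformers

print("transformers version:", transformers.__version__)
print(
    "transformers.modeling_rope_utils available:",
    importlib.util.find_spec("transformers.modeling_rope_utils") is not None,
)
```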
kv244 commented 1 month ago

Same error here: no module named `transformers.modeling_rope_utils`.

Escape9002 commented 1 month ago

I tried the full example from the NPU Library documentation (https://intel.github.io/intel-npu-acceleration-library/llm.html) with `pip install transformers==4.42.4` and Python 3.12.6, and it works.

It seems the examples from the git repo and the code from the pip install do not work together yet; at least that's what I understood from #108.
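A quick way to tell which code you are actually running is to check where the module is imported from and what version pip thinks is installed. This assumes the distribution is published on PyPI as `intel-npu-acceleration-library`:

```python
# Sketch: print the installed package version and where the module is imported from,
# to distinguish a PyPI install from a source checkout. Assumes the distribution is
# published on PyPI as "intel-npu-acceleration-library".
from importlib.metadata import PackageNotFoundError, version

import intel_npu_acceleration_library as npu_lib

try:
    print("installed version:", version("intel-npu-acceleration-library"))
except PackageNotFoundError:
    print("package metadata not found (likely a source checkout on sys.path)")
print("imported from:", npu_lib.__file__)
```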

justindpnt commented 1 month ago

Same issue. Any updates on this would be great! It prevents you from using the newer models.

YadominJinta commented 2 weeks ago

It has been fixed but not released yet; you need to install the library from source.