andytriboletti closed this issue 2 weeks ago.
Make sure your transformers version is >= 4.39.1.
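A quick way to check is to print the version from the same Python interpreter that runs example.py; a minimal sketch:

import transformers
print(transformers.__version__)  # should print 4.39.1 or newer
# if it is older, upgrade inside this environment, e.g. pip install -U "transformers>=4.39.1"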
I have transformers 4.39.1:
(newtest) andy@andys-pc:~/qwen$ pip show transformers
Name: transformers
Version: 4.39.1
Summary: State-of-the-art Machine Learning for JAX, PyTorch and TensorFlow
Home-page: https://github.com/huggingface/transformers
Author: The Hugging Face team (past and future) with the help of all our contributors (https://github.com/huggingface/transformers/graphs/contributors)
Author-email: transformers@huggingface.co
License: Apache 2.0 License
Location: /home/andy/.local/lib/python3.10/site-packages
Requires: filelock, huggingface-hub, numpy, packaging, pyyaml, regex, requests, safetensors, tokenizers, tqdm
Required-by: outlines, vllm
(newtest) andy@andys-pc:~/qwen$ python example.py
Traceback (most recent call last):
  File "/home/andy/qwen/example.py", line 5, in <module>
    model = AutoModelForCausalLM.from_pretrained(
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/andy/miniconda3/envs/newtest/lib/python3.11/site-packages/transformers/models/auto/auto_factory.py", line 526, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/andy/miniconda3/envs/newtest/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 1064, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
                   ~~~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/andy/miniconda3/envs/newtest/lib/python3.11/site-packages/transformers/models/auto/configuration_auto.py", line 761, in __getitem__
    raise KeyError(key)
KeyError: 'qwen2'
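Note that pip show above reports a transformers installation under /home/andy/.local/lib/python3.10/site-packages, while the traceback imports transformers from /home/andy/miniconda3/envs/newtest/lib/python3.11/site-packages, so the interpreter running example.py may be loading a different (possibly older) copy. A minimal sketch to confirm which copy is actually imported, run with the same interpreter as example.py:

import sys
import transformers

print(sys.executable)            # which python binary is running
print(transformers.__version__)  # version of the copy that gets imported
print(transformers.__file__)     # which site-packages it was loaded from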
I was able to install all of the pip requirements on Windows under WSL (Ubuntu).
After running pip install -r requirements.txt, I tried the example.py script from the README.
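(For context, the traceback above points at line 5 of example.py, the AutoModelForCausalLM.from_pretrained call. The loading portion of the README example presumably looks roughly like the sketch below; the exact model id is an assumption for illustration.)

from transformers import AutoModelForCausalLM, AutoTokenizer

# model id is assumed for illustration; the README may use a different checkpoint
model_id = "Qwen/Qwen1.5-7B-Chat"
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_id)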
The output: