robertgshaw2-neuralmagic opened 6 months ago
The reason for the issue is that the latest transformers release (4.39.3) does not include Qwen2MoE support yet.
Once the next release is pushed, this will be resolved.
If someone stumbles on this and can't wait, you can install transformers from source :)
git clone https://github.com/huggingface/transformers
cd transformers
pip install -e .
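After installing from source, you can confirm whether your environment actually picked up Qwen2MoE support. A minimal sketch, assuming the model class is named `Qwen2MoeForCausalLM` (as in later transformers releases; not present in 4.39.3):

```python
def has_qwen2_moe() -> bool:
    """Return True if the installed transformers exposes the Qwen2MoE model class."""
    try:
        # Class name assumed from later transformers releases; it is
        # absent in 4.39.3, so this import fails there.
        from transformers import Qwen2MoeForCausalLM  # noqa: F401
        return True
    except ImportError:
        # transformers is missing or too old
        return False
```

If this returns False after the editable install, the old pip-installed version is likely still shadowing the source checkout.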
> Reason for issue is that the latest transformers version 4.39.3 does not have Qwen2MoE support yet. Once they push up the next release this will be resolved.
This is clearly at odds with what Qwen officials say.
You can just look at huggingface/transformers to see that Qwen2 does exist in 4.39.3, but Qwen2MoE does not.
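The distinction above can be checked directly against the local install, since transformers keeps a registry of model types in its auto-config mapping. A sketch, assuming the registry names `qwen2` and `qwen2_moe` as used in the transformers codebase:

```python
def is_registered(model_type: str) -> bool:
    """Check whether a model type (e.g. 'qwen2' or 'qwen2_moe') is
    registered in the locally installed transformers auto-config mapping."""
    try:
        from transformers.models.auto.configuration_auto import CONFIG_MAPPING_NAMES
        return model_type in CONFIG_MAPPING_NAMES
    except ImportError:
        # transformers not installed
        return False
```

On 4.39.3 this is expected to return True for "qwen2" and False for "qwen2_moe", matching the observation above.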