VincyZhang / intel-extension-for-transformers

Extends Hugging Face transformers APIs for Transformer-based models and improves the productivity of inference deployment. With extremely compressed models, the toolkit can greatly improve inference efficiency on Intel platforms.
Apache License 2.0

missing dependencies #31

Closed VincyZhang closed 5 months ago

VincyZhang commented 5 months ago

**Describe the issue**

When I use `pip install intel-extension-for-transformers` in a fresh conda environment, some packages are missing that I have to install manually before I can run a model.

Please add these to setup.py as deps!
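A minimal sketch of what the fix could look like on the packaging side: declaring the runtime dependencies in `setup.py` via `install_requires`, so that `pip install intel-extension-for-transformers` pulls them in automatically. The package name entries below are placeholders, since the issue does not list the specific missing packages; the maintainers would fill in the actual ones.

```python
# Hypothetical setup.py fragment -- the dependency entries are
# placeholders, not the actual missing packages from this issue.
from setuptools import setup, find_packages

setup(
    name="intel-extension-for-transformers",
    packages=find_packages(),
    install_requires=[
        # Replace with the packages that currently have to be
        # installed by hand in a fresh environment:
        "some-missing-dependency>=1.0",
    ],
)
```

With dependencies declared this way, pip resolves and installs them during `pip install`, so a fresh environment works out of the box.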

To reproduce, simply:

```python
from intel_extension_for_transformers.transformers import AutoModelForCausalLM

model_name = "meta-llama/Llama-2-7b-hf"
model = AutoModelForCausalLM.from_pretrained(model_name, load_in_4bit=True)
```