intel / intel-extension-for-pytorch

A Python package that extends the official PyTorch to easily obtain performance on Intel platforms
Apache License 2.0

Any plans to add a vision language model (e.g. Llama 3.2)? #716

Open asmith26 opened 1 week ago

asmith26 commented 1 week ago

Hi there,

Just wondering if you have plans to add fine-tuning/inference support for VLMs like Llama 3.2 (https://ai.meta.com/blog/llama-3-2-connect-2024-vision-edge-mobile-devices/)?

Many thanks for any help! :)

ZailiWang commented 1 week ago

We support the Llama 3.2 VLM and LLM models. Please check the link: https://github.com/intel/intel-extension-for-pytorch/blob/2.6-llama-3/docs/index.md#211-run-vision-text-generation-with-llama-32-11b-models-using-bf16-autotp-tensor-parallel-on-multiple-cpu-numa-nodes
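For reference, a minimal sketch of what BF16 CPU inference on a Llama 3.2 vision model with IPEX might look like. This is not the documented run script from the link above (which uses dedicated example scripts and AutoTP across NUMA nodes); the model id, image path, and prompt format are placeholders, and whether `ipex.llm.optimize` covers this architecture in your installed version is an assumption to verify against the docs.

```python
# Hypothetical sketch: BF16 CPU inference with IPEX on a Llama 3.2 vision model.
# Model id, image path, and prompt format below are illustrative placeholders.
import torch
import intel_extension_for_pytorch as ipex
from PIL import Image
from transformers import AutoProcessor, MllamaForConditionalGeneration

model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"  # assumed model id
processor = AutoProcessor.from_pretrained(model_id)
model = MllamaForConditionalGeneration.from_pretrained(
    model_id, torch_dtype=torch.bfloat16
)
model.eval()

# Apply IPEX's LLM-specific CPU optimizations for BF16 inference
# (assumes this model family is covered by ipex.llm.optimize in your version).
model = ipex.llm.optimize(model, dtype=torch.bfloat16, inplace=True)

# Single-image, single-prompt generation; for instruct models the processor's
# chat template is usually preferred over a hand-written prompt string.
image = Image.open("example.jpg")  # placeholder image path
prompt = "<|image|><|begin_of_text|>Describe this image."
inputs = processor(images=image, text=prompt, return_tensors="pt")

with torch.no_grad(), torch.autocast("cpu", dtype=torch.bfloat16):
    output = model.generate(**inputs, max_new_tokens=64)
print(processor.decode(output[0], skip_special_tokens=True))
```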

asmith26 commented 1 week ago

Thanks for the info @ZailiWang. It looks like this might only be for CPU (apologies if I'm misunderstanding)? Thanks again

ZailiWang commented 6 days ago

Hi @asmith26, yes, IPEX support for Llama 3.2 is CPU-only. Please check this page for Gaudi or AIPC (Core Ultra or discrete Arc graphics) solutions. Thanks.