bigscience-workshop / petals

🌸 Run LLMs at home, BitTorrent-style. Fine-tuning and inference up to 10x faster than offloading
https://petals.dev
MIT License

Feature Request - Support for VLMs #616

Closed asmit203 closed 3 weeks ago

asmit203 commented 3 weeks ago

Feature request: State-of-the-art vision-language models (VLMs) are getting bigger and bigger, the most recent being Aria. It would therefore be great to add support for VLMs as well.

I would also like to contribute to this — if that's possible, let me know.