NexaAI / Awesome-LLMs-on-device

Awesome LLMs on Device: A Comprehensive Survey
MIT License
762 stars · 106 forks

Introducing another on-device LLM framework #7

Closed · xumengwei closed this issue 5 days ago

xumengwei commented 6 days ago

Impressive work! The survey is very helpful for the community. I've asked my students to read it carefully 👍

I'd like to introduce our on-device LLM framework: mllm, a fast and lightweight (multimodal) LLM inference engine for mobile and edge devices. It would be much appreciated if you could include it in the survey so that more people can see it :)

zhiyuan8 commented 6 days ago

I have tried mllm on my Android phone; it is impressive, and we will include it in our arXiv paper. Prof. Xu, please have your student submit a PR to add your work to our GitHub repo. We look forward to more collaborations in the near future.