MobileLLM / AutoDroid

Source code for the paper "Empowering LLM to use Smartphone for Intelligent Task Automation"
https://arxiv.org/abs/2308.15272
MIT License

How to use a local llm or open source model #13

Closed alexsiu398 closed 7 months ago

alexsiu398 commented 8 months ago

I saw the research paper mentioned "The local LLM Vicuna-7B [6] is deployed on the smartphone based on Machine Learning Compilation for LLM (MLCLLM)". How to setup the local llm and run the ./scripts with local llm?

wenh18 commented 8 months ago

We deployed Vicuna-7B on a smartphone only to evaluate its per-step inference latency; we did not run the whole AutoDroid pipeline on it, as we believe doing so requires only engineering effort. You can refer to our artifacts repo https://github.com/autodroid-sys/artifacts/tree/main/Fastchat for fine-tuning Vicuna-7B.

alexsiu398 commented 7 months ago

The local LLM can be hosted on a computer instead of the smartphone.
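For anyone wanting to try this, here is a minimal sketch of calling a Vicuna-7B instance hosted on a computer. It assumes the model is served through an OpenAI-compatible endpoint (e.g. FastChat's `openai_api_server`, which listens on port 8000 by default); the endpoint URL, model name, and helper functions below are illustrative assumptions, not part of AutoDroid itself:

```python
import json
import urllib.request

# Assumed endpoint of a locally hosted, OpenAI-compatible server
# (FastChat's openai_api_server defaults to port 8000).
API_URL = "http://localhost:8000/v1/chat/completions"

def build_chat_request(prompt, model="vicuna-7b-v1.5", temperature=0.0):
    """Build an OpenAI-compatible chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

def query_local_llm(prompt):
    """POST the prompt to the local server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

With such a server running, AutoDroid's scripts could be pointed at `API_URL` in place of a remote LLM API; the phone then only needs network access to the computer hosting the model.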