pytorch / executorch

On-device AI across mobile, embedded and edge for PyTorch
https://pytorch.org/executorch/

Support Phi 3 model #3550

Open iseeyuan opened 4 months ago

iseeyuan commented 4 months ago

Given the limited memory on most phones, there have been community requests to support a smaller model like Phi-3 mini. It may be supported out of the box, but it needs verification, evaluation, and profiling.

mikekgfb commented 4 months ago

Exciting. Can we try this in torchchat?

I was also looking at how to run a model on watchOS. We have iOS and macOS with ET now, so I'm looking for the next exciting platform. And then... max size: 75 MB?!

This might be an interesting experiment. It may also be an exciting item for a community member to prototype!

salykova commented 3 months ago

@iseeyuan

In addition to Phi-3, here is a list of some of the most popular tiny LLMs used within the open-source community:

  1. OpenElm https://huggingface.co/apple/OpenELM
  2. TinyLlama 1.1B https://github.com/jzhang38/TinyLlama
  3. StableLM 1.6B and 3B https://huggingface.co/stabilityai/stablelm-2-zephyr-1_6b https://huggingface.co/stabilityai/stablelm-2-1_6b-chat https://huggingface.co/stabilityai/stablelm-zephyr-3b
  4. RWKV 1.6B and 3B models https://huggingface.co/RWKV/rwkv-5-world-3b https://huggingface.co/RWKV/rwkv-6-world-3b https://huggingface.co/RWKV/rwkv-6-world-1b6 https://huggingface.co/RWKV/rwkv-5-world-1b5
  5. Qwen 1.8B and 4B models https://huggingface.co/Qwen/Qwen1.5-1.8B-Chat https://huggingface.co/Qwen/Qwen1.5-4B-Chat

TinyLlama should work out of the box, since it uses the same model architecture and tokenizer as Llama 2. TinyLlama is of particular interest because almost every mobile phone can run it.
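A rough back-of-the-envelope estimate supports the "almost every phone" claim: at 4-bit quantization, the weights of a 1.1B-parameter model are around half a gigabyte. A minimal sketch of that arithmetic (plain Python, not an ExecuTorch API; it ignores KV cache, activations, and quantization group metadata):

```python
def weights_size_mb(n_params: float, bits_per_weight: float) -> float:
    """Approximate size of the model weights alone, in megabytes.

    Ignores runtime overheads such as the KV cache, activations,
    and per-group quantization scales/zero-points.
    """
    return n_params * bits_per_weight / 8 / 1e6

# TinyLlama 1.1B at common precisions
print(weights_size_mb(1.1e9, 16))  # fp16: ~2200 MB
print(weights_size_mb(1.1e9, 4))   # int4: ~550 MB
```

At ~550 MB for 4-bit weights, the model fits within the memory budget of most recent phones, whereas the fp16 checkpoint would already strain lower-end devices.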

helunwencser commented 3 months ago

I am working on getting Phi-3 enabled via ExecuTorch. Will post here once I finish it. After that, we can try it on TorchChat as well.

iseeyuan commented 3 months ago

@salykova Thank you for the list! We picked Phi-3 because it's relatively new and popular, but we are definitely considering enabling other models. The long-term goal is to improve our infrastructure while enabling the first couple of models, so that the community can use that infrastructure to enable others. With that said, you are welcome to enable other models, and feel free to submit your PR!

devYonz commented 3 months ago

Would love to see 27 tokens/s on mobile with ExecuTorch, like Phi Silica: https://learn.microsoft.com/en-us/windows/ai/apis/phi-silica

helunwencser commented 4 weeks ago

We are able to export and run phi-3-mini on ExecuTorch. Please follow the README here.