mistralai / mistral-inference

Official inference library for Mistral models
https://mistral.ai/
Apache License 2.0

TinyMistral? small llm for phones and computers with no gpu? #110

Open agonzalezm opened 6 months ago

agonzalezm commented 6 months ago

Hi, is there any plan to release a well-performing small model (1B/2B/3B), like TinyLlama, phi-2, etc.?

Many people want to run open-source LLMs locally for specific tasks but have no GPU. Small models that can run inference fast enough on low-resource hardware (smartphones, machines without a GPU, etc.) are in high demand and can also be fine-tuned for specific tasks.

Thanks