Open · NikosDi opened this issue 1 week ago
Sorry, the built-in "LLM picker" should have provided Mistral. We'll fix that and release a patched version. https://github.com/intel/AI-Playground/commit/c2a86037b946fe3b6a7227805d35e9893d683f06
Thanks for that.
But that doesn't answer the general question: how can someone install a model other than those offered by the "LLM picker"?
Where can I find instructions?
You can try downloading model files from Hugging Face into a dedicated folder under <InstallDir>\resources\service\models\llm\checkpoints. You should then be able to pick your own model in the ANSWER tab -- however, it may not work, because we haven't validated those models. Feel free to file issues, of course.
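For example, here is a minimal Python sketch using huggingface_hub (not validated with AI Playground; the model ID and folder name are only illustrations, and you'd replace <InstallDir> with your actual install path):

```python
# Sketch only: fetch a model repo into AI Playground's checkpoints folder.
# Assumes `pip install huggingface_hub`; model ID and paths are examples.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="mistralai/Mistral-7B-Instruct-v0.2",  # example model ID
    local_dir=r"<InstallDir>\resources\service\models\llm\checkpoints\Mistral-7B-Instruct-v0.2",
)
```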
@Nuullll It seems that in order to download a model, e.g. Mistral, you have to sign up to Hugging Face, create a token, and find a suitable app or run a script so the model downloads in a reasonable time (using multithreaded download).
And in the end it might not even work with AI Playground (not referring to Mistral, obviously).
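For reference, a rough sketch of such a download script, assuming huggingface_hub plus the optional hf_transfer package for faster multithreaded transfers, and a token in an environment variable for gated repos (model ID and path are placeholders; untested with AI Playground):

```python
# Sketch only: authenticated, multithreaded download via hf_transfer.
# Assumes `pip install huggingface_hub hf_transfer` and an HF_TOKEN
# environment variable for gated repos; paths/IDs are placeholders.
import os

os.environ["HF_HUB_ENABLE_HF_TRANSFER"] = "1"  # must be set before importing huggingface_hub

from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="mistralai/Mistral-7B-Instruct-v0.2",
    local_dir=r"<InstallDir>\resources\service\models\llm\checkpoints\Mistral-7B-Instruct-v0.2",
    token=os.environ.get("HF_TOKEN"),  # only needed for gated models such as Gemma
)
```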
I think I'll wait for your next version with Mistral as an option in the "LLM picker".
Thank you
Hello. I downloaded AI Playground v1.22.1 for desktop GPUs, which has a built-in "LLM picker", but unfortunately the dGPU version does not offer the Mistral model mentioned in the release notes (it has a download link for Gemma 7B, a gated LLM with a 47.7 GB download, but not for Mistral).
I tried to find instructions on where to download and how to install Mistral, or any other LLM compatible with Hugging Face Transformers 4.39 / PyTorch.
So, how do I use Mistral with the AI Playground v1.22.1 Arc dGPU version?
Thank you