Closed JohnClaw closed 3 weeks ago
Hello and welcome to the OpenLocalUI repository!
Thank you for your interest in contributing to our project. To get started, please review the following resources:
We appreciate your contributions and look forward to collaborating with you!
Best regards, The OpenLocalUI Team
Oh, I just read that it requires Ollama. Closing the issue.
Hello @JohnClaw! Thanks for your feedback.
In the future Ollama will be installed automatically by the setup wizard; for now you'll have to install it manually. However, I did a quick search in the Ollama model library, which lists the models available to pull, and the one you mentioned is not on the list. If you want to add a model from a GGUF file, you should refer to this page. You can still do that from inside the app (never tested): I think you'd need to place the GGUF in the app directory, omit the base model in the create dialog, and reference the downloaded file in the Modelfile (FROM ./NikolayKozloff/SambaLingo-Serbian-Chat-GGUF).
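For reference, the manual flow with the Ollama CLI could look roughly like this. This is an untested sketch; the GGUF filename below is a placeholder and should match whatever file you actually downloaded from the Hugging Face repo:

```shell
# Untested sketch: write a Modelfile pointing at the downloaded GGUF.
# The filename below is a placeholder; use the actual file you downloaded.
cat > Modelfile <<'EOF'
FROM ./sambalingo-serbian-chat.gguf
EOF

# Then register and run the model with Ollama (requires Ollama installed):
# ollama create sambalingo-serbian-chat -f Modelfile
# ollama run sambalingo-serbian-chat
```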
Taking note of that. I'll open a new issue to add the feature.
Hello, @JohnClaw! I've added the feature #38 you needed, hope this makes the process easier :)
Thanks. Please, compile the binary for Windows.
It should be a matter of a couple of days or less. We just want to fix the remaining bugs first.
I tried to download this GGUF: https://huggingface.co/NikolayKozloff/SambaLingo-Serbian-Chat-GGUF Unfortunately, OpenLocalUI can't do that for some reason. It just displays this dialog forever: What am I doing wrong? How can I fix that? Help me, please.