Closed — richardstevenhack closed this issue 1 month ago
Thanks for logging this, I suspect it's the same issue as https://github.com/sammcj/gollama/issues/82
I'll try to find time to look at it this week.
Sorry I've been snowed under with work, will get to this at some point!
No huge hurry, it's just part of my testing several GUI front ends and I don't want to eliminate LMStudio from consideration as long as it might be capable of using Ollama models via your link. That's LMStudio's biggest defect - no direct support for Ollama, unlike AnythingLLM and MSTY.
LM Studio is really nice in some ways, but in others... not so much: it's closed source, and its licensing prevents commercial use.
MSTY is closed source, too, but the features - especially being able to embed an Obsidian Vault in the vector database for RAG - makes it a winner regardless. AnythingLLM has decent RAG support as well. The main advantage of LMStudio is the ease of searching for and downloading the HuggingFace models. And it looks pretty - but then so does AnythingLLM. :-)
Msty is very visually pleasing, I like the concept of 'knowledge stacks' and as you say the obsidian DB use case is great.
AnythingLLM works well but damn it makes my eyes bleed with how ugly it is!
LOL, Matt Smith said the same in one of his videos. I'm pretty oblivious to that sort of thing. To me, dull gray is not that bad, although it's a little dark. I pretty much hate "dark modes" and much prefer a bright background - helps with old eyes. :-) LMStudio has the same gray, but at least they've got a lot of blue in there. MSTY is black on dull white, which is great - not too dark, not too bright, easy to read. In general, developers shouldn't design the look of an app, because 98% of them suck at it. :-)
Hey @richardstevenhack can you try out v1.27.1 with your LM-Studio version?
I'm running the v0.3.0 beta 2 of LM Studio and it's now fixed (as per https://github.com/sammcj/gollama/releases/tag/v1.27.1)
It's working fine now. Just tested it, LMStudio sees all my Ollama models now and I can load and use them.
Good job!
Bazinga! Great :)
Description
I had a mix-up with LMStudio 0.2.27 and the Ollama models directory, which I had relocated from the default Ollama location. So I removed the LMStudio 0.2.27 AppImage and installed the LMStudio 0.2.28 AppImage.
Upon executing "gollama -L" I see the list of Ollama models which are installed in the new Ollama models directory. But when I open LMStudio, it does not show the models.
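For anyone else who has relocated the models directory, a quick sanity check before linking is to confirm Ollama is actually reading the new location (Ollama honours the OLLAMA_MODELS environment variable). The path below is a hypothetical example, not my actual setup:

```shell
# Point Ollama at the relocated models directory (hypothetical path)
# and confirm the expected layout exists before linking anything.
export OLLAMA_MODELS="$HOME/ollama-models"
mkdir -p "$OLLAMA_MODELS/manifests" "$OLLAMA_MODELS/blobs"

# Ollama keeps manifests and blobs under this root; if these
# subdirectories are missing, gollama has nothing to link.
ls "$OLLAMA_MODELS"
```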
How to reproduce
1. Install the LMStudio 0.2.28 AppImage.
2. Run "gollama -L" to link all existing Ollama models into LMStudio's models directory.
3. Open LMStudio, go to AI Chat, and look for models to load.
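Step 2 essentially amounts to symlinking model files from the Ollama models directory into LM Studio's models directory so LM Studio can see them without copying. A minimal sketch of that linking, using hypothetical paths and a hypothetical model filename (gollama's actual directory layout may differ):

```shell
# Hypothetical locations for the two model stores.
OLLAMA_MODELS="$HOME/ollama-models"              # relocated Ollama models dir
LMSTUDIO_MODELS="$HOME/.cache/lm-studio/models"  # LM Studio's models dir

mkdir -p "$OLLAMA_MODELS" "$LMSTUDIO_MODELS/ollama"
touch "$OLLAMA_MODELS/llama3-8b.gguf"            # stand-in for a model blob

# Link the model file into LM Studio's tree instead of copying it.
ln -sf "$OLLAMA_MODELS/llama3-8b.gguf" \
       "$LMSTUDIO_MODELS/ollama/llama3-8b.gguf"

# Verify the link resolves back to the Ollama-owned file.
readlink -f "$LMSTUDIO_MODELS/ollama/llama3-8b.gguf"
```

If LM Studio still doesn't show the models after linking, checking that the symlinks exist and resolve (as the last command does) helps narrow down whether the problem is on the linking side or in how LM Studio scans its models directory.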
Screenshots / Logs
Environment
go version: go version go1.22.5 linux/amd64
Can you contribute?
Nope, no skill.