Closed: jakobklein closed this issue 2 weeks ago
Yeah, I removed all the stuff that uses the Ollama API as it was too buggy, and since Ollama has added an OpenAI-compatible API it isn't really needed anymore.
You can still use it with Ollama if you add `/v1` to the URL, but it won't show the list of models like it did before:

`http://localhost:11434/v1/`

or possibly:

`http://localhost:11434/v1`
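If you want to sanity-check that endpoint outside Eclipse first, here's a rough Java sketch of a single chat request against Ollama's OpenAI-compatible API (it assumes Ollama is running locally with `llama3.2` pulled; the class name and prompt are just illustrative):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical standalone check, not plugin code: send one chat request
// to Ollama's OpenAI-compatible endpoint. Assumes Ollama is running
// locally and `llama3.2` has been pulled; the prompt is just an example.
public class OllamaOpenAiCheck {
    public static void main(String[] args) throws Exception {
        String body = """
                {"model": "llama3.2",
                 "messages": [{"role": "user", "content": "Say hello"}]}""";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/v1/chat/completions"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer ollama") // Ollama ignores the key
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // raw JSON chat completion
    }
}
```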
I do plan to add the ability to get a list of models again, but the various OpenAI-compatible servers each have a slightly different way of listing the model names (a rough sketch of the common case is below).
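For reference, most OpenAI-compatible servers expose `GET {base}/v1/models`; the differences are mostly in the extra fields on each entry and in authentication. A hedged sketch (the base URL, key placeholder, and class name are just illustrative):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Hypothetical sketch: most OpenAI-compatible servers answer
// GET {base}/v1/models with {"object":"list","data":[{"id":"..."}, ...]},
// but the extra fields on each entry (and auth requirements) differ.
public class ListModels {
    public static void main(String[] args) throws Exception {
        String base = "http://localhost:11434/v1"; // swap in any compatible server
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(base + "/models"))
                .header("Authorization", "Bearer sk-...") // placeholder; hosted APIs need a real key
                .GET()
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // read data[].id for the model names
    }
}
```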
Once I've done that I will redo all the screenshots for GitHub, as there are quite a lot of improvements I've added since then, and a lot of other little tweaks.
Thanks a lot for the quick answer. Got it running with Ollama and `llama3.2:latest` and I am amazed by the features you added.
No problem and glad you got it going!
I also removed a lot of features from Wojciech's version: the `@Inject` dependency injection stuff, as it was a huge source of really weird `NullPointerException` bugs (roughly the failure mode sketched below)...
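A hypothetical illustration of how those NPEs arise (not the actual plugin code; `ModelClient` is a made-up placeholder type):

```java
import javax.inject.Inject; // jakarta.inject.Inject on newer Eclipse targets

// Hypothetical sketch: with Eclipse e4-style dependency injection, an
// @Inject field is only populated when the object is created through the
// DI container (e.g. ContextInjectionFactory). Instantiate the class with
// `new` instead and the field silently stays null.
public class CompletionHandler {

    /** Made-up placeholder type standing in for whatever gets injected. */
    public interface ModelClient {
        String send(String prompt);
    }

    @Inject
    private ModelClient client; // never assigned if we weren't container-created

    public String complete(String prompt) {
        // NullPointerException thrown here, far away from the real cause
        return client.send(prompt);
    }
}
```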
Sadly I think these bugs were probably 99% of the reason people struggled to get Wojciech's version working and why he might have stopped working on it :/

The only things left on my ToDo list of things to add are:
I've added back the ability to get the model list, but I've only tested it on `openrouter` and `openai`, so I'm not sure if it works with `ollama` or `llama.cpp`'s server...
Beware: the settings page is very buggy unless you quit and reopen it, and `Apply` and `Restore Defaults` don't work properly a lot of the time (I'll try to get to the bottom of this later).
I'd hold off as the current method isn't really usable at all:
Hi, I found your comments on the parent project (gradusnikov) very helpful and was hoping that your fork would incorporate these ideas.
I built the plugin as described, but the preferences dialog is all about OpenAI. Instead of "Ollama API Base Address" (like in the screenshot) I get "OpenAI API Base Address" and "OpenAI API Key". My local Ollama installation seems to be working correctly, since it responds on http://127.0.0.1:11434
I guess the latest sources (of the version shown in the screenshots on your GitHub project page) are not on GitHub, since I didn't find any reference to "ollama" either in PreferencePage.java or in the rest of the plugin code.