samy80 opened 1 month ago
Yeah, I am planning to allow the models to be changed via a drop-down selection just above the "user chat area", as I too have found myself changing models mid-discussion to take advantage of the strengths of different models (particularly now we have GPT-o1).
I should also point out that I completely ditched the Ollama stuff after finding so many bugs that I wasn't really sure what was getting sent back to the models :/ So the images you'll see in the readme are out of date and it currently just takes an OpenAI endpoint and key again... You can still use it via Ollama using the OpenAI-compatible API, but it doesn't fetch the model list or rely on the Ollama API anymore (I currently use it via: llama.cpp server for local models, and openai and/or openrouter for non-local models).
The reason I never packaged it is because I didn't really want to step on the toes of the original author :) I really just wrote this for myself and in the process stripped away a lot of his Java-specific stuff so I could use it with other languages (mainly C++, but recently I've used it a lot with Python). I just got carried away and added lots of other stuff: templated prompts, compare editor, spell checking and so on :D
It looks like he may have abandoned his project now, so if that is the case I will try and polish it up more and make a proper release version people can use.
Don't abandon ollama. It's a very viable alternative to commercial models. Plus lots of new models popping up every day. Definitely worth it. Also confidentiality... Maybe have a parallel dev branch? (indeed, I couldn't find the model list as shown in your screenshots...) And definitely go for packaging the plugin, please!
Don't worry, it will still work with Ollama via Ollama's "OpenAI-compatible API" endpoint; it just doesn't use the old/buggy Ollama API endpoint any more! :)
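For anyone wanting to try this, the trick is just that Ollama also serves the standard OpenAI wire format, so the same chat-completions payload works against it. A minimal sketch in Python (payload construction only, no network call); the base URL is Ollama's default and the model tag is a placeholder, so substitute whatever your instance actually runs:

```python
import json

# Ollama exposes an OpenAI-compatible API under /v1 on its default port.
# This URL is an assumption based on a default install; adjust host/port
# to match your setup.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, user_message: str) -> dict:
    """Build a standard OpenAI-style chat-completions payload.

    The same payload shape works against api.openai.com, openrouter,
    or a llama.cpp server, since they all speak the same wire format.
    """
    return {
        "model": model,
        "messages": [
            {"role": "user", "content": user_message},
        ],
    }

# "llama3" is a placeholder model tag, not a recommendation.
payload = build_chat_request("llama3", "Hello!")
body = json.dumps(payload)
print(body)
```

You would POST `body` to `BASE_URL + "/chat/completions"` with your usual HTTP client; only the base URL differs between providers.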
Sure, but what about the list of models?
Most of the OpenAI-compatible endpoints have a way to get the model list (usually v1/models), so it shouldn't be a problem.
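For reference, the v1/models endpoint returns a JSON "list" object whose `data` array holds one entry per model, so populating a drop-down is just a matter of pulling out the ids. A quick sketch in Python; the sample response below is made up for illustration, not from a real server:

```python
import json

# Illustrative body in the shape the OpenAI /v1/models endpoint returns:
# {"object": "list", "data": [{"id": ..., "object": "model"}, ...]}.
# The model ids here are invented for the example.
sample_response = json.dumps({
    "object": "list",
    "data": [
        {"id": "gpt-4-turbo", "object": "model"},
        {"id": "claude-3.5-sonnet", "object": "model"},
    ],
})

def extract_model_ids(raw: str) -> list[str]:
    """Pull the model ids out of a /v1/models response body."""
    return [entry["id"] for entry in json.loads(raw)["data"]]

print(extract_model_ids(sample_response))
# → ['gpt-4-turbo', 'claude-3.5-sonnet']
```

Openrouter and openai both follow this shape; whether ollama and llama.cpp's server populate `data` the same way is exactly what still needs testing.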
I've got a couple of other things on the go, but being able to select models is definitely my next priority: I'm currently using Claude Sonnet 3.5 via openrouter, GPT-4-Turbo via OpenAI, and a couple of other models via llama.cpp, and having to switch these is a lot of hassle!
I've added back the ability to get the model list but only tested it on openrouter and openai, so not sure if it works with ollama or llama.cpp's server...
Beware: the settings page is very buggy unless you quit and reopen it, and Apply and Restore Defaults don't work properly a lot of the time (I'll try to get to the bottom of this later).
I'll test it here on ollama and let you know.
Are you making any progress on packaging as a plugin? It's just cumbersome having two instances of eclipse always running...
> I'll test it here on ollama and let you know.
I'd hold off as the current method isn't really usable at all:
https://github.com/jukofyork/aiassistant/discussions/4
> Are you making any progress on packaging as a plugin? It's just cumbersome having two instances of eclipse always running...
I'll look into it after I've done the last few things on my ToDo list.
Hi! Very nice work you did with this plugin. Have you managed to package it for easy install yet?
I think it could be interesting to allow the user to switch models in the chat, or to have the same query re-evaluated by another model for alternatives.