jackschedel / KoalaClient

The best LLM API Playground Interface (for me)
https://client.koaladev.io/
Creative Commons Zero v1.0 Universal
26 stars 8 forks

Settings menu for custom model names #94

Closed imesha10 closed 4 months ago

imesha10 commented 6 months ago

Feature Request

Summary:
Allow users to input custom model names in the model selection dropdown or within the settings when choosing which AI model to interact with.

Problem:
Certain proxies for OpenAI services have been experiencing issues where the /models endpoint malfunctions or does not populate the model selection list as expected. As a result, users cannot select from the full range of available models, potentially limiting their access to specific or preferred AI models that are otherwise operational.

Proposed Solution:
Implement a feature within the chat interface that enables a user to add custom model names directly to the model selection dropdown. This could take the form of an input box within the dropdown itself, allowing for direct typing of the model name. Alternatively, a small section in the settings could be dedicated to managing custom model names where users could add or remove model names that they frequently use or wish to test.

Benefits:

jackschedel commented 6 months ago

@imesha10 Hi, I don't ever use custom proxies, but I'd be willing to accept a PR that implements this as a settings menu, like the one for the Prompt Library (ideally with the ability to hide certain default models in the dropdown), with the ability to specify token pricing and max tokens.

From a code standpoint, this will require refactoring away from the custom ModelChoice type, probably just to a standard string.
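
A minimal sketch of what that refactor could look like. All names here (`ModelChoice`, `ModelConfig`, `modelConfigs`, `addCustomModel`) are illustrative assumptions, not KoalaClient's actual code:

```typescript
// Hypothetical "before": a closed union type can't represent custom models.
type ModelChoice = 'gpt-4' | 'gpt-3.5-turbo';

// Hypothetical "after": any string is a valid model id, with per-model
// settings (max tokens, pricing) stored in a lookup map instead.
interface ModelConfig {
  maxTokens: number;
  promptCostPer1k: number; // USD per 1k prompt tokens
  completionCostPer1k: number; // USD per 1k completion tokens
}

const modelConfigs: Record<string, ModelConfig> = {
  'gpt-4': { maxTokens: 8192, promptCostPer1k: 0.03, completionCostPer1k: 0.06 },
};

// Users could register custom model names at runtime via a settings menu.
function addCustomModel(id: string, config: ModelConfig): void {
  modelConfigs[id] = config;
}

addCustomModel('my-proxy-model', {
  maxTokens: 4096,
  promptCostPer1k: 0,
  completionCostPer1k: 0,
});
```

With a plain-string model id, the dropdown can render `Object.keys(modelConfigs)` and accept arbitrary entries, which is what makes custom endpoints workable.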

imesha10 commented 6 months ago

> @imesha10 Hi, I don't ever use custom proxies, but I'd be willing to accept a PR for this implemented as a settings menu for it like there is with Prompt Library (ideally with the ability to hide certain default ones in the dropdown.)
>
> From a code standpoint, this will require refactoring away from the custom ModelChoice type, probably just to a standard string.

Alright, I'll try to do a PR in the coming weeks. I have very little experience with Node and related tooling, but I'll be able to figure it out.

Please close this issue when you see it, and I will open a pull request once I'm done.

If I don't then assume that I failed.

(Oh, and by "proxies" I meant custom OpenAI endpoints.)

Thanks!

jackschedel commented 6 months ago

I'll leave the issue open, since I think it is a good change. (I have a lot of issues open that I'll never spend the time to implement, but maybe someone else will take pity on me and help xD)

Please let me know if you need any help navigating the codebase, I know learning React can be pretty daunting for beginners.

jackschedel commented 5 months ago

Btw, I'm working on this myself now, so no need to try. Development is occurring on the 2.0.9 branch.

imesha10 commented 5 months ago

> btw, i'm working on this myself now, so no need to try. development is occurring on the 2.0.9 branch

Ok that's good. Thank you.

imesha10 commented 5 months ago

@jackschedel Oh, by the way, in case you didn't know: the default OpenAI endpoint and most OpenAI proxies give a list of all the available models via the /v1/models endpoint. So all you have to do is send a call to it, and it will return a JSON object that you can then parse into the application. This is how applications like SillyTavern work.

You can find more info here: https://platform.openai.com/docs/api-reference/models

If you want to see examples in a different language, use the language dropdown in the docs.
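
A minimal sketch of how a client could consume that endpoint. The response shape follows the OpenAI API reference (a `list` object with a `data` array of models); the helper name and sample payload are illustrative:

```typescript
// Shape of a GET {baseUrl}/v1/models response, per the OpenAI API reference.
interface ModelList {
  object: string; // "list"
  data: { id: string; object: string; owned_by: string }[];
}

// Extract and sort the model ids so they can populate a dropdown.
function extractModelIds(list: ModelList): string[] {
  return list.data.map((m) => m.id).sort();
}

// Example payload like the one returned by the endpoint when called
// with an "Authorization: Bearer <key>" header.
const sample: ModelList = {
  object: 'list',
  data: [
    { id: 'gpt-4', object: 'model', owned_by: 'openai' },
    { id: 'gpt-3.5-turbo', object: 'model', owned_by: 'openai' },
  ],
};

console.log(extractModelIds(sample)); // ['gpt-3.5-turbo', 'gpt-4']
```

Since proxies reuse this response shape, the same parsing code works against any compatible endpoint; the thread's complaint is about proxies where this endpoint is broken, which is what the custom-model-name fallback addresses.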

jackschedel commented 5 months ago

I'm instead implementing the ability to do fully custom model names, so KoalaClient can be used with any endpoint.

imesha10 commented 5 months ago

> I'm instead implementing the ability to do fully custom model names, so KoalaClient can be used with any endpoint.

Yeah, it's definitely a very useful feature to be able to enter custom model names so the client can be used with any endpoint type, and it basically fixes my problem of being unable to use certain models.

However, I should mention that after looking at this project (KoalaClient) and using it for a while, I have some thoughts. After this feature (or a few features down the line), you should consider auto-populating the model list for endpoints that support it.

I'm taking a blind guess, but most of your users are probably OpenAI (85%), Anthropic/Claude (10%), and other (5%).

So if you implement this later, you would be able to give most of your users all the models they have access to (all OpenAI models from /v1/models). That way, most users won't have to go into the API connection menu and manually input their specific models, which will make the app feel simple and very easy to use.

Currently, KoalaClient can't seem to fetch other OpenAI models and is limited to its built-in model list.

Here is a similar feature in SillyTavern, where it auto-fetches the list via a proxy.

I just compiled the latest version 3.0.9 and it does not seem to have the model "gpt-4-0125-preview". So if you auto-populate the list, you won't have to constantly update it with the latest and best models, at least for OpenAI.

Anyway, these are just thoughts and I might be wrong or thinking in the wrong direction. Thanks for your time.

jackschedel commented 5 months ago

Thanks for your input. Btw, 2.1.0 is still in development, not in a working state yet.