jackschedel / KoalaClient

The best LLM API Playground Interface (for me)
https://client.koaladev.io/
Creative Commons Zero v1.0 Universal

Is it possible to support OpenRouter models? #103

Closed: endolith closed this issue 4 months ago

endolith commented 5 months ago

There are a bunch of models on https://openrouter.ai/docs#models, some of which are free. Is it possible to support them?
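
For what it's worth, OpenRouter's chat endpoint appears to be OpenAI-compatible, so a raw request looks roughly like this (a minimal sketch, assuming an `OPENROUTER_API_KEY` environment variable; the model id is just one example from their list):

```typescript
// Minimal sketch of an OpenRouter chat completion request (OpenAI-style API).
// The model id below is only an example; see openrouter.ai/docs#models.
async function openRouterChat(prompt: string): Promise<string> {
  const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'mistralai/mistral-7b-instruct', // example model id
      messages: [{ role: 'user', content: prompt }],
    }),
  });
  const data = await response.json();
  return data.choices[0].message.content;
}
```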

jackschedel commented 5 months ago

I would absolutely love to make this change, as well as add support for using it with LangChain. I've been planning on doing it for a while.

They both use slightly different APIs, so it's not as simple as adding support for custom model name strings in the config (and setting a custom endpoint).

University has started back up for me, so the pace of KoalaClient development will slow again, but I'll try to get around to implementing this in the next month or so. Or, you can always submit a PR :)

jackschedel commented 5 months ago

Need to add switching between different API standards + #94 in a single release; both are needed for this to properly work with OpenRouter/LangChain.
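
Roughly the shape I have in mind for the switch (a sketch only, with illustrative names rather than actual KoalaClient code):

```typescript
// Sketch: one adapter per API standard, selected by the model/endpoint config.
// Names and types are illustrative, not the actual KoalaClient implementation.
type ApiStandard = 'openai' | 'openrouter';

interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

interface ChatAdapter {
  endpoint: string;
  headers(apiKey: string): Record<string, string>;
  body(model: string, messages: ChatMessage[]): unknown;
}

const adapters: Record<ApiStandard, ChatAdapter> = {
  openai: {
    endpoint: 'https://api.openai.com/v1/chat/completions',
    headers: (key) => ({ Authorization: `Bearer ${key}`, 'Content-Type': 'application/json' }),
    body: (model, messages) => ({ model, messages }),
  },
  openrouter: {
    endpoint: 'https://openrouter.ai/api/v1/chat/completions',
    headers: (key) => ({ Authorization: `Bearer ${key}`, 'Content-Type': 'application/json' }),
    body: (model, messages) => ({ model, messages }),
  },
  // LangChain would go through its own client library rather than a raw HTTP
  // endpoint, so it needs a separate code path rather than another entry here.
};
```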

endolith commented 5 months ago

> Or, you can always submit a PR :)

I know nothing about TypeScript. :(

jackschedel commented 4 months ago

custom endpoints/models in 2.1.0 :)
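
For OpenRouter that should just mean pointing the custom endpoint at their API and entering one of their model ids, something like this (values are illustrative; the actual settings live in the client UI):

```typescript
// Illustrative values only; configure these through the KoalaClient settings.
const customEndpoint = 'https://openrouter.ai/api/v1/chat/completions';
const customModel = 'mistralai/mistral-7b-instruct'; // any id from openrouter.ai/docs#models
```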

endolith commented 4 months ago

Are "Max tokens" and "Max context" swapped in the configuration screen? GPT-4 shows 128000 max tokens and 4096 max context, which seems backwards.

jackschedel commented 4 months ago

Oops, yeah. This will be fixed in 2.1.0b #115.
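
For reference, the intended values for GPT-4 Turbo, with illustrative field names:

```typescript
// GPT-4 Turbo: 128000 is the context window, 4096 is the per-response cap.
// Field names are illustrative, not the exact config keys.
const gpt4TurboLimits = {
  maxContext: 128000, // total tokens the model can attend to (prompt + completion)
  maxTokens: 4096,    // maximum tokens generated per response
};
```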