endolith closed this issue 4 months ago
I would absolutely love to make this change, as well as allow support for using it with langchain. I've been planning on doing it for a while.
They each use a slightly different API, so it's not as simple as adding support for custom model-name strings in the config (and setting it as a custom endpoint).
University has started back up for me so the rate of development for KoalaClient will slow again, but I'll try to get around to implementing this in the next month or so. Or, you can always submit a pr :)
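For reference, OpenRouter exposes an OpenAI-style chat-completions endpoint, so a custom-endpoint setting could in principle just swap the base URL and model ID. A minimal sketch, assuming OpenRouter's documented base URL; `buildChatRequest` is a hypothetical helper, not existing KoalaClient code:

```typescript
// Sketch: build a fetch request for OpenRouter's OpenAI-compatible
// chat-completions endpoint. Hypothetical helper, not KoalaClient code.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildChatRequest(
  apiKey: string,
  model: string,
  messages: ChatMessage[]
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    // Documented OpenRouter base URL + OpenAI-style path
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      // Same request body shape as the OpenAI chat API
      body: JSON.stringify({ model, messages }),
    },
  };
}

// Usage (network call, not run here):
// const { url, init } = buildChatRequest(key, "mistralai/mistral-7b-instruct", msgs);
// const res = await fetch(url, init);
```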
Switching between different API standards needs to ship together with #94 in a single release; both are required to properly work with OpenRouter/LangChain.
> Or, you can always submit a pr :)
I know nothing about typescript. :(
custom endpoints/models in 2.1.0 :)
Are "Max tokens" and "Max context" swapped in the configuration screen? GPT-4 says 128000 Max tokens and 4096 max context, which seems backwards
oops, yeah. this will be fixed in 2.1.0b #115
There are a bunch of models on https://openrouter.ai/docs#models, some of which are free. Is it possible to support them?
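For what it's worth, OpenRouter also serves its model catalog as JSON from `https://openrouter.ai/api/v1/models`, so free models could be filtered client-side. A rough sketch, assuming the documented response shape (a `data` array whose entries carry `id` and per-token `pricing` strings); `freeModelIds` is a made-up name:

```typescript
// Sketch: pick out free models from an OpenRouter /api/v1/models
// response. The response shape (data[].id, data[].pricing) follows
// OpenRouter's docs; verify against the live API before relying on it.
interface ModelEntry {
  id: string;
  pricing?: { prompt?: string; completion?: string };
}

function freeModelIds(response: { data: ModelEntry[] }): string[] {
  return response.data
    // Free models advertise zero cost for both prompt and completion
    .filter((m) => m.pricing?.prompt === "0" && m.pricing?.completion === "0")
    .map((m) => m.id);
}

// Usage (network call, not run here):
// const res = await fetch("https://openrouter.ai/api/v1/models");
// console.log(freeModelIds(await res.json()));
```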