OpenRouter's spec does not perfectly conform to OpenAI's, so it has its own Ruby gem. But what's great about OpenRouter is that it supports TONS of models and seems to be a much more reliable, long-term way of running Llama 3, among other LLMs. Groq isn't a good option for someone who actually wants to use Llama 3 long term.
Obie started this task but didn't finish. He's the author of the OpenRouter gem, so we can ask him questions if we have any:
https://github.com/obie/hostedgpt/commit/250467d887a5c268e42fa79d1cdf606020b43aa7
These are the API docs I found: https://openrouter.ai/docs/requests
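For reference, the request shape in those docs closely mirrors OpenAI's chat completions format. Here's a minimal sketch of hitting the OpenRouter endpoint directly with plain Net::HTTP (no gem), assuming the `https://openrouter.ai/api/v1/chat/completions` endpoint and payload shape from the docs above, and an `OPENROUTER_API_KEY` environment variable; the model slug is just an illustrative example:

```ruby
require "json"
require "net/http"
require "uri"

# Build (but don't send) a chat completion request to OpenRouter.
# Payload shape per https://openrouter.ai/docs/requests (OpenAI-style).
def build_openrouter_request(model:, messages:)
  uri = URI("https://openrouter.ai/api/v1/chat/completions")
  req = Net::HTTP::Post.new(uri)
  req["Authorization"] = "Bearer #{ENV["OPENROUTER_API_KEY"]}"
  req["Content-Type"] = "application/json"
  req.body = JSON.generate(model: model, messages: messages)
  [uri, req]
end

uri, req = build_openrouter_request(
  model: "meta-llama/llama-3-70b-instruct", # example slug, check the models list
  messages: [{ role: "user", content: "Hello" }]
)

# To actually send it (needs a valid key):
# res = Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(req) }
# puts JSON.parse(res.body)
```

The point is that only the endpoint, auth header, and model slugs differ from a stock OpenAI call, which is why the gem's divergence is mostly around OpenRouter-specific extras rather than the core request.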
Additional considerations for full OSS LLM support: https://github.com/AllYourBot/hostedgpt/pull/389#issuecomment-2187189335