jakobdylanc / llmcord

A Discord LLM chat bot that supports any OpenAI compatible API (OpenAI, xAI, Mistral, Groq, OpenRouter, Ollama, LM Studio and more)
MIT License

openai.APIError #77

Closed jdlaci closed 3 days ago

jdlaci commented 3 days ago

I am experiencing an OpenAI error, but I am not using OpenAI. I am using OpenRouter with the Google model google/gemini-flash-1.5-exp (https://openrouter.ai/models?q=google).

Please help.

2024-11-23 17:13:02,391 INFO: HTTP Request: POST https://openrouter.ai/api/v1/chat/completions "HTTP/1.1 200 OK"
2024-11-23 17:13:02,753 ERROR: Error while generating response
Traceback (most recent call last):
  File "/home/--force-badname/llmcord/llmcord.py", line 199, in on_message
    async for curr_chunk in await openai_client.chat.completions.create(**kwargs):
  File "/home/--force-badname/.local/lib/python3.10/site-packages/openai/_streaming.py", line 147, in __aiter__
    async for item in self._iterator:
  File "/home/--force-badname/.local/lib/python3.10/site-packages/openai/_streaming.py", line 174, in __stream__
    raise APIError(
openai.APIError: Provider returned error
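Note: the traceback shows the error is raised inside the OpenAI Python client's streaming iterator after the initial POST returned 200, i.e. OpenRouter accepted the request and the upstream provider then failed mid-stream. A minimal sketch of reproducing and handling this outside llmcord, assuming OpenRouter's standard base URL and a hypothetical OPENROUTER_API_KEY environment variable (neither taken from llmcord's code):

```python
# Minimal sketch (not llmcord's code): stream a chat completion through
# OpenRouter with the OpenAI client and catch the APIError that the
# provider can raise mid-stream. OPENROUTER_API_KEY is a hypothetical
# environment variable; the model slug is the one from this issue.
import asyncio
import os

import openai

client = openai.AsyncOpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

async def main() -> None:
    try:
        stream = await client.chat.completions.create(
            model="google/gemini-flash-1.5-exp",
            messages=[{"role": "user", "content": "Hello"}],
            stream=True,
        )
        async for chunk in stream:
            # Some chunks carry no content; guard before printing.
            if chunk.choices and chunk.choices[0].delta.content:
                print(chunk.choices[0].delta.content, end="", flush=True)
    except openai.APIError as e:
        # The initial HTTP request can still return 200; the failure only
        # surfaces while iterating the stream, exactly as in the traceback.
        print(f"\nProvider error: {e}")

asyncio.run(main())
```

If a different model from the same endpoint streams fine, the failure is on the provider's side rather than in llmcord or the OpenAI client.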

jakobdylanc commented 3 days ago

Is that the full error? Just making sure you didn't cut anything out.

Have you tried other models from OpenRouter?
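For anyone hitting this later: trying another model only requires changing the model entry in llmcord's config.yaml. A sketch assuming the providers/model layout from the llmcord README, with placeholder values:

```yaml
# config.yaml (sketch, placeholders only)
providers:
  openrouter:
    base_url: https://openrouter.ai/api/v1
    api_key: YOUR_OPENROUTER_API_KEY

# model takes the form <provider name>/<model name>;
# swap the model part to test a different OpenRouter model.
model: openrouter/google/gemini-flash-1.5-exp
```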

jdlaci commented 3 days ago

Yes, that was the full error. I ended up trying a few other models and eventually found one that worked, so it seems the model itself wasn't working.

jdlaci commented 3 days ago

I was able to get it to work with a new model.