cheshire-cat-ai / core

Production-ready AI agent framework
https://cheshirecat.ai
GNU General Public License v3.0

[BUG] using Gemini LLM doesn't work in main branch #822

Open · mottolini opened this issue 1 month ago

mottolini commented 1 month ago

Describe the bug
After you insert your key in the configuration of both the LLM and the Embedder, whenever you try to get a response you receive the following error:

Invalid argument provided to Gemini: 400 * GenerateContentRequest.generation_config.stop_sequences: the number of stop_sequences must not exceed 5.

To Reproduce
This happens on branch main, commit c79272bcd3ae60a4037b5e2e42f12d4106f1d033. It appears to be already fixed on branch develop, commit 3860a486d56ef594df9f8c87f418e436e66420f3.
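
For context, the error comes from a hard limit on the Gemini side: GenerateContentRequest.generation_config.stop_sequences accepts at most 5 entries, so any longer stop list passed through to the model is rejected with a 400. Below is a minimal sketch of the kind of workaround involved, not the Cat's actual code; the trim_stop_sequences helper is hypothetical, and the langchain_google_genai usage is only an assumption for illustration.

```python
# Sketch: Gemini rejects requests whose generation_config.stop_sequences has
# more than 5 entries, so a stop list must be trimmed before the call.
from langchain_google_genai import ChatGoogleGenerativeAI

GEMINI_MAX_STOP_SEQUENCES = 5  # limit enforced by the Gemini API


def trim_stop_sequences(stop: list[str] | None) -> list[str] | None:
    """Hypothetical helper: keep at most 5 stop sequences."""
    if stop is None:
        return None
    return stop[:GEMINI_MAX_STOP_SEQUENCES]


llm = ChatGoogleGenerativeAI(model="gemini-pro", google_api_key="YOUR_KEY")

# A stop list with 6 entries would trigger the
# "stop_sequences must not exceed 5" error; trimming it beforehand avoids it.
stop = trim_stop_sequences(["Human:", "AI:", "\n\n", "###", "User:", "Assistant:"])
response = llm.invoke("Hello", stop=stop)
print(response.content)
```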

pieroit commented 1 month ago

@mottolini can you confirm it works with a VPN in develop? I gave more info in #827

pieroit commented 4 weeks ago

@mottolini can you check if the problem is still there in Cat 1.6.2?

Kleomen commented 2 days ago

> @mottolini can you check if the problem is still there in Cat 1.6.2?

It is even worse, because currently you cannot even set a Gemini API key on v1.6.2:

INFO:     172.17.0.1:46620 - "PUT /llm/settings/LLMOpenAIConfig HTTP/1.1" 400 Bad Request
INFO:     172.17.0.1:46814 - "GET /admin/favicon.ico HTTP/1.1" 200 OK
INFO:     172.17.0.1:46814 - "GET /admin/favicon.ico HTTP/1.1" 200 OK
INFO:     172.17.0.1:46814 - "GET /llm/settings HTTP/1.1" 200 OK
[2024-07-01 08:07:27.264] ERROR  cat.routes.llm..upsert_llm_setting::130
GoogleGenerativeAIError('Error embedding content: 504 Deadline Exceeded')
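
For reference, a hedged sketch of saving a Gemini key through the Cat's admin API, mirroring the PUT /llm/settings/<ConfigName> request visible in the log above. The Gemini config class name (assumed here to be "LLMGeminiChatConfig") and its exact fields may differ between Cat versions, which is the kind of mismatch that would produce a 400 Bad Request like the one logged.

```python
# Sketch only: upsert an LLM setting via the Cat's REST API.
# Endpoint pattern taken from the log above; config name and payload
# fields are assumptions and may not match v1.6.2 exactly.
import requests

BASE_URL = "http://localhost:1865"  # default Cheshire Cat port

payload = {
    "google_api_key": "YOUR_GEMINI_KEY",  # placeholder, not a real key
    "model": "gemini-pro",
}

resp = requests.put(f"{BASE_URL}/llm/settings/LLMGeminiChatConfig", json=payload)
print(resp.status_code, resp.json())
```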