Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[BUG]: Error fetching from gemini #1082

Closed wwjCMP closed 6 months ago

wwjCMP commented 6 months ago

How are you running AnythingLLM?

AnythingLLM desktop app

What happened?

(screenshot attached: Snipaste_2024-04-11_10-10-04)

Are there known steps to reproduce?

No document, just chat. (screenshot attached: Snipaste_2024-04-11_10-12-10)

timothycarambat commented 6 months ago

Unfortunately, I do not think you can use Google's models via its API service (I am assuming from the prompt that you are based in a Mandarin-speaking country), since the API is restricted to specific countries at this time.

Google docs: https://support.google.com/gemini/answer/13575153?hl=en#

Is this the case for you? If not, then there very well may be a bug, but I am currently unable to replicate it with a valid Gemini key.

https://github.com/google/generative-ai-js/issues/29
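A quick way to separate a region/key problem from an AnythingLLM bug is to call the Gemini REST API directly, outside the app. The sketch below (stdlib only; the `check_gemini_key` helper name is my own, not part of AnythingLLM) hits the public `ListModels` endpoint of the Generative Language API: a 4xx response here usually indicates an invalid key or an unsupported region rather than a client bug.

```python
import json
import urllib.request
import urllib.error

GEMINI_BASE = "https://generativelanguage.googleapis.com/v1beta"

def models_url(api_key):
    # ListModels endpoint of the Generative Language REST API.
    return f"{GEMINI_BASE}/models?key={api_key}"

def check_gemini_key(api_key):
    """Return the model names visible to this key, or the HTTP error body
    if the request is rejected (invalid key / region-restricted)."""
    try:
        with urllib.request.urlopen(models_url(api_key), timeout=10) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except urllib.error.HTTPError as e:
        # Google returns a JSON error body explaining the rejection.
        return e.read().decode()
```

If this call fails with a location-related error, the same failure will surface inside AnythingLLM regardless of configuration.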

wwjCMP commented 6 months ago

I'm not sure what the reason is, but there are two points. First, I can use gemini-1.0-pro correctly with any other service. Second, I also have issues with the local Ollama service: AnythingLLM cannot pull the list of local models, so I cannot select one.

timothycarambat commented 6 months ago

Are you using gemini-1.0-pro through a cloud-based service? If so, that is why.

Second, for Ollama: are you using 127.0.0.1, and do you have `ollama serve` running? https://github.com/Mintplex-Labs/anything-llm/blob/master/server/utils/AiProviders/ollama/README.md
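To check the Ollama side independently of AnythingLLM, you can query Ollama's `/api/tags` endpoint (the same list-models endpoint clients use). This is a minimal sketch assuming the default Ollama port 11434; the `list_ollama_models` helper is illustrative, not part of either project.

```python
import json
import urllib.request
import urllib.error

def list_ollama_models(base_url="http://127.0.0.1:11434"):
    """Query Ollama's /api/tags endpoint and return the model names,
    or None if the server is unreachable (e.g. `ollama serve` not running)."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        return None

models = list_ollama_models()
if models is None:
    print("Ollama unreachable -- is `ollama serve` running on 127.0.0.1:11434?")
else:
    print("Models AnythingLLM should be able to list:", models)
```

If this returns model names but AnythingLLM still shows none, point AnythingLLM at the same base URL (127.0.0.1, not localhost inside a container) per the README linked above.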

wwjCMP commented 6 months ago

Everything is running normally now, thank you.