Closed wwjCMP closed 6 months ago
Unfortunately, I do not think you can use Google's models via API service (I am assuming you are based in a Mandarin-speaking country from the prompt) since their API is restricted to specific countries at this time.
Google docs: https://support.google.com/gemini/answer/13575153?hl=en#
Is this the case for you? If not, then there very well may be a bug, but I am currently unable to replicate it with a valid Gemini key.
I'm not sure what the reason is, but there are two points. First, I can use gemini-1.0-pro correctly via every other service. Second, I have issues with the local Ollama service: AnythingLLM cannot pull the local model, so I cannot select a model.
Are you using gemini-1.0-pro through a cloud-based service? If so, that is why.
Second, for Ollama: are you pointing at 127.0.0.1, and do you have `ollama serve` running?
https://github.com/Mintplex-Labs/anything-llm/blob/master/server/utils/AiProviders/ollama/README.md
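A quick way to check the second point is to probe the Ollama HTTP endpoint before configuring AnythingLLM. This sketch assumes the default bind address and port (`127.0.0.1:11434`); adjust `OLLAMA_URL` if you started `ollama serve` with a different `OLLAMA_HOST`.

```shell
# Probe the local Ollama server; /api/tags lists the pulled models.
OLLAMA_URL="http://127.0.0.1:11434"
if curl -s --max-time 2 "$OLLAMA_URL/api/tags" > /dev/null; then
  echo "Ollama is reachable at $OLLAMA_URL"
else
  echo "Ollama is NOT reachable at $OLLAMA_URL (try running: ollama serve)"
fi
```

If the probe fails, AnythingLLM will not be able to enumerate models either, so no model will appear in the selector.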
Everything is running normally now, thank you.
How are you running AnythingLLM?
AnythingLLM desktop app
What happened?
Are there known steps to reproduce?
No document, just chat.