Ismael opened this issue 2 months ago
Welcome @Ismael
Which LLM would you like to use?
You can have a look at the relevant keys to set within the gpt_researcher/llm_provider folder.
You'll want to replace the : with = in your .env
For example, a minimal .env looks like this:
TAVILY_API_KEY=_____________
OPENAI_API_KEY=_____________
Replace the blanks with your actual keys.
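(Side note on the : vs = point: if you're copying values out of a docker-compose environment: block, those use YAML colon syntax, roughly like the sketch below, where the service name is just a placeholder. The .env file uses the KEY=value form shown above.)

services:
  gpt-researcher:
    environment:
      TAVILY_API_KEY: _____________
      OPENAI_API_KEY: _____________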
I'd like to use groq + llama 3. I think the issue is that it's trying to use some embeddings, and that's not available on groq.
Have a look at the env examples here
Search the codebase for the alternative env variables related to groq embedding models.
I.e. what are the alternatives for:
OLLAMA_EMBEDDING_MODEL=all-minilm:22m
EMBEDDING_PROVIDER=ollama
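For instance, since (as you say) groq doesn't offer embeddings, you'd most likely pair a groq LLM with a separate embedding provider. A rough .env sketch; the groq-related variable names and model names here are my assumption, so please double-check them against the config code:

LLM_PROVIDER=groq
GROQ_API_KEY=_____________
FAST_LLM_MODEL=llama3-8b-8192
SMART_LLM_MODEL=llama3-70b-8192
EMBEDDING_PROVIDER=ollama
OLLAMA_EMBEDDING_MODEL=all-minilm:22m
TAVILY_API_KEY=_____________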
I'm using docker-compose with this env. I did try setting LLM_PROVIDER: groq, but that failed with an error saying the OpenAI key wasn't set.
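For reference, the environment block I'm passing looks roughly like this; the service name and every key other than LLM_PROVIDER are placeholders rather than my real values:

services:
  gpt-researcher:
    environment:
      LLM_PROVIDER: groq
      GROQ_API_KEY: _____________
      TAVILY_API_KEY: _____________

My guess is the embeddings still default to openai, which would explain why it complains about the OpenAI key even though the LLM provider is groq.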