joaomdmoura / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

Configure tool using Ollama and Groq #733

Open vladeziegler opened 1 month ago

vladeziegler commented 1 month ago

I've been trying to use llama3-8b (both locally and with Groq), but I can't seem to get my `WebsiteSearchTool()` to work.

I checked the previous issues raised regarding tool configuration and made the following adjustments.

  1. Added `from langchain_community.embeddings import OllamaEmbeddings` to my main.py script

  2. Added an embedder to the crew: `embedder={ "provider": "ollama", "config": { "model": "llama3-8b-8192" } }`

  3. Configured the tool itself: `website_search_tool = WebsiteSearchTool(config=dict(llm=dict(provider="ollama", config=dict(model="llama3-8b-8192")), embedder=dict(provider="ollama", config=dict(model="llama3-8b-8192"))))`

  4. Updated my pyproject.toml to pull in the `tools` extra: `crewai = { version = "^0.30.0rc5", extras = ["tools"] }`
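For reference, here is a sketch of how step 3 could point entirely at local Ollama. This is a guess at the intended setup, not a confirmed fix: it assumes the model was pulled under Ollama's tag `llama3` (the `-8b-8192` suffix is Groq's naming, which Ollama won't recognize) and that an embedding model such as `nomic-embed-text` has also been pulled, since the embedder slot expects an embedding model rather than a chat model.

```python
# Hypothetical config: Ollama for both the LLM and the embedder.
# Assumes `ollama pull llama3` and `ollama pull nomic-embed-text`
# have been run locally; model tags are assumptions, not verified.
from crewai_tools import WebsiteSearchTool

website_search_tool = WebsiteSearchTool(
    config=dict(
        llm=dict(
            provider="ollama",
            config=dict(model="llama3"),  # Ollama tag, not Groq's "llama3-8b-8192"
        ),
        embedder=dict(
            provider="ollama",
            config=dict(model="nomic-embed-text"),  # embedding model, not a chat model
        ),
    )
)
```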

Yet I'm still unable to run my tool properly. Here is the error message I get: `groq.NotFoundError: Error code: 404 - {'error': {'message': 'The model llama-3-8b-8192 does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'code': 'model_not_found'}}`

--> I've already checked that my Groq API access works, and I've downloaded the model locally.
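One way to check both things at once is to ask the API which model ids the key can actually see; the model string in the tool config has to match one of those ids exactly. This sketch assumes Groq's OpenAI-compatible endpoint and a `GROQ_API_KEY` environment variable:

```shell
# List the model ids this key has access to. Compare against the id in the
# config: the 404 reports "llama-3-8b-8192", which is not the same string
# as "llama3-8b-8192".
curl -s https://api.groq.com/openai/v1/models \
  -H "Authorization: Bearer $GROQ_API_KEY"
```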

Any thoughts on how to resolve that?

gadgethome commented 1 month ago

This may help with connecting to Groq:

https://discordapp.com/channels/1192246288507474000/1242937559277899909/1242980525631602778