I've been trying to use llama3-8b (both locally and with Groq), but I can't seem to get my `WebsiteSearchTool()` to work.
I checked the previous issues raised regarding tool configuration and made the following adjustments.
Added `from langchain_community.embeddings import OllamaEmbeddings` to my `main.py` script.
Added an embedder to the crew:

```python
embedder = {
    "provider": "ollama",
    "config": {
        "model": "llama3-8b-8192",
    },
}
```
I also configured the tool itself:

```python
website_search_tool = WebsiteSearchTool(
    config=dict(
        llm=dict(
            provider="ollama",
            config=dict(
                model="llama3-8b-8192",
            ),
        ),
        embedder=dict(
            provider="ollama",
            config=dict(
                model="llama3-8b-8192",
            ),
        ),
    )
)
```
And I updated my toml to pull in crewai with the tools extra:

```toml
crewai = { version = "^0.30.0rc5", extras = ["tools"] }
```
Yet, I'm still unable to run my tool properly. Here is the error message I get:
```
groq.NotFoundError: Error code: 404 - {'error': {'message': 'The model `llama-3-8b-8192` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'code': 'model_not_found'}}
```
--> I've already checked that my access to the Groq API works, and I've already downloaded the model locally.
Any thoughts on how to resolve that?