crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

Running crew with Crew parameter memory=True and a local Ollama LLM raises openai.AuthenticationError #685

Closed: goran-ristic-dev closed this issue 1 month ago

goran-ristic-dev commented 4 months ago

Example code:

```python
from crewai import Agent, Task, Crew
from langchain_community.chat_models import ChatOllama

phi3_ollama = ChatOllama(model="phi3")
llama3_ollama = ChatOllama(model="llama3")

manager_agent = Agent(role="manager", goal="any goal",
                      backstory="You are experienced in whatever",
                      llm=phi3_ollama, verbose=True)

manager_task = Task(agent=manager_agent, description="do what ever is needed",
                    expected_output="whatever you output is ok",
                    llm=llama3_ollama, verbose=True)

crew = Crew(tasks=[manager_task], agents=[manager_agent], full_output=True, memory=True)

crew.kickoff({'task': "design simple webpage"})
```

Running this code produces an openai.AuthenticationError. The cause appears to be that, when the ChromaDB collection is queried for memory, it tries to use the OpenAIEmbeddingFunction.

```
Traceback (most recent call last):
  File "C:\Users\user\PycharmProjects\CrewAI\main.py", line 29, in <module>
    crew.kickoff({'task': "design simple webpage"})
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\crew.py", line 264, in kickoff
    result = self._run_sequential_process()
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\crew.py", line 305, in _run_sequential_process
    output = task.execute(context=task_output)
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\task.py", line 183, in execute
    result = self._execute(
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\task.py", line 192, in _execute
    result = agent.execute_task(
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\agent.py", line 222, in execute_task
    memory = contextual_memory.build_context_for_task(task, context)
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\memory\contextual\contextual_memory.py", line 24, in build_context_for_task
    context.append(self._fetch_stm_context(query))
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\memory\contextual\contextual_memory.py", line 33, in _fetch_stm_context
    stm_results = self.stm.search(query)
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\memory\short_term\short_term_memory.py", line 23, in search
    return self.storage.search(query=query, score_threshold=score_threshold)  # type: ignore # BUG? The reference is to the parent class, but the parent class does not have this parameters
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\crewai\memory\storage\rag_storage.py", line 95, in search
    else self.app.search(query, limit)
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\embedchain\embedchain.py", line 635, in search
    return [{"context": c[0], "metadata": c[1]} for c in self.db.query(**params)]
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\embedchain\vectordb\chroma.py", line 220, in query
    result = self.collection.query(
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\chromadb\api\models\Collection.py", line 327, in query
    valid_query_embeddings = self._embed(input=valid_query_texts)
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\chromadb\api\models\Collection.py", line 633, in _embed
    return self._embedding_function(input=input)
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\chromadb\api\types.py", line 193, in __call__
    result = call(self, input)
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\chromadb\utils\embedding_functions.py", line 201, in __call__
    embeddings = self._client.create(
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\openai\resources\embeddings.py", line 114, in create
    return self._post(
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\openai\_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\openai\_base_client.py", line 921, in request
    return self._request(
  File "C:\Users\user\PycharmProjects\CrewAI.venv\lib\site-packages\openai\_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: fake. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}}
```
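For illustration, the last frames of the traceback can be reproduced in isolation with ChromaDB's OpenAI embedding function. This is a minimal sketch under the assumption that embedchain's default embedder boils down to a call like this; the model name is illustrative, not taken from the traceback:

```python
# Minimal sketch reproducing the failing embedding call in isolation.
from chromadb.utils import embedding_functions

openai_ef = embedding_functions.OpenAIEmbeddingFunction(
    api_key="fake",                       # same placeholder key seen in the 401 error above
    model_name="text-embedding-ada-002",  # illustrative model name (assumption)
)

# With no valid OpenAI key configured, this raises openai.AuthenticationError (401),
# matching the traceback, regardless of which local LLM the agents use.
openai_ef(["design simple webpage"])
```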

goran-ristic-dev commented 4 months ago

In some other issues I found a hint for a solution. Workaround: if you pass the undocumented (not in the docstring) parameter "embedder" to the Crew class, the issue with memory=True disappears. Here is the parameter that needs to be added so that memory=True works with a local Ollama model:

embedder={ "provider": "ollama", "config": { "model": '', } } In code it should look like this:

crew = Crew(tasks=[manager_task], agents=[manager_agent], full_output=True, memory=True, embedder={ "provider": "ollama", "config": { "model": '<your_embedding_model>', } } )
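Put together with the original repro, a minimal end-to-end sketch of the workaround could look like this. The embedding model name "nomic-embed-text" is an assumption; substitute whatever embedding model you have pulled with `ollama pull`:

```python
from crewai import Agent, Task, Crew
from langchain_community.chat_models import ChatOllama

phi3_ollama = ChatOllama(model="phi3")

manager_agent = Agent(role="manager", goal="any goal",
                      backstory="You are experienced in whatever",
                      llm=phi3_ollama, verbose=True)

manager_task = Task(agent=manager_agent, description="do whatever is needed",
                    expected_output="whatever you output is ok")

# Routing memory embeddings to Ollama avoids the default OpenAI embedding
# function and therefore the 401 error.
crew = Crew(
    tasks=[manager_task],
    agents=[manager_agent],
    full_output=True,
    memory=True,
    embedder={
        "provider": "ollama",
        "config": {"model": "nomic-embed-text"},  # assumption: any locally pulled embedding model works
    },
)

crew.kickoff({'task': "design simple webpage"})
```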

theCyberTech commented 4 months ago

When you use memory, you are also using embeddings.

Currently, embeddings are provided by embedchain, and whilst the latest version supports Ollama, the version currently installed by crewai does not.
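To check which embedchain version crewai pulled into your environment, something like this works (standard library only; compare the printed version against the embedchain release that added Ollama embedder support):

```python
# Print the installed embedchain version.
from importlib.metadata import version

print(version("embedchain"))
```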

I would suggest one of the following:

1. Upgrade embedchain with `pip install --upgrade embedchain`, or
2. Use Hugging Face embeddings (see the sketch below), or
3. Use LM Studio with the OpenAI constructor.
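For option 2, a rough sketch of the Hugging Face route through the same `embedder` parameter, under the assumption that the config is passed through to embedchain; the provider string and model name are illustrative and not verified against the installed embedchain version:

```python
# Continuing from the repro above (manager_agent / manager_task defined there).
crew = Crew(
    tasks=[manager_task],
    agents=[manager_agent],
    full_output=True,
    memory=True,
    embedder={
        "provider": "huggingface",  # assumption: provider name as used by embedchain
        "config": {"model": "sentence-transformers/all-MiniLM-L6-v2"},  # illustrative model
    },
)
```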

github-actions[bot] commented 2 months ago

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

github-actions[bot] commented 1 month ago

This issue was closed because it has been stalled for 5 days with no activity.