crewAIInc / crewAI

Framework for orchestrating role-playing, autonomous AI agents. By fostering collaborative intelligence, CrewAI empowers agents to work together seamlessly, tackling complex tasks.
https://crewai.com
MIT License

[FEATURE] CrewAI memory support for watsonx.ai #1288

Open ticlazau opened 2 months ago

ticlazau commented 2 months ago

Feature Area

Agent capabilities

Is your feature request related to an existing bug? Please link it here.

Feature request related to CrewAI memory - https://docs.crewai.com/core-concepts/Memory/

Describe the solution you'd like

Hello,

Currently, CrewAI's memory features support only a few embedding providers, such as OpenAI, Microsoft, and Google. I am using llama3.1-405b-instruct from IBM watsonx.ai via the LangChain integration; however, an embedder model from IBM watsonx.ai can't be added the way the documentation describes: https://docs.crewai.com/core-concepts/Memory/#using-openai-embeddings-already-default

embedder={
    "provider": "IBM",
    "config": {
        "model": "ibm/slate-125m-english-rtrvr",
        "url": "https://us-south.ml.cloud.ibm.com/",
        "project_id": "xxxxxxxxxxxxxxxxxxxxxxxx"
    }
}
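For comparison, the embedder config for the providers that are already supported follows this shape (the values below are the OpenAI example from the linked docs; an "IBM"/watsonx provider would need the same structure to be accepted, which is the gap this request asks to close):

```python
# Documented embedder config shape (OpenAI example from the CrewAI memory docs);
# a watsonx.ai provider would need crewai to accept the same structure.
openai_embedder = {
    "provider": "openai",
    "config": {
        "model": "text-embedding-3-small",
    },
}
```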

Describe alternatives you've considered

No response

Additional context

Can this feature be supported in some other way in CrewAI to enable the memory?

Thank you, FM

Willingness to Contribute

No, I'm just suggesting the idea

joaomdmoura commented 1 month ago

🔥 Nice request, def looking into it

ticlazau commented 1 month ago

Thank you @joaomdmoura

github-actions[bot] commented 3 weeks ago

This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.

ticlazau commented 3 weeks ago

Do you have any plans to move this forward?

SalvatoreCognetta commented 2 weeks ago

Hello @ticlazau, I was looking into the code and documentation, and it seems that crewai uses chromadb embedding functions as embedders. The "problem" therefore lies in chromadb: you could open a feature request there, and once it is implemented crewai could release a new version with the updated chromadb. Is that correct @joaomdmoura?
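For context, a chromadb embedding function is essentially a callable that maps a batch of documents to one embedding vector per document. A rough, hypothetical sketch of a watsonx-backed one (the `embed_fn` plumbing below is a placeholder stub, not a real watsonx.ai API):

```python
# Hypothetical sketch of a chromadb-style embedding function for watsonx.ai.
# chromadb expects a callable that takes a list of documents and returns one
# embedding (list of floats) per document; the actual watsonx call is stubbed.
class WatsonxEmbeddingFunction:
    def __init__(self, embed_fn):
        # embed_fn: any function str -> list[float]
        # (in practice this would call the watsonx.ai embeddings endpoint)
        self._embed_fn = embed_fn

    def __call__(self, input):
        # chromadb invokes this with a batch of documents
        return [self._embed_fn(doc) for doc in input]

# Demo with a trivial stand-in embedder: 3-dim vector of character stats
fake_embed = lambda text: [float(len(text)), float(text.count(" ")), 1.0]
ef = WatsonxEmbeddingFunction(fake_embed)
vectors = ef(["hello world", "crewai"])
```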

SalvatoreCognetta commented 2 weeks ago

A possible workaround, avoiding both a PR on chroma and one on crewai, could be to use the create_langchain_embedding function available in chromadb since v0.5.0. That way we could create a WatsonxEmbeddings instance with langchain_ibm and pass the resulting function directly to the crew.

If we upgrade chromadb to v0.5.0 along with embedchain (which currently requires chromadb <0.5.0), this could be a possible solution:

from crewai import Crew, Agent, Task, Process
from langchain_ibm.embeddings import WatsonxEmbeddings
from chromadb.utils.embedding_functions import create_langchain_embedding

# LangChain embeddings backed by the watsonx.ai embedder model
wx_emb_langchain = WatsonxEmbeddings(
    apikey="xxx",
    model_id="ibm/slate-125m-english-rtrvr",
    project_id="xxx",
    url="https://xx-yy.ml.cloud.ibm.com",
)

# Wrap the LangChain embeddings as a chromadb embedding function
wx_chroma_embedder = create_langchain_embedding(wx_emb_langchain)

my_crew = Crew(
    agents=[...],
    tasks=[...],
    process=Process.sequential,
    memory=True,
    verbose=True,
    embedder=wx_chroma_embedder,
)

joaomdmoura commented 2 weeks ago

Finally getting to prioritize this :D cc @bhancockio sorry for the wait

paarttipaabhalaji commented 5 days ago

Hi @joaomdmoura. I have one query regarding this memory. I am working with a crewai Flow that contains 'n' crews, and the Flow orchestrates the whole pipeline. If I want to enable memory, only Crew() has an option to configure it, so I need to traverse the 'n' crews and configure memory 'n' times. If Flow() had an option to configure memory once, shared by all of its crews, that would be much easier, right?

Please share whether configuring memory at the Flow() level is possible.
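Until Flow() grows such an option, one workaround is to centralize the shared settings yourself. A small hypothetical helper (none of these names exist in crewai; it only merges plain kwargs dicts before the Crew objects are constructed):

```python
# Hypothetical helper, not part of crewai: merge one shared memory/embedder
# config into every crew's constructor kwargs so it is declared only once.
def with_shared_memory(crew_kwargs_list, embedder_config):
    shared = {"memory": True, "embedder": embedder_config}
    # later entries of the merge win, so the shared settings override any
    # per-crew memory/embedder values already present
    return [{**kwargs, **shared} for kwargs in crew_kwargs_list]

# Demo with stand-in crew definitions (agents/tasks elided)
crews = with_shared_memory(
    [{"agents": [], "tasks": []}, {"agents": [], "tasks": [], "memory": False}],
    {"provider": "openai", "config": {"model": "text-embedding-3-small"}},
)
```

Each resulting dict could then be passed as `Crew(**kwargs)` inside the flow, so enabling memory for all 'n' crews stays a one-line change.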