Open · anifort opened 2 months ago
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
still active feature request
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 5 days.
keep alive
Feature Area
Core functionality
Is your feature request related to an existing bug? Please link it here.
Enabling memory does not work when a CrewAI application is deployed on a service that supports auto-scaling, because the memory data storage files are not present when new instances are created.
Describe the solution you'd like
Support client/server database integration and/or vector storage integration where the storage is external to the server hosting the CrewAI application. Ideally, the solution could use the main serverless cloud provider offerings, such as Vertex AI Vector Search (https://cloud.google.com/vertex-ai/docs/vector-search/quickstart).
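To make the request concrete, below is a minimal sketch of what a pluggable external memory backend could look like. The `ExternalMemoryStorage` interface and the `ManagedVectorSearchStorage` class are hypothetical names used purely for illustration; they are not part of the current CrewAI API, and the remote calls are placeholders rather than real SDK calls.

```python
# Hypothetical sketch only: none of these classes exist in CrewAI today.
# The idea is that memory reads/writes go through a pluggable interface,
# so the backing store lives outside the instance (e.g. a managed vector
# search service) and survives auto-scaling and instance restarts.
from abc import ABC, abstractmethod
from typing import Any


class ExternalMemoryStorage(ABC):
    """Pluggable storage backend for CrewAI memory (hypothetical)."""

    @abstractmethod
    def save(self, value: Any, metadata: dict) -> None:
        """Persist a memory record in the external store."""

    @abstractmethod
    def search(self, query: str, limit: int = 5) -> list[dict]:
        """Return the records most relevant to the query."""


class ManagedVectorSearchStorage(ExternalMemoryStorage):
    """Example backend that would delegate to a serverless vector search
    service (e.g. Vertex AI Vector Search). The method bodies below are
    placeholders, not real SDK calls."""

    def __init__(self, index_endpoint: str):
        # The endpoint identifies a remote, instance-independent index.
        self.index_endpoint = index_endpoint

    def save(self, value: Any, metadata: dict) -> None:
        # Placeholder: embed `value` and upsert it into the remote index.
        raise NotImplementedError

    def search(self, query: str, limit: int = 5) -> list[dict]:
        # Placeholder: embed `query` and query the remote index.
        raise NotImplementedError
```

With something like this, memory configuration could accept a storage object instead of assuming local files, so every auto-scaled instance reads and writes the same shared store.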
Describe alternatives you've considered
Sharing a drive among multiple instances, but I am not sure how well this would scale, and it is not a clean implementation. Additionally, if the instance hosting the application dies, the files are lost as well (in a serverless environment that storage is not persistent).
Additional context
No response
Willingness to Contribute
I can test the feature once it's implemented