**Closed** — spike-spiegel-21 closed this issue 1 month ago
When I try to use these configurations:
```python
from mem0 import Memory

config = {
    "llm": {
        "provider": "azure_openai",
        "config": {
            "model": "gpt4Omni",
            "temperature": 0.1,
            "max_tokens": 2000,
            "azure_kwargs": {
                "azure_deployment": "",
                "api_version": "",
                "azure_endpoint": "",
                "api_key": ""
            }
        }
    },
    "embedder": {
        "provider": "azure_openai",
        "config": {
            "model": "",
            "azure_kwargs": {
                "api_version": "",
                "azure_deployment": "",
                "azure_endpoint": "",
                "api_key": ""
            }
        }
    },
    "graph_store": {
        "provider": "",
        "config": {
            "url": "",
            "username": "neo4j",
            "password": ""
        }
    },
    "version": "v1.1"
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
```

(Note: the original embedder config listed `azure_endpoint` twice; the duplicate key is removed here.)
I get the following error:
```
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid function definition for 'search': unexpected parameter 'strict' parameter supplied.", 'type': 'invalid_request_error', 'param': None, 'code': None}}
```
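The 400 error indicates that the Azure OpenAI endpoint being used rejects the `strict` field inside function/tool definitions (a field accepted by newer OpenAI structured-output APIs). As a rough illustration of the kind of fix the linked PR likely applies, the sketch below strips the unsupported key from a tool schema before it is sent. The `strip_strict` helper and the sample tool definition are hypothetical, not part of mem0's actual API:

```python
# Hypothetical workaround sketch: some Azure OpenAI API versions reject the
# 'strict' field in function/tool definitions, so remove it before the call.
def strip_strict(tools):
    """Return a copy of the tool list with the 'strict' key removed
    from each tool's function schema."""
    cleaned = []
    for tool in tools:
        tool = dict(tool)                      # shallow copy of the tool entry
        fn = dict(tool.get("function", {}))    # copy the function schema
        fn.pop("strict", None)                 # drop the unsupported field
        tool["function"] = fn
        cleaned.append(tool)
    return cleaned

# Example tool definition resembling the 'search' function from the error:
tools = [{
    "type": "function",
    "function": {
        "name": "search",
        "strict": True,
        "parameters": {"type": "object", "properties": {}},
    },
}]

print(strip_strict(tools))
```

With the `strict` key removed, the same tool definition should pass validation on endpoints that do not recognize that field.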
Hi @spike-spiegel-21 let me check and get back to you on this.
Hi @prateekchhikara, I'm raising a PR for this shortly.
@spike-spiegel-21, the PR has been merged, so I'm closing this issue.