Open nickknyc opened 3 weeks ago
Hi @nickknyc, I do not think this was an error; it was supposed to be a warning to upgrade the api_version in the config. Could you please share a screenshot of the error so that I can pick it up from there? Thanks
@prateekchhikara thanks for the quick followup. I figured it out by looking at the graph docs (I am really psyched for the graph feature!!!!)
Did not understand how to set the api version in the config. I just did the config below and the warning is gone.
```python
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    },
    "version": "v1.1"
}
```
Yep, setting the version to "v1.1" in the config will do the trick. The default version is "v1.0", which shows the warning shared above.
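To make the docs gap concrete, here is a minimal sketch of how the config resolves the warning. It assumes the mem0ai package and its `Memory.from_config` constructor; the actual `Memory` calls are commented out because they need the package installed and an OpenAI API key.

```python
# Pin the API output format to "v1.1" so get_all() stops emitting the
# deprecation warning. Assumes mem0ai's config schema as shown in this thread.
config = {
    "llm": {
        "provider": "openai",
        "config": {"model": "gpt-4o", "temperature": 0.2, "max_tokens": 1500},
    },
    # Top-level "version" selects the output format; the default "v1.0"
    # triggers the deprecation warning.
    "version": "v1.1",
}

# from mem0 import Memory          # requires mem0ai and OPENAI_API_KEY
# m = Memory.from_config(config)
# print(m.get_all())               # no deprecation warning with "v1.1"
```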
I think this just needs to be covered in the docs and the examples.
🐛 Describe the bug
I am getting the following error when calling get_all. I have no idea where to set the API version.
The current get_all API output format is deprecated. To use the latest format, set `api_version='v1.1'`. The current format will be removed in mem0ai 1.1.0 and later versions.

Triggered by: `print(m.get_all())`