Some background: in Asian countries where OpenAI is restricted, we cannot call the OpenAI endpoint directly and have to go through a relay service. The base_url parameter was added so that users in China can still use the mem0 team's products.
This PR adds the OPENAI_API_BASE parameter to the config and LLM wrapper layers, so users can set the API base through an environment variable.
Type of change
I'm not sure whether this counts as a new feature. If the categorization is not right, please tell me in the comments and I will fix it. Thanks 🙏
[x] New feature (non-breaking change which adds functionality)
How Has This Been Tested?
[x] Test Script (please provide)
Because I am new to Python, I could not extend the existing unit tests, but I have included the script I used to test locally. If you are willing, you can use it to verify my changes. Thank you very much.
```python
import os
from mem0 import Memory

# Alternatively, set the key and base URL via environment variables:
# os.environ["OPENAI_API_KEY"] = "sk-xxx"
# os.environ["OPENAI_API_BASE"] = "https://api.xxxx.xxx/v1"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "openai_base_url": "https://api.xxxx.xxx/v1",
            "api_key": "sk-xxx",
            "model": "gpt-4o-mini",
            "temperature": 0.2,
            "max_tokens": 1500,
        },
    },
    "embedder": {
        "provider": "openai",
        "config": {
            "model": "text-embedding-3-small",
            "openai_base_url": "https://api.xxxx.xxx/v1",
            "api_key": "sk-xxx",
        },
    },
}

m = Memory.from_config(config)
m.enable_graph = True
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})
related_memories = m.search(query="What are Alice's hobbies?", user_id="alice")
print(related_memories)
```
Some tentative suggestions
Regarding the OpenAI-related design, I think the config could be optimized further. With the current design, I had to modify the embedder config separately to support this feature. Could we share a common base config for OpenAI? That might be cleaner.
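To illustrate the suggestion, here is a hypothetical sketch of a shared OpenAI base config using dataclass inheritance. All class and field names are made up for illustration and are not mem0's actual API; the idea is just that `api_key` and `openai_base_url` would only need to be defined once:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class OpenAIBaseConfig:
    # Fields shared by every OpenAI-backed component (illustrative names).
    api_key: Optional[str] = None
    openai_base_url: Optional[str] = None

@dataclass
class OpenAILLMConfig(OpenAIBaseConfig):
    # LLM-specific settings inherit the shared connection fields above.
    model: str = "gpt-4o-mini"
    temperature: float = 0.2
    max_tokens: int = 1500

@dataclass
class OpenAIEmbedderConfig(OpenAIBaseConfig):
    # Embedder reuses the same base; no duplicated base-URL plumbing.
    model: str = "text-embedding-3-small"
```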
Finally, thanks to the community for its contributions. Great product!