mem0ai / mem0

The Memory layer for your AI apps
https://mem0.ai
Apache License 2.0

[openai_api_base support] - ft/Added openai OPENAI_API_BASE llm config support #1737

Closed ParseDark closed 2 weeks ago

ParseDark commented 3 weeks ago

Description

Some background: coming from an Asian country with OpenAI restrictions, we cannot use the official OpenAI endpoint directly, so we have to go through a relay service. This PR adds a base_url parameter so that users in China can also use the mem0 team's products.

The current PR adds an OPENAI_API_BASE parameter to the config and LLM wrapper layers, so that users can set the API base via an environment variable.
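For reference, a minimal sketch of how such an env-var fallback can work. The function name and exact precedence order here are illustrative, not the actual mem0 internals:

```python
import os

def resolve_openai_base_url(explicit_base_url=None):
    """Pick the OpenAI base URL: an explicitly configured URL wins,
    then the OPENAI_API_BASE environment variable, then the official
    endpoint as the final fallback."""
    return (
        explicit_base_url
        or os.environ.get("OPENAI_API_BASE")
        or "https://api.openai.com/v1"
    )

# With OPENAI_API_BASE set, the relay endpoint is picked up automatically:
os.environ["OPENAI_API_BASE"] = "https://proxy.example.com/v1"
print(resolve_openai_base_url())  # -> https://proxy.example.com/v1
```

An explicit `base_url` in the config should still override the environment variable, so existing setups are unaffected.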

Type of change

I'm not sure whether this counts as a new feature. If that categorization isn't right, please tell me in the comments and I will change it. Thanks 🙏


How Has This Been Tested?


Because I'm new to Python, I couldn't add unit tests for this change. Instead, I've included the script I used to test locally below; if you're willing, you can use it to verify my changes. Thank you very much.

import os
from mem0 import Memory

# Alternatively, set the credentials via environment variables:
# os.environ["OPENAI_API_KEY"] = "sk-xxx"
# os.environ["OPENAI_API_BASE"] = "https://api.xxxx.xxx/v1"

config = {
    "llm": {
        "provider": "openai",
        "config": {
            "openai_base_url": "https://api.xxxx.xxx/v1",
            "api_key": "sk-xxx",
            "model": "gpt-4o-mini",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    },
    "embedder": {
        "provider": "openai",
        "config": {
            "model": "text-embedding-3-small",
            "openai_base_url": "https://api.xxxx.xxx/v1",
            "api_key": "sk-xxx"
        }
    }
}

m = Memory.from_config(config)
m.enable_graph = True
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})

related_memories = m.search(query="What are Alice's hobbies?", user_id="alice")
print(related_memories)

Some tentative suggestions

Regarding the OpenAI-related design, I think the config could be optimized further. For example, with the existing design I had to modify the embedding config separately to support this feature. Could the OpenAI configs share a common base config? That might be cleaner. Something like:

Finally, thanks to the community for their contributions. Great product!

interface OpenaiBaseConfig {
  api_base: string;
  api_key: string;
}

interface OpenaiChatConfig extends OpenaiBaseConfig {
  top_k: float;
  temperature: float;
  model: string;
}

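In Python terms, the shared-base idea above could look like the following sketch. The class names, fields, and defaults are illustrative only, not the actual mem0 config classes:

```python
from dataclasses import dataclass

@dataclass
class OpenAIBaseConfig:
    # Fields shared by every OpenAI-backed component.
    api_key: str
    api_base: str = "https://api.openai.com/v1"

@dataclass
class OpenAIChatConfig(OpenAIBaseConfig):
    # LLM-specific settings layered on top of the shared base.
    model: str = "gpt-4o-mini"
    temperature: float = 0.2

@dataclass
class OpenAIEmbedderConfig(OpenAIBaseConfig):
    # Embedder-specific settings layered on top of the same base.
    model: str = "text-embedding-3-small"

# Both components inherit the same base fields, so a relay URL is set in
# one shape instead of being duplicated across unrelated config dicts:
chat = OpenAIChatConfig(api_key="sk-xxx", api_base="https://api.xxxx.xxx/v1")
print(chat.api_base)  # -> https://api.xxxx.xxx/v1
```

This way the LLM and embedder configs can't drift apart on how the base URL and key are spelled.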

Checklist:

Maintainer Checklist

CLAassistant commented 3 weeks ago

CLA assistant check
All committers have signed the CLA.

Dev-Khant commented 2 weeks ago

LGTM!!!

ParseDark commented 2 weeks ago

LGTM!!!

NP! And willing to work with you.