mem0ai / mem0

The Memory layer for your AI apps
https://mem0.ai
Apache License 2.0
22.04k stars 2.02k forks

openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable #1534

Closed anstonjie closed 2 weeks ago

anstonjie commented 2 months ago

🐛 Describe the bug

openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

skstanwar commented 2 months ago

Hello @anstonjie, could you share an error screenshot? I think you didn't set the OpenAI key in your environment; you can also pass it via the CLI.

rshah713 commented 2 months ago

+1

Trying out the example listed in the docs: https://docs.mem0.ai/llms#togetherai

import os
from mem0 import Memory
from dotenv import load_dotenv

load_dotenv()

config = {
    "llm": {
        "provider": "together",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})

I have TOGETHER_API_KEY set in my .env, but I'm getting:

Traceback (most recent call last):
  File "/Users/secureconnectionforguest/Desktop/Python/NJITScheduleSniffer/app/llm_memory.py", line 39, in <module>
    m = Memory.from_config(config)
  File "/usr/local/lib/python3.10/site-packages/mem0/memory/main.py", line 103, in from_config
    return cls(config)
  File "/usr/local/lib/python3.10/site-packages/mem0/memory/main.py", line 69, in __init__
    self.embedding_model = EmbedderFactory.create(self.config.embedder.provider)
  File "/usr/local/lib/python3.10/site-packages/mem0/utils/factory.py", line 43, in create
    embedder_instance = load_class(class_type)()
  File "/usr/local/lib/python3.10/site-packages/mem0/embeddings/openai.py", line 8, in __init__
    self.client = OpenAI()
  File "/usr/local/lib/python3.10/site-packages/openai/_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
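The traceback shows the failure comes from mem0's embedder, not the LLM: `EmbedderFactory.create` falls back to the OpenAI embedder, whose constructor (`OpenAI()`) reads `OPENAI_API_KEY` from the environment regardless of the configured `llm` provider. A fail-fast guard can surface the real cause before `Memory.from_config` is called (a sketch; `require_env` is a hypothetical helper, not part of mem0):

```python
import os

def require_env(name: str) -> str:
    """Return the value of an environment variable, or fail with a clear message."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(
            f"{name} is not set; mem0's default embedder is OpenAI, "
            "so it is required unless you configure a different embedder."
        )
    return value

# Example: check before building Memory so the error points at the real cause.
# require_env("OPENAI_API_KEY")
```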

jalihui commented 2 months ago

I tried to run this code: https://docs.mem0.ai/llms#google-ai

import os
from mem0 import Memory

os.environ["GEMINI_API_KEY"] = "your-api-key"

config = {
    "llm": {
        "provider": "litellm",
        "config": {
            "model": "gemini/gemini-pro",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})

I have a GEMINI_API_KEY, but I got this error:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

wangmingjunabc commented 2 months ago

Using GEMINI_API_KEY, the same error occurs!

SpaceLearner commented 2 months ago

Using Azure_API_KEY, same error!

amostsai commented 2 months ago

Using GROQ_API_KEY, same error!

kmitul commented 2 months ago

Hey @amostsai @SpaceLearner @wangmingjunabc @jalihui @rshah713 @anstonjie The error occurs because the default embedding model is OpenAI.
Try using a different embedder. PR #1627 might be helpful for configuring a different embedding model.

rshah713 commented 2 months ago

@kmitul thanks for the PR. The docs will definitely need to be updated so users can use the different supported models properly. In my example I was trying to use provider: together; what embedding model would I need to use with this provider? Can you give examples?

kmitul commented 2 months ago

Hey @rshah713 You can choose any embedding model and pair it with any LLM, as they can be selected independently. Ideally, you would use an embedding model from TogetherAI. However, I haven't added support for TogetherAI embedding models yet, but I plan to do so soon. In the meantime, you can use any huggingface model. For example:

config = {
    "llm": {
        "provider": "together",
        "config": {
            "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
            "temperature": 0.2,
            "max_tokens": 1500,
        }
    },
    "embedder": {
        "provider": "huggingface",
        "config": {
            "model": "multi-qa-MiniLM-L6-cos-v1"
        }
    }
}

Dev-Khant commented 1 month ago

Hey @anstonjie @rshah713 @wangmingjunabc @SpaceLearner @amostsai Sorry for the inconvenience; the docs will be updated soon to make this clearer.

We currently use OpenAI as the embedding model, which is why it needs an OpenAI key. We will soon add an option to choose your own embedding model.
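To illustrate that default (a hypothetical sketch, not mem0's actual code): the embedder provider falls back to openai whenever the config omits an embedder section, which is why a together or litellm llm config alone still triggers the OpenAI key check.

```python
# Assumed default, mirroring the behavior reported in this thread.
DEFAULT_EMBEDDER_PROVIDER = "openai"

def resolve_embedder_provider(config: dict) -> str:
    """Pick the embedder provider from config, falling back to the default."""
    return config.get("embedder", {}).get("provider", DEFAULT_EMBEDDER_PROVIDER)

# An llm-only config, like the ones in this thread, resolves to "openai":
llm_only = {
    "llm": {
        "provider": "together",
        "config": {"model": "mistralai/Mixtral-8x7B-Instruct-v0.1"},
    }
}
# resolve_embedder_provider(llm_only) -> "openai"
```

Adding an explicit "embedder" section, as in kmitul's example above, overrides the fallback and avoids the OpenAI key requirement.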

dosnshcu commented 4 weeks ago

Using Azure_API_KEY, same error!

uahmad235 commented 2 weeks ago

@Dev-Khant any update on this? The docs for Azure OpenAI are completely outdated; see config.llm.provider here.

Dev-Khant commented 2 weeks ago

Hey @uahmad235 Yes, this is fixed now. Please check: https://docs.mem0.ai/components/llms/models/azure_openai. Let me know if you still face any issues.

Dev-Khant commented 2 weeks ago

Closing as the issue is solved. Please feel free to reopen if you face the same problem. Thanks.