mem0ai / mem0

The Memory layer for your AI apps
https://mem0.ai
Apache License 2.0
23.12k stars 2.14k forks

Groq not working #1576

Closed: 4thfever closed this issue 4 months ago

4thfever commented 4 months ago

🐛 Describe the bug

I'm using the script from the docs to implement a Groq-based memory.

import os
from mem0 import Memory

os.environ["GROQ_API_KEY"] = "xxx"

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})

However, I got the error:

Traceback (most recent call last):
  File "c:/Users/wangx/Desktop/xxx/test.py", line 17, in <module>
    m = Memory.from_config(config)
  File "C:\Users\wangx\AppData\Local\Programs\Python\Python38\lib\site-packages\mem0\memory\main.py", line 103, in from_config
    return cls(config)
  File "C:\Users\wangx\AppData\Local\Programs\Python\Python38\lib\site-packages\mem0\memory\main.py", line 69, in __init__
    self.embedding_model = EmbedderFactory.create(self.config.embedder.provider)
  File "C:\Users\wangx\AppData\Local\Programs\Python\Python38\lib\site-packages\mem0\utils\factory.py", line 43, in create
    embedder_instance = load_class(class_type)()
  File "C:\Users\wangx\AppData\Local\Programs\Python\Python38\lib\site-packages\mem0\embeddings\openai.py", line 8, in __init__
    self.client = OpenAI()
  File "C:\Users\wangx\AppData\Local\Programs\Python\Python38\lib\site-packages\openai\_client.py", line 105, in __init__
    raise OpenAIError(
openai.OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable

It seems like Groq is not supported? The OpenAI embedder is being called automatically.

kmitul commented 4 months ago

Hi @4thfever The issue is occurring because the default embedder is set to openai when none is provided in the config, and it requires an API key to be set. You can try using a different embedder in the config (e.g. huggingface, ollama), as in the sketch below. Please refer to #1574 before changing the embedder to avoid exceptions.
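For example, a config that keeps the Groq LLM but swaps in a local Ollama embedder might look like the sketch below. This is only a sketch: the "embedder" block is assumed to follow the same provider/config shape as the "llm" block, and the "nomic-embed-text" model name is an assumption, so check the docs for your mem0 version for the exact fields and supported models.

import os
from mem0 import Memory

os.environ["GROQ_API_KEY"] = "xxx"

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    },
    # Assumed shape: override the default openai embedder with a local one
    "embedder": {
        "provider": "ollama",
        "config": {
            "model": "nomic-embed-text",  # assumed model name; substitute any embedding model served by Ollama
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})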

4thfever commented 4 months ago

Hi, thanks for the reply. I'll give it a try.

Dev-Khant commented 4 months ago

@4thfever Here we use the OpenAI embedding model; that's why it's asking for an OpenAI key. I'll update the docs to make this clearer.
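In other words, keeping the default embedder just requires setting the OpenAI key alongside the Groq key. A minimal sketch (key values are placeholders):

import os
from mem0 import Memory

os.environ["GROQ_API_KEY"] = "xxx"    # used by the groq llm provider
os.environ["OPENAI_API_KEY"] = "xxx"  # required by the default openai embedder

config = {
    "llm": {
        "provider": "groq",
        "config": {
            "model": "mixtral-8x7b-32768",
            "temperature": 0.1,
            "max_tokens": 1000,
        }
    }
}

m = Memory.from_config(config)
m.add("Likes to play cricket on weekends", user_id="alice", metadata={"category": "hobbies"})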

Rajesh9998 commented 4 months ago

> @4thfever Here we use the OpenAI embedding model; that's why it's asking for an OpenAI key. I'll update the docs to make this clearer.

Can we use other embedding models like "models/text-embedding-004" instead of OpenAI?

Dev-Khant commented 4 months ago

@Rajesh9998 We will soon add support for using other embedding models.