cpacker / MemGPT

Letta (fka MemGPT) is a framework for creating stateful LLM services.
https://letta.com
Apache License 2.0
11.85k stars 1.29k forks

Unable to use gemini models #1486

Open ndisalvio3 opened 3 months ago

ndisalvio3 commented 3 months ago

**Describe the bug**
Upon attempting to use any model other than gemini-pro, it throws an error.



```
Traceback (most recent call last):
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/main.py", line 415, in run_agent_loop
    new_messages, user_message, skip_next_user_input = process_agent_step(user_message, no_verify)
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/main.py", line 384, in process_agent_step
    new_messages, heartbeat_request, function_failed, token_warning, tokens_accumulated = memgpt_agent.step(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 787, in step
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 702, in step
    response = self._get_ai_reply(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 420, in _get_ai_reply
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/agent.py", line 395, in _get_ai_reply
    response = create(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 133, in wrapper
    raise e
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 106, in wrapper
    return func(*args, **kwargs)
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/llm_api_tools.py", line 273, in create
    return google_ai_chat_completions_request(
  File "/home/ndisalvio/.local/lib/python3.10/site-packages/memgpt/llm_api/google_ai.py", line 426, in google_ai_chat_completions_request
    assert model in SUPPORTED_MODELS, f"Model '{model}' not in supported models: {', '.join(SUPPORTED_MODELS)}"
AssertionError: Model 'gemini-1.5-pro' not in supported models: gemini-pro
```
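For context, the failure is a hard-coded allow-list check, not an API rejection. The sketch below reproduces the gating logic implied by the traceback; the names mirror `memgpt/llm_api/google_ai.py`, but the actual `SUPPORTED_MODELS` contents are assumed from the error text (only `gemini-pro` is listed there):

```python
# Sketch of the allow-list check seen in the traceback.
# SUPPORTED_MODELS contents are inferred from the AssertionError message.
SUPPORTED_MODELS = ["gemini-pro"]

def check_model(model: str) -> None:
    # Mirrors: assert model in SUPPORTED_MODELS, f"Model '{model}' not in ..."
    if model not in SUPPORTED_MODELS:
        raise AssertionError(
            f"Model '{model}' not in supported models: {', '.join(SUPPORTED_MODELS)}"
        )

check_model("gemini-pro")  # passes silently

try:
    check_model("gemini-1.5-pro")
except AssertionError as e:
    print(e)  # Model 'gemini-1.5-pro' not in supported models: gemini-pro
```

So any model string other than `gemini-pro` fails before a request is even sent, regardless of whether the account has access to it.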

**Additional context**
I have a paid plan.

**MemGPT Config**
Please attach your `~/.memgpt/config` file or copy-paste it below.

```
[defaults]
preset = memgpt_chat
persona = sam_pov
human = basic

[model]
model = gemini-1.5-pro
model_endpoint_type = google_ai
context_window = 2097152

[embedding]
embedding_endpoint_type = openai
embedding_endpoint = https://api.openai.com/v1
embedding_model = text-embedding-ada-002
embedding_dim = 1536
embedding_chunk_size = 300

[archival_storage]
type = chroma
path = /home/ndisalvio/.memgpt/chroma

[recall_storage]
type = sqlite
path = /home/ndisalvio/.memgpt

[metadata_storage]
type = sqlite
path = /home/ndisalvio/.memgpt

[version]
memgpt_version = 0.3.18

[client]
anon_clientid = 00000000-0000-0000-0000-000000000000
```
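Until `gemini-1.5-pro` is added to the allow-list, one possible stop-gap is to point the `[model]` section back at `gemini-pro`, the only model the error message reports as supported. The `context_window` value below is a placeholder assumption for illustration, not a value from this thread; check the model's actual limit before using it:

```
[model]
model = gemini-pro
model_endpoint_type = google_ai
# context_window is an assumed placeholder; verify against the model's real limit
context_window = 30720
```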

---
MisileLab commented 3 months ago

gemini 1.5 pro isn't supported right now