letta-ai / letta

Letta (formerly MemGPT) is a framework for creating LLM services with memory.
https://letta.com
Apache License 2.0
13.11k stars 1.44k forks

Unable to use martian router #1491

Open BaLaurent opened 5 months ago

BaLaurent commented 5 months ago

Describe the bug When trying to use the Martian router, you get an error because the router doesn't return a list of available models.

Please describe your setup

Screenshots (error screenshot attached)

Additional context Martian router is, as its name implies, a router, which has the benefit of being able to route across multiple models based on congestion. This means the user should be able to provide one or more models to choose from. Also, Martian has two modes, gateway and router, which work differently, so it might cause issues later if this isn't accounted for while solving this issue.

MemGPT Config

[defaults]
preset = memgpt_chat
persona = sofia
human = Laurent

[model]
model = meta-llama/Meta-Llama-3-70B-Instruct
model_endpoint = https://api.endpoints.anyscale.com/v1
model_endpoint_type = openai
context_window = 8192

[embedding]
embedding_endpoint_type = hugging-face
embedding_endpoint = https://embeddings.memgpt.ai
embedding_model = BAAI/bge-large-en-v1.5
embedding_dim = 1024
embedding_chunk_size = 300

[archival_storage]
type = chroma
path = C:\Users\Octop\.memgpt\chroma

[recall_storage]
type = sqlite
path = C:\Users\Octop\.memgpt

[metadata_storage]
type = sqlite
path = C:\Users\Octop\.memgpt

[version]
memgpt_version = 0.3.17

[client]
anon_clientid = 00000000-0000-0000-0000-000000000000
sarahwooders commented 5 months ago

As a workaround, you can just skip the memgpt configure step and directly edit your ~/.memgpt/config file to point to the right endpoint and maybe put in null or some dummy value for the model field. If that doesn't work, let us know what the error is.
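Concretely, the `[model]` section of `~/.memgpt/config` could be edited along these lines (the endpoint URL below is a placeholder, and the dummy model value is unverified, so substitute the actual Martian endpoint and whatever model identifier the router accepts):

```ini
[model]
# Placeholder values for illustration; replace with your Martian endpoint.
model = router
model_endpoint = https://<martian-router-endpoint>/v1
model_endpoint_type = openai
context_window = 8192
```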

BaLaurent commented 5 months ago

Oh yeah, I didn't think of that.