Open auterak opened 1 week ago
Description

Letta seems to be creating requests for LM Studio with `context_overflow_policy` set to `0`:

`"lmstudio": { "context_overflow_policy": 0 },`

Expected values seem to be `'stopAtLimit' | 'truncateMiddle' | 'rollingWindow'`, as seen in the error from LM Studio:

`2024-09-24 20:14:35 [ERROR] Field with key llm.prediction.contextOverflowPolicy does not satisfy the schema: [ { "expected": "'stopAtLimit' | 'truncateMiddle' | 'rollingWindow'", "received": "number", "code": "invalid_type", "path": [], "message": "Expected 'stopAtLimit' | 'truncateMiddle' | 'rollingWindow', received number" } ]. Error Data: n/a, Additional Data: n/a`
Setup
- Running Letta via Docker following instructions at https://docs.letta.com/docker.
- LM Studio 0.3.2, server set to:
  - stop at limit
  - empty template
  - context 8192
There is a script called `api.py` at `anaconda3/envs/myenv/lib/site-packages/memgpt/local_llm/lmstudio/api.py`, and it has a few lines that say this:
# In MemGPT we handle this ourselves, so this should be disabled
# "context_overflow_policy": 0,
"lmstudio": {"context_overflow_policy": 0}, # 0 = stop at limit
I deleted those lines, saved the file, and it started working.
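Instead of deleting the lines, another option would be to send a schema-valid string value. A minimal sketch, assuming the settings block is built as a plain dict somewhere in `api.py` (the helper name and surrounding structure here are hypothetical; only the `context_overflow_policy` key and the three allowed string values come from the LM Studio error above):

```python
# Allowed values per the LM Studio schema error shown in the report.
VALID_POLICIES = {"stopAtLimit", "truncateMiddle", "rollingWindow"}

def build_lmstudio_settings(policy: str = "stopAtLimit") -> dict:
    """Return an 'lmstudio' settings block with context_overflow_policy
    as one of the expected strings, rather than the integer 0."""
    if policy not in VALID_POLICIES:
        raise ValueError(
            f"context_overflow_policy must be one of {sorted(VALID_POLICIES)}, got {policy!r}"
        )
    return {"lmstudio": {"context_overflow_policy": policy}}
```

Since MemGPT/Letta handles context overflow itself (per the comment in `api.py`), `"stopAtLimit"` seems like the closest match to the intent of the old `0` value.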
Yeah, I found that, but then I had some problems getting the Docker environment to work in development mode (some config files were missing) and haven't had time to play with it further.