Closed · Rivelyn closed this issue 1 year ago
Are you doing something like export OPENAI_API_BASE=http://127.0.0.1:1234 ?
Yes, though on Windows, so it's set OPEN_API_BASE=http://localhost:1234 and set BACKEND_TYPE=lmstudio
Well, after decades of being in IT, the first thing I should have tried was a restart... I restarted my conda env and it seems to be working now.
I am using LM Studio as my backend. I have tried several different configurations, all producing the same error with minor variations:
\MemGPT\memgpt\local_llm\lmstudio\api.py:22 in get_lmstudio_completion

  19   request = settings
  20   request["prompt"] = prompt
  21
❱ 22   if not HOST.startswith(("http://", "https://")):
  23       raise ValueError(f"Provided OPENAI_API_BASE value ({HOST}) must begin with http:
  24
  25   try:
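For reference, the check that raises here is just a scheme prefix test on the value read from OPENAI_API_BASE. A minimal sketch of that validation (the function name and surrounding setup are illustrative, not MemGPT's exact code beyond the lines in the traceback):

```python
import os

def validate_host(host: str) -> str:
    # Mirrors line 22 of the traceback: the value must carry an explicit
    # scheme, so a bare "localhost:1234" or an unset/empty variable fails.
    if not host.startswith(("http://", "https://")):
        raise ValueError(
            f"Provided OPENAI_API_BASE value ({host}) must begin with http:// or https://"
        )
    return host

# e.g. validate_host(os.getenv("OPENAI_API_BASE", ""))
```

So if the error keeps firing even after the variable looks correct, it usually means the process never saw the updated environment, which is consistent with the restart fixing it.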