Closed by bs7280 5 months ago
This is an unfortunate bug in 0.2.11 where the key doesn't get set during configuration.
You can set it manually by opening the ~/.memgpt/config file
(you can open the folder containing this file with memgpt folder).
Then add the chunk below, which includes an [openai] section, to the file:
[model]
model = gpt-4
model_endpoint = https://api.openai.com/v1
model_endpoint_type = openai
context_window = 8192
[openai]
key = sk-....
[embedding]
embedding_endpoint_type = openai
embedding_endpoint = https://api.openai.com/v1
embedding_dim = 1536
embedding_chunk_size = 300
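If you'd rather apply the fix programmatically than edit the file by hand, a minimal sketch using Python's standard configparser follows. The path and section/option names are taken from the snippet above; reading the key from an OPENAI_API_KEY environment variable is an assumption for illustration.

```python
import configparser
import os

# Path used by memgpt for its config file (per the instructions above)
config_path = os.path.expanduser("~/.memgpt/config")
os.makedirs(os.path.dirname(config_path), exist_ok=True)

config = configparser.ConfigParser()
config.read(config_path)  # a missing file is silently skipped

# Ensure the [openai] section exists and set the key.
# Reading from OPENAI_API_KEY is an assumption; substitute your own key.
if not config.has_section("openai"):
    config.add_section("openai")
config.set("openai", "key", os.environ.get("OPENAI_API_KEY", "sk-..."))

with open(config_path, "w") as f:
    config.write(f)
```

Re-running memgpt after this should pick up the key from the [openai] section.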
You may also be able to set it properly with memgpt quickstart --backend openai
(but I'm not completely sure).
This will be fixed in the 0.2.12 update.
Auth error trying to make the initial call to OpenAI. First time trying to use OpenAI as the LLM backend. It appears the API key is not being sent.
I set a breakpoint before line 191 in openai_tools.py (right before the POST request) and noticed that the auth token value is None:
headers = {'Content-Type': 'application/json', 'Authorization': 'Bearer None'}
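The literal string 'Bearer None' is what you get when a missing key is interpolated into the header; a minimal demonstration (variable names are illustrative, not memgpt's actual code):

```python
# The key was never set during configuration, so it arrives as None
api_key = None

# Interpolating None into the header string stringifies it
headers = {
    "Content-Type": "application/json",
    "Authorization": f"Bearer {api_key}",
}
print(headers["Authorization"])  # prints "Bearer None"
```

The OpenAI API then rejects the request with an authentication error, matching the behavior reported here.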
As a temporary workaround I added this to the start of the openai_chat_completions_request(url, api_key, data) method, and it works.
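The exact snippet isn't included in the report; a hypothetical fallback with the described effect, reading the key from the same ~/.memgpt/config file when the caller passes None, might look like this (the resolve_api_key helper name is invented for illustration):

```python
import configparser
import os

def resolve_api_key(api_key):
    """Hypothetical fallback: if no key was passed in, try to read it
    from the [openai] section of ~/.memgpt/config."""
    if api_key is None:
        config = configparser.ConfigParser()
        config.read(os.path.expanduser("~/.memgpt/config"))
        api_key = config.get("openai", "key", fallback=None)
    return api_key
```

A key supplied by the caller is passed through unchanged; only a missing key triggers the config lookup.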
Please describe your setup
- memgpt version? (e.g. "0.2.11"): installed with pip install pymemgpt
- How are you running memgpt? Terminal (zsh)

Screenshots: see logs below.

Config:

Error message:
If you're not using OpenAI, please provide additional information on your local LLM setup:
Edit: removed details about LocalLLMs