Closed tkafka closed 1 year ago
I noticed that when there is an error with the OpenAI API, the response from `createChatCompletion` looks like this:

Another example of an error:

```
{
  error: {
    message: "This model's maximum context length is 4097 tokens. However, your messages resulted in 4144 tokens. Please reduce the length of the messages.",
    type: "invalid_request_error",
    param: "messages",
    code: "context_length_exceeded",
  },
}
```

I'm on the fence about adding types. I think we should instead throw the error with the information.

will do
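Throwing the error with its information could look like the sketch below. The interface and helper names here are illustrative assumptions, not part of the openai library's API; the shape just mirrors the error body shown above.

```typescript
// Hypothetical shape of the error body shown above (not an official type).
interface OpenAIErrorBody {
  error: {
    message: string;
    type: string;
    param: string | null;
    code: string | null;
  };
}

// Custom error that preserves the structured fields, so callers can
// inspect e.g. `code` instead of parsing the message string.
class OpenAIApiError extends Error {
  constructor(
    message: string,
    public readonly type: string,
    public readonly param: string | null,
    public readonly code: string | null,
  ) {
    super(message);
    this.name = "OpenAIApiError";
  }
}

// Convert an error response body into a thrown error (illustrative helper).
function throwFromResponse(body: OpenAIErrorBody): never {
  const { message, type, param, code } = body.error;
  throw new OpenAIApiError(message, type, param, code);
}
```

Callers could then `catch` the error and branch on fields like `err.code === "context_length_exceeded"` to decide whether to retry with shorter messages.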