I don't believe the OpenAI engine has explicit support for the gpt-4 model. Using gpt-4 results in the following error log and exception:
OpenAi run response: {
"error": {
"message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?",
"type": "invalid_request_error",
"param": "model",
"code": null
}
}
/Users/derek/projects/boxcars/lib/boxcars/engine/openai.rb:79:in `run': undefined method `map' for nil:NilClass (NoMethodError)
answer = response["choices"].map { |c| c.dig("message", "content") || c["text"] }.join("\n").strip
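The NoMethodError happens because the error payload above has no "choices" key, so response["choices"] is nil. A minimal defensive sketch (extract_answer is a hypothetical helper, not an existing Boxcars method) that surfaces the API error instead of crashing:

```ruby
# Hypothetical guard around the failing line: if the response carries an
# error payload instead of choices, raise a readable error rather than
# calling `map` on nil.
def extract_answer(response)
  choices = response["choices"]
  if choices.nil?
    raise "OpenAI error: #{response.dig('error', 'message') || 'unknown'}"
  end
  choices.map { |c| c.dig("message", "content") || c["text"] }.join("\n").strip
end

# Normal completion response
puts extract_answer("choices" => [{ "message" => { "content" => "hello" } }])
# Error payload like the one logged above now raises with the API's message
```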
If I make the following hack in OpenAi#client I get a valid response:
if params[:model] == "gpt-3.5-turbo" || params[:model] == "gpt-4"
That may be a good starting point for adding support.
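One way to generalize the hack, rather than enumerating exact model names, is to match on model-name prefixes so variants like gpt-4-0314 are also routed to the chat endpoint. This is a hedged sketch (chat_model? and CHAT_MODEL_PREFIXES are hypothetical names, not part of the Boxcars codebase):

```ruby
# Prefix match instead of exact equality, so dated model variants
# (e.g. "gpt-3.5-turbo-0301") are treated as chat models too.
CHAT_MODEL_PREFIXES = ["gpt-3.5-turbo", "gpt-4"].freeze

def chat_model?(model)
  CHAT_MODEL_PREFIXES.any? { |prefix| model.to_s.start_with?(prefix) }
end

puts chat_model?("gpt-4")            # true
puts chat_model?("gpt-4-0314")       # true
puts chat_model?("text-davinci-003") # false
```

Inside OpenAi#client this predicate could replace the equality check, deciding whether to call the chat completions endpoint or the plain completions endpoint.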