Closed: PiDanShouRouZhouXD closed this issue 2 months ago
Should be supported shortly
Wrapping up, doing final testing
Just a heads up, you need to update OPENAI_MODELS in your .env, since the OpenAI models API request is not including it right now:
OPENAI_MODELS=o1-preview,o1-mini,gpt-4o #, ...rest
Hi @danny-avila, I updated my librechat to 0.7.5-rc2 and added the o1-preview and o1-mini models to the openai models list as you suggested. When I try to use either of the o1 models in my librechat, I get the following error. The API key I’m using is with a tier 5 org. Any ideas how to debug?
Something went wrong. Here's the specific error message we encountered: Failed to send message. HTTP 404 - { "error": { "message": "This is a chat model and not supported in the v1/completions endpoint. Did you mean to use v1/chat/completions?", "type": "invalid_request_error", "param": "model", "code": null } }
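For reference, that 404 means the request was sent to the legacy v1/completions endpoint, while o1-preview and o1-mini are only served from v1/chat/completions. Below is a minimal sketch of a direct call against the chat completions endpoint, assuming the official openai Python SDK and an OPENAI_API_KEY in the environment; this is an illustration of the endpoint the error points to, not LibreChat's own code.

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# o1 models must go through the chat completions endpoint,
# not the legacy completions endpoint that produced the 404 above.
response = client.chat.completions.create(
    model="o1-preview",
    messages=[{"role": "user", "content": "Summarize the o1 API limits."}],
)
print(response.choices[0].message.content)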
@rhiever it's not part of -rc2. You need to update to librechat-dev:latest to enable it.
@fuegovic how do I go back from rc2 to latest? Sorry if the question is dumb 😩
The default is "-dev:latest" and will currently show as rc2 in the UI. Look at the docker override file; that's where you would've made the change.
What features would you like to see added?
support OpenAI o1-preview, o1-mini
More details
The o1-preview and o1-mini models do not support streaming output or system prompts, so they currently cannot be used.
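For context, a client talking to these models at launch has to drop both features: no stream=True, no system-role message (any instructions get folded into the user message), and max_completion_tokens in place of max_tokens. A rough sketch of that request shape, assuming the official openai Python SDK; the instruction-folding helper is illustrative only, not how LibreChat implements it.

from openai import OpenAI

client = OpenAI()

def o1_request(instructions: str, user_text: str) -> str:
    # o1-preview/o1-mini reject the "system" role, so system-style
    # instructions are prepended to the user message instead.
    merged = f"{instructions}\n\n{user_text}"
    response = client.chat.completions.create(
        model="o1-mini",
        messages=[{"role": "user", "content": merged}],
        # o1 models take max_completion_tokens rather than max_tokens.
        max_completion_tokens=1024,
        # Note: no stream=True here; streaming was not supported at launch.
    )
    return response.choices[0].message.content

print(o1_request("Answer concisely.", "What is 17 * 24?"))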
Which components are impacted by your request?
General
Pictures
No response