Closed JamesClarke7283 closed 1 month ago
Hello, you can use OpenAI through the LiteLLM proxy provider.
Just add a new provider with the appropriate settings:
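The exact settings were not included above. As a rough sketch, a LiteLLM proxy config that exposes an OpenAI model typically looks something like the following (the model name and environment variable here are illustrative, not taken from this thread):

```yaml
# Example LiteLLM proxy config (config.yaml) -- illustrative only
model_list:
  - model_name: gpt-4o            # name the proxy will expose
    litellm_params:
      model: openai/gpt-4o        # upstream provider/model
      api_key: os.environ/OPENAI_API_KEY  # read key from the environment
```

You would then run the proxy against this config and point the app's provider settings at the proxy's local endpoint.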
Oh, thanks, I will keep that in mind. I did not see OpenAI mentioned anywhere in the docs, which is why I could not figure it out. Thanks!
Is your feature request related to a problem? Please describe. It's annoying that locally hosted models are not as good at code as GPT-4o. That's to be expected, but I would like to see an OpenAI provider added.
Describe the solution you'd like In the dropdown list of providers, I would like to see an OpenAI provider, and a dropdown to select an OpenAI model.
Describe alternatives you've considered Alternatively, allow us to create our own providers easily without code, though that might be difficult to implement.
Additional context N/A