Closed · chigkim closed this issue 1 year ago
I agree, this would be very useful. It would allow far easier comparisons between the OpenAI models via their API and locally hosted models.
Do you mean to have extensions/openai act as a proxy to the real OpenAI API?
Yes. Both TavernAI and KoboldAI let you enter your OpenAI API key to chat with OpenAI's GPT models.
You want to use an OpenAI API compatible proxy to access ... OpenAI API? Why not just use OpenAI API? What's the point here?
The same reasons people want to use oobabooga instead of inference.py for local models: a good WebUI, character management, context manipulation, and expandability with extensions for things like text to speech, speech to text, and so on.
I see, so not for the extensions/openai to proxy, but for the webui to support real openai API as a backend. Got it, makes sense.
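For context on why this is a small ask: the whole point of an OpenAI-compatible API is that the same request works against either backend, so supporting the real OpenAI API mostly means making the base URL and key configurable. A minimal sketch (the local port and paths are assumptions based on the extension's typical defaults; check the extension's console output for the actual address):

```python
def chat_request(base_url, api_key, message):
    """Build the URL, headers, and JSON body for a chat completion
    request; the payload shape is identical for both backends."""
    return {
        "url": f"{base_url}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "json": {
            "model": "gpt-3.5-turbo",
            "messages": [{"role": "user", "content": message}],
        },
    }

# Real OpenAI API (key elided):
real = chat_request("https://api.openai.com/v1", "sk-...", "Hello")

# Local oobabooga running extensions/openai (port is an assumption):
local = chat_request("http://127.0.0.1:5001/v1", "dummy-key", "Hello")

# Only the URL and key differ; the request body is identical.
assert real["json"] == local["json"]
```

So a backend toggle in the webui could reuse the existing request-building code and just swap the endpoint and key.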
Can we have a way to connect to OpenAI models? There's an OpenAI extension, but that makes oobabooga function as a fake OpenAI API. This might have already been requested, but I couldn't find it when I searched. Thanks!