oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

Connecting to an OpenAI Model (Not Imitating the OpenAI API) #2349

chigkim closed this issue 1 year ago

chigkim commented 1 year ago

Can we have a way to connect to an OpenAI model? There's an OpenAI extension, but that makes oobabooga function as a fake OpenAI API rather than connect to the real one. This might have already been requested, but I couldn't find it when I searched. Thanks!
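To be clear about the distinction: the existing extensions/openai extension stands up a local server that imitates the OpenAI API, so OpenAI-style clients can talk to a locally hosted model, roughly like the sketch below (this uses the pre-1.0 `openai` package interface; the port and model name are illustrative and may not match a given setup):

```python
# Minimal sketch of the current extensions/openai direction: an OpenAI client
# is pointed at a *local* server that imitates the OpenAI API.
import openai

openai.api_key = "dummy"                      # the local server does not check keys
openai.api_base = "http://127.0.0.1:5001/v1"  # illustrative port for the local server

response = openai.ChatCompletion.create(
    model="local-model",                       # illustrative; mapped to whatever is loaded locally
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response["choices"][0]["message"]["content"])
```

What I'm asking for is the opposite direction: the webui itself talking to OpenAI's hosted models.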

olinorwell commented 1 year ago

I agree, this would be very useful. It would allow far easier comparisons between the OpenAI models via their API and locally hosted models.

matatonic commented 1 year ago

Do you mean to have extensions/openai act as a proxy to the real OpenAI API?

chigkim commented 1 year ago

Yes. Both TavernAI and KoboldAI let you enter your OpenAI API key to chat with OpenAI's GPT models.

matatonic commented 1 year ago

You want to use an OpenAI-API-compatible proxy to access ... the OpenAI API? Why not just use the OpenAI API directly? What's the point here?

chigkim commented 1 year ago

The same reasons people want to use oobabooga instead of inference.py for local models: a good web UI, character management, context manipulation, and expandability with extensions for things like text to speech, speech to text, and so on.

matatonic commented 1 year ago

I see, so it's not about extensions/openai acting as a proxy, but about the webui supporting the real OpenAI API as a backend. Got it, makes sense.
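Roughly speaking, that would mean adding a generation backend that forwards the chat history to the real API instead of to a local model. A minimal sketch with the pre-1.0 `openai` Python package is below; the function name, defaults, and parameters are purely illustrative and do not exist in the webui:

```python
# Hypothetical backend adapter: the webui forwards the chat history to the
# real OpenAI API instead of running a local model. Names are illustrative.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]  # user-supplied key

def generate_with_openai(messages, model="gpt-3.5-turbo", **params):
    """Send the accumulated chat messages to OpenAI and return the reply text."""
    response = openai.ChatCompletion.create(model=model, messages=messages, **params)
    return response["choices"][0]["message"]["content"]

print(generate_with_openai([{"role": "user", "content": "Say hi."}]))
```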