FlowiseAI / Flowise

Drag & drop UI to build your customized LLM flow
https://flowiseai.com
Apache License 2.0
30.3k stars 15.65k forks

[FEATURE] Add OPENAI_API_BASE to UI to enable usage oobabooga openai plugin #248

Closed TheMasterFX closed 10 months ago

TheMasterFX commented 1 year ago

Describe the feature you'd like

oobabooga's text-generation-webui has a plugin that emulates the OpenAI API, but the Flowise UI offers no way to set the OpenAI base URL. Setting OPENAI_API_BASE via .env doesn't seem to work either.

HenryHengZJ commented 1 year ago

Do you have the link to the plugin?

TheMasterFX commented 1 year ago

Do you have the link to the plugin?

It is built in already: https://github.com/oobabooga/text-generation-webui/tree/main/extensions/openai

When I use LangChain in Python I just have to set the OPENAI_API_KEY and OPENAI_API_BASE environment variables: OPENAI_API_KEY="dummy" and OPENAI_API_BASE="http://127.0.0.1:5001/v1"
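As a minimal sketch of the setup described above (the values are the ones from this thread; nothing Flowise-specific is assumed), the two variables can be set from Python before any OpenAI client is constructed:

```python
import os

# Sketch only: the same two environment variables the comment above sets.
os.environ["OPENAI_API_KEY"] = "dummy"  # oobabooga's openai plugin ignores the key
os.environ["OPENAI_API_BASE"] = "http://127.0.0.1:5001/v1"

# LangChain's OpenAI wrappers read these variables at construction time,
# so any client created afterwards talks to the local server instead.
print(os.environ["OPENAI_API_BASE"])
```

Because the variables are read at construction time, they must be set before the LLM node or chain is created, which is why a UI field (rather than only .env) matters here.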

heresandyboy commented 1 year ago

Instead of running oobabooga to serve your local LLM, you can use LocalAI. It gives you an OpenAI-compatible API for whatever model you choose to run. Set up LocalAI from the repo here: https://github.com/go-skynet/LocalAI

Then you can use the existing ChatLocalAI node in Flowise; see the example here: https://github.com/go-skynet/LocalAI/tree/master/examples/flowise

HenryHengZJ commented 1 year ago

@TheMasterFX with this PR merged (https://github.com/FlowiseAI/Flowise/pull/264) you can now specify a base path
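A rough sketch of what a configurable base path means in practice (the helper function is illustrative, not Flowise code): an OpenAI-compatible client simply prefixes every endpoint with the base path, so swapping it redirects all requests to the local server.

```python
# Hypothetical illustration of how an OpenAI-compatible client derives
# request URLs from a configurable base path.
DEFAULT_BASE = "https://api.openai.com/v1"
LOCAL_BASE = "http://127.0.0.1:5001/v1"  # oobabooga's openai extension

def chat_completions_url(base_path: str) -> str:
    # Join the base path with the endpoint, tolerating a trailing slash.
    return base_path.rstrip("/") + "/chat/completions"

print(chat_completions_url(DEFAULT_BASE))  # → https://api.openai.com/v1/chat/completions
print(chat_completions_url(LOCAL_BASE))    # → http://127.0.0.1:5001/v1/chat/completions
```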

TheMasterFX commented 1 year ago

Instead of running oobabooga to serve your local LLM, you can use LocalAI. It gives you an OpenAI-compatible API for whatever model you choose to run. Set up LocalAI from the repo here: https://github.com/go-skynet/LocalAI

As far as I can tell LocalAI doesn't seem to support GPTQ models (on GPU)

heresandyboy commented 1 year ago

As far as I can tell LocalAI doesn't seem to support GPTQ models (on GPU)

Looking at the README and build instructions on their GitHub, it looks like they have GPU support now, even if still experimental. I'll try it out myself soon.

Gavin-YYC commented 1 year ago

https://github.com/hwchase17/langchainjs/issues/1277

sam1am commented 1 year ago

I am attempting to use the LocalAI node with the oobabooga backend. Both are intended to work as drop-in OpenAI replacements, so in theory the LocalAI node should work with any OpenAI-compatible backend, right? Well, maybe not, because I can't get it working. It may be that the LocalAI node only needs a slight modification to support other backends (or I am doing something wrong).

TheMasterFX commented 1 year ago

Start oobabooga with:

python server.py --auto-devices --chat --wbits 4 --groupsize 128 --api --listen --extension openai

Then enter the API URL http://127.0.0.1:5001/v1 in the model node in the UI. However, LLMs might expect different prompt keywords like "### Instruction:" or "User:"

ill-yes commented 1 year ago

However, LLMs might expect different prompt keywords like "### Instruction:" or "User:"

Where do I find these keywords? Any template suggestions for some known LLMs? 🙏🏽

TheMasterFX commented 1 year ago

However, LLMs might expect different prompt keywords like "### Instruction:" or "User:"

Where do I find these keywords? Any template suggestions for some known LLMs? 🙏🏽

This might be a good source: https://github.com/oobabooga/text-generation-webui/tree/main/instruction-templates
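To make the keyword point concrete, here is an illustrative sketch of two common instruction-template shapes (the template strings are assumptions for illustration, not copied from the linked repo): the same question renders very differently depending on which template the model was trained with.

```python
# Illustrative only: two assumed instruction-template shapes showing why
# the prompt keywords differ between models.
TEMPLATES = {
    "alpaca": "### Instruction:\n{prompt}\n\n### Response:\n",
    "vicuna": "USER: {prompt}\nASSISTANT: ",
}

def render(template_name: str, prompt: str) -> str:
    # Fill the user's question into the chosen template.
    return TEMPLATES[template_name].format(prompt=prompt)

print(render("alpaca", "Summarize this issue."))
print(render("vicuna", "Summarize this issue."))
```

Sending a prompt in the wrong template is a common reason a local model "doesn't work as expected" behind an OpenAI-compatible API.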

ill-yes commented 1 year ago

This might be a good source: https://github.com/oobabooga/text-generation-webui/tree/main/instruction-templates

Thanks! Can you suggest any model + template combinations for Flowise? I've tried a bunch of LLMs but none of them work as expected.