oobabooga / text-generation-webui

A Gradio web UI for Large Language Models.
GNU Affero General Public License v3.0

Support for plugins (same as extension??) + documentation #699

Closed elephantpanda closed 11 months ago

elephantpanda commented 1 year ago

ChatGPT will have plugins such as web browsing, calculators, etc.

I'm not sure if this is the same thing as the existing "extensions"?

e.g. for a calculator plugin, the AI output would be something like:

The sum of 2345 and 1232142 is:
###Plugin:Calc:
2345+1232142
###

The plugin would then take that output, delete the text between the ### markers, and replace it with:

###Result:
1234487
###

Then it would keep going:

`1234487. Anything else I could help you with?`
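
A minimal sketch of that replacement step in plain Python (the `###Plugin:Calc:` markers are just the format from this example, and the function name is made up; none of this is an existing webui API):

```python
import re

# Matches a block like "###Plugin:Calc:\n2345+1232142\n###" in the model output.
CALC_BLOCK = re.compile(r"###Plugin:Calc:\s*(.+?)\s*###", re.DOTALL)

def run_calc_plugin(model_output: str) -> str:
    """Replace every calculator block with its evaluated result."""
    def evaluate(match):
        expression = match.group(1)
        # Only allow bare arithmetic; a real plugin would want a proper parser.
        if not re.fullmatch(r"[\d\s+\-*/().]+", expression):
            return match.group(0)  # leave anything else untouched
        result = eval(expression, {"__builtins__": {}}, {})
        return f"###Result:\n{result}\n###"
    return CALC_BLOCK.sub(evaluate, model_output)

# run_calc_plugin("The sum of 2345 and 1232142 is:\n###Plugin:Calc:\n2345+1232142\n###")
# -> "The sum of 2345 and 1232142 is:\n###Result:\n1234487\n###"
```

The rewritten text would then be fed back to the model so it can continue generating, as in the example above.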


Another plugin might just parse JavaScript text, e.g. if the output is:

The product of 23 and 46 is <script>return 23*46</script>

And so on.

It would be useful if someone could write a calculator plugin and document it just to show it working.

Other plugins would be:

tensiondriven commented 1 year ago

I want something similar; the only thing I see missing right now is the ability to submit a prompt back to the model and have the UI update. As an added complication, it appears that each session stores its own state, but there is also some shared model state, so currently it isn't possible to "respond" to a specific session, at least as I understand it. This could be fixed, though; I'm fairly sure.

Actually, looking at the extension documentation, it looks like extensions have access to the shared state, which I think is like the god object of the app. If that's true, then it may already be possible to do what you're suggesting, at least for a single-session instance. (It isn't terribly clear that this is possible; the first time I read these docs, I thought it wasn't, because I'm so used to seeing internal API hooks for everything that an addon/extension/plugin is allowed to do. But if we have access to global state, I think that means we can call functions that will update the model! We might need some way to trigger the client to refresh; ideally that would be through access to each session instance. This may be in the shared object, I'm not sure.)
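
If extensions really do have that access, the simplest version of the calculator idea might not even need it: the output_modifier hook listed in the extensions docs lets an extension rewrite the reply before it reaches the UI. A rough sketch (the folder layout is the usual one for extensions, but the hook signature has changed between versions, so treat this as illustrative rather than the exact current API):

```python
# extensions/calc/script.py -- hypothetical extension name
import re

CALC_BLOCK = re.compile(r"###Plugin:Calc:\s*(.+?)\s*###", re.DOTALL)

def output_modifier(string):
    """Rewrite the bot's reply before it is displayed."""
    # Same idea as the calculator example earlier in the thread: find the
    # marker block, evaluate the arithmetic, and substitute the result.
    # A real extension should validate the expression instead of eval()-ing it.
    def evaluate(match):
        result = eval(match.group(1), {"__builtins__": {}}, {})
        return f"###Result:\n{result}\n###"
    return CALC_BLOCK.sub(evaluate, string)
```

Re-submitting that result to the model for another turn is the part that would still need either the shared-state route described above or a dedicated hook.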

For this to be stable, I think we'd really need the session leakage issue fixed, but that won't stop me from playing around with this!

You might also check out LangChain, which looks to be a generalized solution for what you're talking about: the ability for the model to call out to services, get responses, and then take actions accordingly. There's already an issue to add LangChain support, but I still think it makes sense for extensions to support a hook, or to have some other way to do what you're suggesting. Longer term, LangChain is going to be huge, but that solution needs to be a lot more robust than what you and I are looking for right now.

elephantpanda commented 1 year ago

Hmm... perhaps it might just be a UI/documentation issue then.

It would be cool if the plugin interface was similar to the ChatGPT plugin interface.

It would help if the documentation on how to make plugins were improved, and if there were a standardised way of doing it.

knoopx commented 1 year ago

Also found this: https://github.com/mpaepper/llm_agents/. It might be useful as a minimal implementation reference instead of going all in with LangChain.
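
For what it's worth, the core loop in that kind of minimal agent implementation is roughly the following (a sketch of the general pattern, not code taken from llm_agents or LangChain; `generate` stands in for whatever call produces text from a prompt, and the marker format reuses the calculator example from earlier in the thread):

```python
import re

# A tool call looks like "###Plugin:Calc:\n2345+1232142\n###" in the model output.
TOOL_CALL = re.compile(r"###Plugin:(\w+):\s*(.+?)\s*###", re.DOTALL)

def run_with_tools(prompt, generate, tools, max_rounds=5):
    """Generate, execute any requested tool, feed the result back, repeat."""
    transcript = prompt
    for _ in range(max_rounds):
        output = generate(transcript)
        match = TOOL_CALL.search(output)
        if match is None:
            return transcript + output  # no tool requested, we're done
        name, argument = match.group(1), match.group(2)
        result = tools[name](argument)  # e.g. tools = {"Calc": safe_calc}
        # Keep the text before the tool call, splice in the result, and let
        # the model continue from there on the next round.
        transcript += output[:match.start()] + f"###Result:\n{result}\n###\n"
    return transcript
```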

github-actions[bot] commented 11 months ago

This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.