bakks / butterfish

A shell with AI superpowers
https://butterfi.sh
MIT License

text-generation-webui #20 (label: question)

Closed: txirrindulari closed this issue 7 months ago

txirrindulari commented 8 months ago

Is there any way to connect butterfish to text-generation-webui so it can use a local LLM?

bakks commented 8 months ago

Hey @txirrindulari, I've just pushed a change that lets you set a base URL for the model API. In other words, it allows you to point butterfish at a server other than the OpenAI API. In theory text-generation-webui supports the OpenAI API, but I haven't been able to get models working in text-generation-webui to test this out.

If you want to try this, you can install butterfish from the latest commit (I haven't released it yet) and then set the base URL. Something like:

go install github.com/bakks/butterfish/cmd/butterfish@latest
$(go env GOPATH)/bin/butterfish shell -u http://localhost:1234/v1 -A

Assuming that you're running text-generation-webui at port 1234, etc.
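
If the OpenAI-compatible API is up, a quick sanity check of the endpoint before launching the shell could look like the commands below. The port and base URL are the ones from the example above, and this assumes the server implements the standard OpenAI-style routes; "local-model" is just a placeholder name:

# list the models the endpoint serves
curl -s http://localhost:1234/v1/models

# send a minimal chat completion request ("local-model" is a placeholder)
curl -s http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "local-model", "messages": [{"role": "user", "content": "hello"}]}'

If both of these return JSON instead of a 404, pointing butterfish's -u flag at the same base URL should work.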

JeremyBickel commented 7 months ago

I just tried it with:

$(go env GOPATH)/bin/butterfish shell -u http://127.0.0.1:7860 -A

It instantly 404'd: "Error prompting LLM: error, status code: 404, message: %!s()"

JeremyBickel commented 7 months ago

You have to run text-generation-webui (server.py) with --extensions openai, then point butterfish at http://127.0.0.1:5000/v1 (the extension's default port). Here are the docs for the openai extension; they say to use --api, but that's old: https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API

Example:

go/bin/butterfish summarize .bashrc -u "http://127.0.0.1:5000/v1" -vv

Don't forget the quotes around the address.
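
Putting the two halves together, the whole flow is roughly this (the flags and port are the ones mentioned above; the python invocation and binary paths are illustrative, so adjust them for your setup):

# start text-generation-webui with its OpenAI-compatible API extension
python server.py --extensions openai

# in another terminal, point butterfish at the extension's default port
$(go env GOPATH)/bin/butterfish shell -u "http://127.0.0.1:5000/v1" -A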

Anyhow... This thing IS working! I'm so glad. THIS is AI, to me, anyways. It's so useful!

bakks commented 7 months ago

That's awesome @JeremyBickel, thanks for testing and adding your commands! Will make a note of it in the readme.