oobabooga removed the old api support. I have a fork of oobabot up with barebones support for the new api, but it's not the plugin version.
I got it working now and the model loads, but it outputs this:
An empty response was received from Oobabooga. Please check that the AI is running properly on the Oobabooga server at ws://127.0.0.1:5005
Edit: Got a new output:
connection handler failed
Traceback (most recent call last):
  File "E:\OOBABOOBA\text-generation-webui\installer_files\env\Lib\site-packages\websockets\legacy\server.py", line 240, in handler
    await self.ws_handler(self)
  File "E:\OOBABOOBA\text-generation-webui\installer_files\env\Lib\site-packages\websockets\legacy\server.py", line 1186, in _ws_handler
    return await cast(
  File "E:\OOBABOOBA\text-generation-webui\extensions\api\streaming_api.py", line 92, in _handle_connection
    await _handle_stream_message(websocket, message)
  File "E:\OOBABOOBA\text-generation-webui\extensions\api\util.py", line 155, in api_wrapper
    return await func(*args, **kwargs)
  File "E:\OOBABOOBA\text-generation-webui\extensions\api\streaming_api.py", line 37, in _handle_stream_message
    for a in generator:
  File "E:\OOBABOOBA\text-generation-webui\modules\text_generation.py", line 32, in generate_reply
    for result in _generate_reply(*args, **kwargs):
  File "E:\OOBABOOBA\text-generation-webui\modules\text_generation.py", line 87, in _generate_reply
    for reply in generate_func(question, original_question, seed, state, stopping_strings, is_chat=is_chat):
  File "E:\OOBABOOBA\text-generation-webui\modules\text_generation.py", line 289, in generate_reply_HF
    generate_params[k] = state[k]
KeyError: 'dynamic_temperature'
You'll need to change to the new OpenAI API on oobabooga. It's an OpenAI-compliant one, but I haven't figured out streaming yet.
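For reference, a request against the new OpenAI-compatible endpoint looks roughly like the sketch below. This is a minimal illustration, assuming the default API port 5000 and the standard /v1/chat/completions route; adjust host, port, and parameters to your setup.

```python
# Minimal sketch: querying text-generation-webui's OpenAI-compatible API.
# Assumes the server was started with the openai extension / --api and is
# listening on the default port 5000 (not the old ws:// port 5005).
import requests

url = "http://127.0.0.1:5000/v1/chat/completions"
payload = {
    "messages": [{"role": "user", "content": "Hello, are you running?"}],
    "max_tokens": 200,
}

response = requests.post(url, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```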
Yeah, but with the new API it won't work or start. From what I read, the new API uses that OpenAI API thing.
tried --openai or --extensions openai ?
yep, but still not working
I'd be happy if you could give me a step-by-step installation guide, because the one on the main page gives me errors every time.
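Not a full step-by-step guide, but for context: with the one-click installer, extra launch flags usually go into CMD_FLAGS.txt next to start_windows.bat. A minimal sketch (using the flag discussed in this thread; on recent builds --api should do the same thing):

```
# CMD_FLAGS.txt -- extra flags the one-click launcher passes to server.py
--extensions openai
```

With that enabled, the OpenAI-compatible endpoint should listen on port 5000 by default rather than the old ws:// port 5005.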
tried --openai or --extensions openai ?
and if I do that, the only thing that happens is this:
Reason: Cannot connect to host localhost:5005 ssl:default [The remote computer refused the network connection]
Unfortunately yes, because the old API is broken now. Take a peek at the --generate-config option for my oobabot fork; there SHOULD be a new option (or options) there for "use openai" and "openai url". You can copy those into your existing config.
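For illustration only, the openai-related fields would end up looking something like this in the generated config; the key names below are guesses, so run --generate-config and copy the real ones:

```yaml
# Hypothetical excerpt of an oobabot config.yml -- field names are illustrative,
# not confirmed; use --generate-config to see the actual ones.
oobabooga:
  use_openai: true                        # switch from the old ws:// API to the OpenAI-compatible one
  openai_url: http://localhost:5000/v1    # base URL of the new endpoint
```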
If this doesn't work, you can alternatively just stay on the last working oobabooga webui commit: https://github.com/oobabooga/text-generation-webui/commit/8ea3f316012e6befe6a852501ce158a478c8e680
Well, I know that this will create a config.yaml. I have that config.yaml in the main Oobabooga folder and edited and selected it with cmd commands. For connecting with oobabooga, it shows me this:
Base URL for the oobabooga instance. This should be ws://hostname[:port] for plain
websocket connections, or wss://hostname[:port] for websocket connections over TLS.
  default: ws://localhost:5005
base_url:
and when I edited it, it gives me this error: Cannot connect to host localhost:5005 ssl:default [The remote system refused the network connection]
mine works with latest oobabooga :) https://github.com/jakobdylanc/discord-llm-chatbot
Well, I know that this will create a config.yaml. I have that config.yaml in the main Oobabooga folder and edited and selected it with cmd commands. For connecting with oobabooga, it shows me this...
We aren't connecting to oobabooga anymore. I advised using the generate-config command so that you can see the new configuration fields and add them to your existing config yourself, or copy over your old parameters. I just pushed a new commit to https://github.com/jmoney7823956789378/oobabot that should fix the OpenAI streaming.
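For reference, regenerating a config to see those fields looks roughly like this, assuming the fork keeps oobabot's --generate-config behaviour of printing the YAML to stdout:

```
# run inside the environment where the oobabot fork is installed
oobabot --generate-config > config.new.yml
# then diff config.new.yml against your existing config and copy over the new fields
```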
mine works with latest oobabooga :) https://github.com/jakobdylanc/discord-llm-chatbot
Oh, well, imma try that out, @jakobdylanc. Thanks for the info.
After installing the oobabot thing, Oobabooga won't start anymore. Cmd outputs this:
2024-01-19 05:21:18,804 DEBUG oobabot_plugin: inside Oobabooga, using script.py version: 0.1.8
2024-01-19 05:21:18,805 DEBUG oobabot_plugin version: 0.2.3
2024-01-19 05:21:18,805 DEBUG oobabot version: 0.2.3
╭───────────────────────────────────────── Traceback (most recent call last) ──────────────────────────────────────────╮
│ E:\OOBABOOBA\text-generation-webui\server.py:254 in <module>                                                          │
│ │
│ 253 # Launch the web UI │
│ ❱ 254 create_interface() │
│ 255 while True: │
│ │
│ E:\OOBABOOBA\text-generation-webui\server.py:156 in create_interface │
│ │
│ 155 │
│ ❱ 156 extensions_module.create_extensions_tabs() # Extensions tabs │
│ 157 extensions_module.create_extensions_block() # Extensions block │
│ │
│ E:\OOBABOOBA\text-generation-webui\modules\extensions.py:208 in create_extensions_tabs │
│ │
│ 207 with gr.Tab(display_name, elem_classes="extension-tab"): │
│ ❱ 208 extension.ui() │
│ 209 │
│ │
│ E:\OOBABOOBA\text-generation-webui\extensions\oobabot\script.py:28 in ui │
│ │
│ 27 """ │
│ ❱ 28 bootstrap.plugin_ui( │
│ 29 script_py_version=SCRIPT_PY_VERSION, │
│ │
│ E:\OOBABOOBA\text-generation-webui\installer_files\env\Lib\site-packages\oobabot_plugin\bootstrap.py:86 in plugin_ui │
│ │
│ 85 api_extension_loaded = False │
│ ❱ 86 if shared.args.api_streaming_port: │
│ 87 streaming_port = shared.args.api_streaming_port │
╰──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
AttributeError: 'Namespace' object has no attribute 'api_streaming_port'
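For what it's worth, the crash happens because recent webui builds dropped the old api extension, so shared.args no longer has an api_streaming_port attribute and the plain attribute access in bootstrap.py raises. A defensive lookup along these lines (a hypothetical local patch, not an official fix) would at least let the UI start:

```python
# Hypothetical sketch of a local patch around oobabot_plugin/bootstrap.py line 86.
# getattr() with a default avoids the AttributeError on webui builds that no
# longer define api_streaming_port (the old api extension was removed).
from modules import shared  # already available in bootstrap.py; shown for a self-contained sketch

streaming_port = getattr(shared.args, "api_streaming_port", None)
api_extension_loaded = bool(streaming_port)
```

The real fix is the plugin catching up with the new OpenAI-compatible API, as discussed above.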