henk717 / KoboldAI

KoboldAI is generative AI software optimized for fictional use, but capable of much more!
http://koboldai.com
GNU Affero General Public License v3.0

If the OpenAI API decides to send an empty response, UI2 errors and loads forever. #268

Open TheZennou opened 1 year ago

TheZennou commented 1 year ago

This is a pretty common thing for Davinci-003 to do: if it feels like not continuing, it won't, and it expects the user to add newlines or remove periods if they want the writing to continue. This seems to cause an issue with Kobold, however.

The error can be seen in the following hastebin: https://hastebin.com/raw/ogufuvinet

Steps to reproduce: load the OpenAI API as the model on UI2 and try to get the API to send an empty response; in my case I used a prompt like "Write the end of a story".

Debug Dump: https://files.catbox.moe/ie6aes.json
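For reference, the empty completion can be reproduced outside Kobold by hitting the completions endpoint directly. A minimal sketch, assuming the pre-1.0 openai Python client (the exact call KoboldAI makes may differ):

# Sketch only: provoke an empty completion from text-davinci-003 directly,
# assuming the pre-1.0 `openai` Python client.
import openai

openai.api_key = "sk-..."  # your key here

resp = openai.Completion.create(
    model="text-davinci-003",
    prompt="Write the end of a story",
    max_tokens=80,
)

# choices[0]["text"] can legitimately come back as "" when the model decides
# it is done; that empty string is what trips up UI2.
print(repr(resp["choices"][0]["text"]))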

ebolam commented 1 year ago

The error link (hastebin) isn't working for me, and the debug dump points at a VERY generic function that doesn't quite make sense.

It looks like we have something somewhere that is trying to access the last character in the string, and since the string is empty it's dying. It'll be an easy fix if I can figure out which line is causing it. Could you perhaps send the line that's causing the issue, or attach a screenshot of the console log to this issue, assuming you can do so without leaking anything sensitive? :)
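To illustrate the failure mode I mean (illustration only, not actual KoboldAI code):

text = ""             # empty completion coming back from the API
last_char = text[-1]  # IndexError: string index out of range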

TheZennou commented 1 year ago

Huh, weird that the hastebin decided to die. I'll just paste it in GitHub to avoid that.

INFO       | __main__:get_model_info:1662 - Selected: OAI, 
INIT       | Retrieving | OAI Engines
INIT       | OK         | OAI Engines
loading Model
gpt2 n

Downloading (…)lve/main/config.json: 100%|██████| 665/665 [00:00<00:00, 179kB/s]

Downloading (…)olve/main/vocab.json:   0%|          | 0.00/1.04M [00:00<?, ?B/s]
Downloading (…)olve/main/vocab.json: 100%|█| 1.04M/1.04M [00:00<00:00, 9.31MB/s]

Downloading (…)olve/main/merges.txt: 100%|███| 456k/456k [00:00<00:00, 6.64MB/s]

Downloading (…)/main/tokenizer.json:   0%|          | 0.00/1.36M [00:00<?, ?B/s]
Downloading (…)/main/tokenizer.json: 100%|█| 1.36M/1.36M [00:00<00:00, 8.68MB/s]
INIT       | Starting   | LUA bridge
INIT       | OK         | LUA bridge
INIT       | Starting   | LUA Scripts
INIT       | OK         | LUA Scripts
Setting Seed
ERROR      | __main__:g:592 - An error has been caught in function 'g', process 'MainProcess' (117), thread 'MainThread' (139811631593280):
Traceback (most recent call last):

  File "/notebooks/runtime/envs/koboldai/lib/python3.8/site-packages/eventlet/green/thread.py", line 43, in __thread_body
    func(*args, **kwargs)
    │     │       └ {}
    │     └ ()
    └ <bound method Thread._bootstrap of <Thread(Thread-80, started daemon 139806610537344)>>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/threading.py", line 890, in _bootstrap
    self._bootstrap_inner()
    │    └ <function start_new_thread.<locals>.wrap_bootstrap_inner at 0x7f27435da670>
    └ <Thread(Thread-80, started daemon 139806610537344)>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/site-packages/eventlet/green/thread.py", line 64, in wrap_bootstrap_inner
    bootstrap_inner()
    └ <bound method Thread._bootstrap_inner of <Thread(Thread-80, started daemon 139806610537344)>>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
    │    └ <function Thread.run at 0x7f286cb400d0>
    └ <Thread(Thread-80, started daemon 139806610537344)>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
    │    │        │    │        │    └ {}
    │    │        │    │        └ <Thread(Thread-80, started daemon 139806610537344)>
    │    │        │    └ (<socketio.server.Server object at 0x7f272cd54d90>, 'N5CPYn7do5R5wggUAAAG', '5N2YsMKY9dYG64liAAAF', ['submit', {'data': '', '...
    │    │        └ <Thread(Thread-80, started daemon 139806610537344)>
    │    └ <bound method Server._handle_event_internal of <socketio.server.Server object at 0x7f272cd54d90>>
    └ <Thread(Thread-80, started daemon 139806610537344)>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/site-packages/socketio/server.py", line 731, in _handle_event_internal
    r = server._trigger_event(data[0], namespace, sid, *data[1:])
        │      │              │        │          │     └ ['submit', {'data': '', 'theme': ''}]
        │      │              │        │          └ 'N5CPYn7do5R5wggUAAAG'
        │      │              │        └ '/'
        │      │              └ ['submit', {'data': '', 'theme': ''}]
        │      └ <function Server._trigger_event at 0x7f272d88fe50>
        └ <socketio.server.Server object at 0x7f272cd54d90>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/site-packages/socketio/server.py", line 756, in _trigger_event
    return self.handlers[namespace][event](*args)
           │    │        │          │       └ ('N5CPYn7do5R5wggUAAAG', {'data': '', 'theme': ''})
           │    │        │          └ 'submit'
           │    │        └ '/'
           │    └ {'/': {'get_model_info': <function get_model_info at 0x7f272c3fa0d0>, 'OAI_Key_Update': <function get_oai_models at 0x7f272c3...
           └ <socketio.server.Server object at 0x7f272cd54d90>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/site-packages/flask_socketio/__init__.py", line 282, in _handler
    return self._handle_event(handler, message, namespace, sid,
           │    │             │        │        │          └ 'N5CPYn7do5R5wggUAAAG'
           │    │             │        │        └ '/'
           │    │             │        └ 'submit'
           │    │             └ <function UI_2_submit at 0x7f272c3ae0d0>
           │    └ <function SocketIO._handle_event at 0x7f272d3d8820>
           └ <flask_socketio.SocketIO object at 0x7f272cd54ca0>
  File "/notebooks/runtime/envs/koboldai/lib/python3.8/site-packages/flask_socketio/__init__.py", line 828, in _handle_event
    ret = handler(*args)
          │        └ ({'data': '', 'theme': ''},)
          └ <function UI_2_submit at 0x7f272c3ae0d0>

> File "aiserver.py", line 592, in g
    return f(*a, **k)
           │  │    └ {}
           │  └ ({'data': '', 'theme': ''},)
           └ <function UI_2_submit at 0x7f272c3acdc0>

  File "aiserver.py", line 8655, in UI_2_submit
    actionsubmit(data['data'], actionmode=koboldai_vars.actionmode)
    │            │                        └ <koboldai_settings.koboldai_vars object at 0x7f272ccca4c0>
    │            └ {'data': '', 'theme': ''}
    └ <function actionsubmit at 0x7f272c40ab80>

  File "aiserver.py", line 4985, in actionsubmit
    calcsubmit("")
    └ <function calcsubmit at 0x7f272c38d160>

  File "aiserver.py", line 5388, in calcsubmit
    generate(subtxt, min, max, found_entries)
    │        │       │    │    └ set()
    │        │       │    └ 189
    │        │       └ 109
    │        └ [464, 886, 286, 257, 1621, 628, 198, 464, 886, 286, 262, 1621, 318, 1690, 262, 749, 19201, 636, 13, 220, 1026, 338, 262, 2589...
    └ <function generate at 0x7f272c38daf0>

  File "aiserver.py", line 6342, in generate
    koboldai_vars.lua_koboldbridge.generated[i+1][koboldai_vars.generated_tkns] = int(genout[i, -1].item())
    │                                        │    │                                   │      └ 0
    │                                        │    │                                   └ array([], shape=(1, 0), dtype=float64)
    │                                        │    └ <koboldai_settings.koboldai_vars object at 0x7f272ccca4c0>
    │                                        └ 0
    └ <koboldai_settings.koboldai_vars object at 0x7f272ccca4c0>

IndexError: index -1 is out of bounds for axis 1 with size 0

I also find it weird that it's downloading GPT-2 files, despite me selecting Davinci as my model.
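If it helps, a guard around the failing line in generate() (aiserver.py:6342 in this build) would avoid indexing into the empty array. A sketch only, not an official fix; it assumes genout is array-like with a .shape and that the module's loguru logger is in scope:

# Possible guard for the failing line in generate(); sketch only, assumptions as above.
if genout.shape[1] > 0:
    koboldai_vars.lua_koboldbridge.generated[i + 1][koboldai_vars.generated_tkns] = int(
        genout[i, -1].item()
    )
else:
    # The API returned zero tokens; skip the Lua-bridge bookkeeping instead of
    # raising IndexError on an empty array.
    logger.warning("Empty completion received from the OpenAI API")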