KoboldAI / KoboldAI-Client

For GGUF support, see KoboldCPP: https://github.com/LostRuins/koboldcpp
https://koboldai.com
GNU Affero General Public License v3.0

Trying to load "mayaeary_pygmalion-6b_dev-4bit-128g" model fails #290

Open NetAndif opened 1 year ago

NetAndif commented 1 year ago

Hello folks, I am new to KoboldAI. I wanted to install and use the model https://huggingface.co/mayaeary/pygmalion-6b_dev-4bit-128g
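For reference, one way to pull that repository into KoboldAI's models folder is huggingface_hub's snapshot_download. This is a minimal sketch, assuming huggingface_hub is installed and that the target path matches the models directory that appears later in the log; it is not the only or official way KoboldAI expects models to be installed:

```python
# Minimal sketch: fetch the Hugging Face repo into KoboldAI's models folder.
# Assumes a recent huggingface_hub (local_dir parameter); the target path below
# mirrors the directory shown in the traceback and is only an example.
from huggingface_hub import snapshot_download

snapshot_download(
    repo_id="mayaeary/pygmalion-6b_dev-4bit-128g",
    local_dir=r"D:\AI\KoboldAI\models\mayaeary_pygmalion-6b_dev-4bit-128g",
)
```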

Here is my console output:

Runtime launching in B: drive mode
INIT       | Starting   | Flask
INIT       | OK         | Flask
INIT       | Starting   | Webserver
INIT       | Starting   | LUA bridge
INIT       | OK         | LUA bridge
INIT       | Starting   | LUA Scripts
INIT       | OK         | LUA Scripts
INIT       | OK         | Webserver
MESSAGE    | Webserver started! You may now connect with a browser at http://127.0.0.1:5000
INFO       | __main__:do_connect:3542 - Client connected!
Traceback (most recent call last):
  File "B:\python\lib\site-packages\eventlet\hubs\selects.py", line 59, in wait
    listeners.get(fileno, hub.noop).cb(fileno)
  File "B:\python\lib\site-packages\eventlet\greenthread.py", line 221, in main
    result = function(*args, **kwargs)
  File "B:\python\lib\site-packages\eventlet\wsgi.py", line 837, in process_request
    proto.__init__(conn_state, self)
  File "B:\python\lib\site-packages\eventlet\wsgi.py", line 352, in __init__
    self.finish()
  File "B:\python\lib\site-packages\eventlet\wsgi.py", line 751, in finish
    BaseHTTPServer.BaseHTTPRequestHandler.finish(self)
  File "B:\python\lib\socketserver.py", line 811, in finish
    self.wfile.close()
  File "B:\python\lib\socket.py", line 687, in write
    return self._sock.send(b)
  File "B:\python\lib\site-packages\eventlet\greenio\base.py", line 401, in send
    return self._send_loop(self.fd.send, data, flags)
  File "B:\python\lib\site-packages\eventlet\greenio\base.py", line 388, in _send_loop
    return send_method(data, *args)
ConnectionAbortedError: [WinError 10053] An established connection was aborted by the software in your host machine
Removing descriptor: 1776
INFO       | __main__:do_connect:3542 - Client connected!
INIT       | Searching  | GPU support
INIT       | Found      | GPU support
INIT       | Starting   | Transformers
INIT       | Info       | Final device configuration:
       DEVICE ID  |  LAYERS  |  DEVICE NAME
               0  |      28  |  NVIDIA GeForce RTX 3090
             N/A  |       0  |  (Disk cache)
             N/A  |       0  |  (CPU)
You are using a model of type gptj to instantiate a model of type gpt_neo. This is not supported for all configurations of models and can yield errors.
Exception in thread Thread-17:
Traceback (most recent call last):
  File "aiserver.py", line 2555, in load_model
    model     = AutoModelForCausalLM.from_pretrained(vars.custmodpth, revision=args.revision, cache_dir="cache", **lowmem)
  File "B:\python\lib\site-packages\transformers\models\auto\auto_factory.py", line 463, in from_pretrained
    return model_class.from_pretrained(
  File "aiserver.py", line 1823, in new_from_pretrained
    return old_from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
  File "B:\python\lib\site-packages\transformers\modeling_utils.py", line 2047, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory D:\AI\KoboldAI\models\mayaeary_pygmalion-6b_dev-4bit-128g.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "B:\python\lib\threading.py", line 932, in _bootstrap_inner
    self.run()
  File "B:\python\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "B:\python\lib\site-packages\socketio\server.py", line 731, in _handle_event_internal
    r = server._trigger_event(data[0], namespace, sid, *data[1:])
  File "B:\python\lib\site-packages\socketio\server.py", line 756, in _trigger_event
    return self.handlers[namespace][event](*args)
  File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 282, in _handler
    return self._handle_event(handler, message, namespace, sid,
  File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 828, in _handle_event
    ret = handler(*args)
  File "aiserver.py", line 466, in g
    return f(*a, **k)
  File "aiserver.py", line 3915, in get_message
    load_model(use_gpu=msg['use_gpu'], gpu_layers=msg['gpu_layers'], disk_layers=msg['disk_layers'], online_model=msg['online_model'])
  File "aiserver.py", line 2559, in load_model
    model     = GPTNeoForCausalLM.from_pretrained(vars.custmodpth, revision=args.revision, cache_dir="cache", **lowmem)
  File "aiserver.py", line 1823, in new_from_pretrained
    return old_from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
  File "B:\python\lib\site-packages\transformers\modeling_utils.py", line 2047, in from_pretrained
    raise EnvironmentError(
OSError: Error no file named pytorch_model.bin, tf_model.h5, model.ckpt.index or flax_model.msgpack found in directory D:\AI\KoboldAI\models\mayaeary_pygmalion-6b_dev-4bit-128g.
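The OSError lists the only checkpoint file names that transformers' from_pretrained will accept here, and a 4-bit GPTQ model such as this one typically ships a single quantized checkpoint (often a *.pt or *.safetensors file) instead of pytorch_model.bin. Below is a minimal diagnostic sketch, assuming only that the model folder from the traceback exists; the helper name check_model_dir is made up for illustration and is not part of KoboldAI:

```python
# Minimal sketch: list what is actually in the model folder and compare it against
# the checkpoint names quoted in the OSError above.
from pathlib import Path

EXPECTED = {"pytorch_model.bin", "tf_model.h5", "model.ckpt.index", "flax_model.msgpack"}

def check_model_dir(path: str) -> None:
    files = {p.name for p in Path(path).iterdir() if p.is_file()}
    found = files & EXPECTED
    if found:
        print("Standard checkpoint present:", sorted(found))
    else:
        # A GPTQ 4-bit model usually ships a quantized *.pt/*.safetensors file instead,
        # which stock from_pretrained does not recognize, hence the OSError above.
        print("No standard checkpoint found. Files present:", sorted(files))

check_model_dir(r"D:\AI\KoboldAI\models\mayaeary_pygmalion-6b_dev-4bit-128g")
```
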
RevolverRalf commented 1 year ago

Same issue for me :(