0cc4m / KoboldAI

GNU Affero General Public License v3.0
151 stars 30 forks

I cannot load any AI models and keep getting this error no matter what I do. It started after I ran `git pull` on this repository #50

Open 0xYc0d0ne opened 1 year ago

0xYc0d0ne commented 1 year ago

Exception in thread Thread-14:
Traceback (most recent call last):
  File "B:\python\lib\threading.py", line 932, in _bootstrap_inner
    self.run()
  File "B:\python\lib\threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File "B:\python\lib\site-packages\socketio\server.py", line 731, in _handle_event_internal
    r = server._trigger_event(data[0], namespace, sid, *data[1:])
  File "B:\python\lib\site-packages\socketio\server.py", line 756, in _trigger_event
    return self.handlers[namespace][event](*args)
  File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 282, in _handler
    return self._handle_event(handler, message, namespace, sid,
  File "B:\python\lib\site-packages\flask_socketio\__init__.py", line 828, in _handle_event
    ret = handler(*args)
  File "aiserver.py", line 615, in g
    return f(*a, **k)
  File "aiserver.py", line 3191, in get_message
    load_model(use_gpu=msg['use_gpu'], gpu_layers=msg['gpu_layers'], disk_layers=msg['disk_layers'], online_model=msg['online_model'])
  File "aiserver.py", line 1980, in load_model
    model.load(
  File "C:\KoboldAI\modeling\inference_model.py", line 177, in load
    self._load(save_model=save_model, initial_load=initial_load)
  File "C:\KoboldAI\modeling\inference_models\hf_torch_4bit.py", line 198, in _load
    self.model = self._get_model(self.get_local_model_path(), tf_kwargs)
  File "C:\KoboldAI\modeling\inference_models\hf_torch_4bit.py", line 378, in _get_model
    model = load_quant_offload(llama_load_quant, utils.koboldai_vars.custmodpth, path_4bit, utils.koboldai_vars.gptq_bits, groupsize, self.gpu_layers_list, force_bias=v2_bias)
TypeError: load_quant_offload() got an unexpected keyword argument 'force_bias'
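The failure mode here is a stale dependency: after `git pull`, the updated KoboldAI code calls `load_quant_offload()` with a new `force_bias` keyword, but the older gptq module still installed on disk does not define that parameter, so Python raises the `TypeError` before any model loading begins. A minimal sketch of the mismatch (function bodies and argument values are hypothetical stand-ins; only the signatures matter):

```python
# Old gptq module: the signature has no `force_bias` parameter.
def load_quant_offload(load_fn, model_path, path_4bit, bits, groupsize, gpu_layers):
    return "model loaded (old API)"

# The updated caller passes the new keyword, which the old signature rejects.
try:
    load_quant_offload(None, "llama-7b", "4bit.safetensors", 4, 128, [28],
                       force_bias=False)
except TypeError as exc:
    print(exc)  # load_quant_offload() got an unexpected keyword argument 'force_bias'

# After updating the module, the signature accepts (and defaults) the keyword.
def load_quant_offload_updated(load_fn, model_path, path_4bit, bits, groupsize,
                               gpu_layers, force_bias=False):
    return "model loaded (new API)"

print(load_quant_offload_updated(None, "llama-7b", "4bit.safetensors", 4, 128, [28],
                                 force_bias=False))
```

This is why the error appears only after pulling: the repository code moved ahead of the locally installed module.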

0cc4m commented 1 year ago

You need to update the gptq module. Run install_requirements again.
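In concrete terms, that means re-running the repository's dependency installer from the checkout directory so the gptq module matches the pulled code. A sketch of the steps, assuming the Windows paths shown in the traceback (adjust for your own install; on Linux the equivalent script is typically `install_requirements.sh`):

```shell
cd /d C:\KoboldAI         REM change to the KoboldAI checkout (cmd.exe syntax)
git pull                  REM make sure the sources are current
install_requirements.bat  REM reinstall dependencies, updating the gptq module
```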