File "C:\...\GitHub\ComfyUI_windows_portable\ComfyUI\execution.py", line 152, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\...\GitHub\ComfyUI_windows_portable\ComfyUI\execution.py", line 82, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\...\GitHub\ComfyUI_windows_portable\ComfyUI\execution.py", line 75, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\...\GitHub\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI-N-Nodes\py\gptcpp_node.py", line 51, in load_gpt_checkpoint
llm = Llama(model_path=ckpt_path,n_gpu_layers=gpu_layers,verbose=False,n_threads=n_threads, n_ctx=max_ctx, )
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "C:\...\GitHub\ComfyUI_windows_portable\python_embeded\Lib\site-packages\llama_cpp\llama.py", line 328, in __init__
assert self.model is not None
^^^^^^^^^^^^^^^^^^^^^^
I have everything up to date and I've run Update All. I've saved the GGUF file here: "C:...\GitHub\ComfyUI_windows_portable\ComfyUI\models\GPTcheckpoints\IFpromptMKR-7b-q4_k_m.gguf"
Not sure what I'm doing wrong. Thanks for any help!
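In case it helps narrow things down, this is a minimal check I can run with the portable Python (python_embeded\python.exe) to see whether llama-cpp-python can open the GGUF at all, outside of ComfyUI. The path is a placeholder for my local one, and the n_ctx / n_gpu_layers values are just arbitrary test values, not what the node uses:

# Minimal sanity check: can llama-cpp-python itself open the GGUF
# outside of ComfyUI? Save as e.g. test_gguf.py and run with the
# portable Python so the same llama_cpp build is used.
from llama_cpp import Llama

# Placeholder for my local path under ComfyUI\models\GPTcheckpoints
MODEL_PATH = r"C:\path\to\ComfyUI\models\GPTcheckpoints\IFpromptMKR-7b-q4_k_m.gguf"

try:
    # Same kind of call the node makes; verbose=True so llama.cpp
    # prints why the load fails (bad path, unsupported/corrupt GGUF, ...)
    llm = Llama(model_path=MODEL_PATH, n_ctx=2048, n_gpu_layers=0, verbose=True)
    print("Model loaded fine outside ComfyUI")
except AssertionError:
    print("Same 'assert self.model is not None' failure: the file or the llama_cpp build is the problem, not the node")

If this fails the same way, the issue is with the GGUF file or the bundled llama_cpp rather than with the N-Nodes loader.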