kijai / ComfyUI-KwaiKolorsWrapper

Diffusers wrapper to run Kwai-Kolors model
Apache License 2.0

Error occurred when executing LoadChatGLM3 #16

Open HEITAOKAKA opened 1 month ago

HEITAOKAKA commented 1 month ago

Error occurred when executing LoadChatGLM3:

Only Tensors of floating point and complex dtype can require gradients

File "I:\AI\ComfyUI\execution.py", line 151, in recursive_execute
    output_data, output_ui = get_output_data(obj, input_data_all)
File "I:\AI\ComfyUI\execution.py", line 81, in get_output_data
    return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "I:\AI\ComfyUI\execution.py", line 74, in map_node_over_list
    results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\nodes.py", line 124, in loadmodel
    text_encoder.quantize(8)
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\kolors\models\modeling_chatglm.py", line 852, in quantize
    quantize(self.encoder, weight_bit_width)
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\kolors\models\quantization.py", line 155, in quantize
    layer.self_attention.query_key_value = QuantizedLinear(
File "I:\AI\ComfyUI\custom_nodes\ComfyUI-KwaiKolorsWrapper\kolors\models\quantization.py", line 141, in __init__
    self.weight = Parameter(self.weight.to(device), requires_grad=False)
File "I:\AI\ComfyUI.ext\Lib\site-packages\torch\nn\modules\module.py", line 1726, in __setattr__
    self.register_parameter(name, value)
File "I:\AI\ComfyUI.ext\Lib\site-packages\accelerate\big_modeling.py", line 123, in register_empty_parameter
    module._parameters[name] = param_cls(module._parameters[name].to(device), **kwargs)
File "I:\AI\ComfyUI.ext\Lib\site-packages\torch\nn\parameter.py", line 40, in __new__
    return torch.Tensor._make_subclass(cls, data, requires_grad)
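For context on what the traceback is complaining about: PyTorch's autograd only tracks floating-point and complex tensors, so an int8 weight (the output of 8-bit quantization) can never have `requires_grad=True`. `QuantizedLinear` passes `requires_grad=False` explicitly, but the traceback suggests accelerate's empty-weights hook rebuilds the parameter and loses that flag. A minimal sketch of the underlying constraint (plain PyTorch, independent of this repo's code):

```python
import torch
from torch.nn import Parameter

# An int8 weight, as produced by 8-bit quantization.
int8_weight = torch.zeros(4, 4, dtype=torch.int8)

# Parameter defaults to requires_grad=True, which is illegal for int8:
try:
    Parameter(int8_weight)
except RuntimeError as e:
    print(e)  # Only Tensors of floating point and complex dtype can require gradients

# Passing requires_grad=False, as QuantizedLinear does, is fine:
p = Parameter(int8_weight, requires_grad=False)
print(p.dtype)  # torch.int8
```

This is why the error only appears when the quantized text encoder is constructed under accelerate's meta-device initialization, not when `QuantizedLinear` builds the parameter directly.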

touchwolf commented 1 month ago

[screenshot attached: screenshot-20240717-152649]

I encountered the same issue. I resolved it by swapping the "Load ChatGLM3 Model" node for the "(Down)load ChatGLM3 Model" node, without changing the model save location. After that change, everything worked fine.