kijai / ComfyUI-HunyuanVideoWrapper


Error: Float8_e4m3fn dtype not supported on MPS backend when loading Hunyuan model #14

Closed 6worm9 closed 10 hours ago

6worm9 commented 11 hours ago

Description

When trying to run the example workflow with the Hunyuan model on macOS with the MPS backend, I hit a dtype conversion error: Float8_e4m3fn is not supported by MPS.

Environment

Steps to Reproduce

  1. Load example workflow
  2. Run workflow
  3. Error occurs at the HyVideoModelLoader node

Error Message

HyVideoModelLoader
Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.
Detailed Error Traceback

```
Using accelerate to load and assign model weights to device...
!!! Exception during processing !!! Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.
Traceback (most recent call last):
  File "/Users/username/ComfyUI/execution.py", line 323, in execute
    output_data, output_ui, has_subgraph = get_output_data(obj, input_data_all, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/username/ComfyUI/execution.py", line 198, in get_output_data
    return_values = _map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True, execution_block_cb=execution_block_cb, pre_execute_cb=pre_execute_cb)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/username/ComfyUI/execution.py", line 169, in _map_node_over_list
    process_inputs(input_dict, i)
  File "/Users/username/ComfyUI/execution.py", line 158, in process_inputs
    results.append(getattr(obj, func)(**inputs))
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/username/ComfyUI/custom_nodes/ComfyUI-HunyuanVideoWrapper/nodes.py", line 174, in loadmodel
    set_module_tensor_to_device(transformer, name, device=transformer_load_device, dtype=dtype_to_use, value=sd[name])
  File "/Users/username/miniconda3/lib/python3.12/site-packages/accelerate/utils/modeling.py", line 294, in set_module_tensor_to_device
    value = value.to(dtype)
            ^^^^^^^^^^^^^^^
TypeError: Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.
```

Workflow

workflow

kijai commented 11 hours ago

That's right, fp8 is not supported on MPS. I have uploaded a compatible .safetensors file of the original bf16 weights as well, but I have no MPS device to test on or to develop MPS support with myself.