Closed: neutronslime closed this issue 7 months ago.
Sigma is included in the ELLA prompt node, but it still reports the error, just like the code I copied.
Same issue here.

Error occurred when executing ELLATextEncode:

expected np.ndarray (got float)

File "C:\ComfyUI_BLYAT\ComfyUI_windows_portable_nvidia_cu121_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
  output_data, output_ui = get_output_data(obj, input_data_all)
File "C:\ComfyUI_BLYAT\ComfyUI_windows_portable_nvidia_cu121_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
  return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "C:\ComfyUI_BLYAT\ComfyUI_windows_portable_nvidia_cu121_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
  results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "C:\ComfyUI_BLYAT\ComfyUI_windows_portable_nvidia_cu121_or_cpu\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_ELLA\ella.py", line 100, in encode
  cond_ella = ella(cond, timesteps=torch.from_numpy(sigma))
diff --git a/ella.py b/ella.py
index a202c66..c810bd1 100644
--- a/ella.py
+++ b/ella.py
@@ -97,7 +97,7 @@ class ELLATextEncode:
t5: T5TextEmbedder = ella_dict.get("T5")
cond = t5(text)
- cond_ella = ella(cond, timesteps=torch.from_numpy(sigma))
+ cond_ella = ella(cond, timesteps=torch.tensor(sigma))
return ([[cond_ella, {"pooled_output": cond_ella}]], ) # Output twice as we don't use pooled output
Why was this marked as completed? I'm receiving the same error. If it matters, I hastily installed Kijai's more lightweight flan-t5-xl model and manually installed sentencepiece through the embedded Python. Other than that, I think I did everything the standard way. I'm on Windows 10.
expected np.ndarray (got float)
File "H:\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute
output_data, output_ui = get_output_data(obj, input_data_all)
File "H:\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data
return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True)
File "H:\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list
results.append(getattr(obj, func)(**slice_dict(input_data_all, i)))
File "H:\ComfyUI_windows_portable_nvidia_cu118_or_cpu\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_ELLA\ella.py", line 100, in encode
cond_ella = ella(cond, timesteps=torch.from_numpy(sigma))
Update: I didn't realize qfe0's comment was the solution. If anyone else doesn't get it immediately, open ComfyUI\custom_nodes\ComfyUI_ELLA\ella.py and replace
cond_ella = ella(cond, timesteps=torch.from_numpy(sigma))
with:
cond_ella = ella(cond, timesteps=torch.tensor(sigma))
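If you want something that tolerates either input, torch.as_tensor is another option: it accepts a Python float, a NumPy array, or an existing tensor, and avoids a copy where it can. A hedged sketch only; the surrounding names ella, cond, and sigma are taken from the traceback above, not re-checked against the current repo:

```python
# Alternative to torch.tensor(sigma): as_tensor converts floats and
# NumPy arrays alike, and passes existing tensors through unchanged.
cond_ella = ella(cond, timesteps=torch.as_tensor(sigma))
```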
I don't understand why it is working on my machine without this edit.
Perhaps people that get this error are not using the correct nodes, as https://github.com/ExponentialML/ComfyUI_ELLA/issues/22#issuecomment-2057836102 seems to suggest.
I see where the confusion is. When I generate the workflow from scratch:
I think the fix here is to remove the default sigma value from ELLA Text Encode and change the type from FLOAT to SIGMA, both on the ELLA Text Encode input and on the Get Sigma node's return. That would eliminate the confusion about how the nodes are intended to be used.
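For illustration, here is a minimal sketch of what that change might look like, following the usual ComfyUI custom-node conventions. The field names, the "ELLA"/"SIGMA" type strings, and the class layout are assumptions for the sketch, not copied from the repo:

```python
class GetSigma:
    # Returning a dedicated "SIGMA" type instead of FLOAT means this output
    # can only be wired into inputs that declare the same type string.
    RETURN_TYPES = ("SIGMA",)
    FUNCTION = "get_sigma"


class ELLATextEncode:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                "text": ("STRING", {"multiline": True}),
                "ella": ("ELLA",),
                # No FLOAT widget with a default: custom types get no widget in
                # ComfyUI, so the user has to connect the Get Sigma output here.
                "sigma": ("SIGMA",),
            }
        }

    RETURN_TYPES = ("CONDITIONING",)
    FUNCTION = "encode"
```

Because a custom type string has no default widget, leaving sigma unconnected becomes a validation error instead of a silently passed float.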
Error occurred when executing ELLATextEncode:
expected np.ndarray (got float)
File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 151, in recursive_execute output_data, output_ui = get_output_data(obj, input_data_all) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 81, in get_output_data return_values = map_node_over_list(obj, input_data_all, obj.FUNCTION, allow_interrupt=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\execution.py", line 74, in map_node_over_list results.append(getattr(obj, func)(**slice_dict(input_data_all, i))) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File "D:\AI\ComfyUI_windows_portable\ComfyUI\custom_nodes\ComfyUI_ELLA\ella.py", line 100, in encode cond_ella = ella(cond, timesteps=torch.from_numpy(sigma))
there is no option to connect the Get Sigma output.