chrisgoringe / cg-mixed-casting


Got unsupported ScalarType Float8_e4m3fn #7

Closed by itswhateverman 2 months ago

itswhateverman commented 2 months ago

I assume the following error indicates we can't start with an fp8 model and need fp16? Could the node upcast first to a usable format? I know it may not be ideal, but it may still have utility.


Error Details

itswhateverman commented 2 months ago

the above replies are malware spam fyi

edit: they've been removed

chrisgoringe commented 2 months ago

Thanks for the report.

Try an update (git pull in the cg-mixed-casting directory) and see if it works now.

itswhateverman commented 2 months ago

After update, now errors on BFloat16:

I'm trying to load custom models such as jibMixFlux_v10 (fp8). I have this issue with the other custom models I've tried. I don't experience it with fp16 models, which quantize properly.

Mixed Cast Flux Loader

Got unsupported ScalarType BFloat16

ComfyUI Error Report

Error Details

chrisgoringe commented 2 months ago

If you go to the file mixed_gguf_node.py and find line 133 (the one shown in the error), change bfloat16 to float and it should work.

That change will be in the next push...

chrisgoringe commented 2 months ago

Should be fixed in the new version.

itswhateverman commented 2 months ago

yep thanks