Where does it say that? 1.5 only supports bf16; black outputs with fp16 are currently expected.
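For reference, a bf16 run outside the wrapper looks roughly like the sketch below, done with diffusers; the pipeline class, repo id, and arguments are assumptions based on the linked model card and the diffusers docs, not something confirmed in this thread:

```python
# Minimal sketch: load CogVideoX 1.5 I2V in bf16 (fp16 is what produces black frames).
# Assumes a diffusers version recent enough to support the 1.5 checkpoints.
import torch
from diffusers import CogVideoXImageToVideoPipeline
from diffusers.utils import export_to_video, load_image

pipe = CogVideoXImageToVideoPipeline.from_pretrained(
    "THUDM/CogVideoX1.5-5B-I2V",
    torch_dtype=torch.bfloat16,  # bf16, not fp16
)
pipe.enable_model_cpu_offload()  # helps on GPUs with limited VRAM

image = load_image("input.png")
frames = pipe(image=image, prompt="a short test clip").frames[0]
export_to_video(frames, "output.mp4", fps=8)
```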
In the model introduction at https://huggingface.co/THUDM/CogVideoX1.5-5B-I2V, the supported inference precisions are listed as the same as for THUDM/CogVideoX-5b-I2V. My 2080 Ti GPU is too old to support bf16, so maybe I should just use fp32 instead.
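As a side note, native bf16 needs an Ampere-or-newer GPU (compute capability 8.0+), and the 2080 Ti is Turing (7.5), so fp32 is indeed the safer fallback. A small hypothetical helper for picking the dtype:

```python
# Sketch of a dtype picker: prefer bf16 on hardware with native support,
# otherwise fall back to fp32 (fp16 overflows with this model and gives black frames).
import torch

def pick_dtype() -> torch.dtype:
    if torch.cuda.is_available():
        major, _ = torch.cuda.get_device_capability()
        if major >= 8:  # Ampere or newer has native bf16
            return torch.bfloat16
    return torch.float32

print(pick_dtype())  # torch.float32 on a 2080 Ti
```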
Probably just copied and pasted from the old model card. The weights are in bf16, and they didn't release fp32 weights. As far as I know the model was trained in bf16, so we can't really even get proper fp16 weights.
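If in doubt, the precision of the released checkpoint can be checked directly by opening one of the downloaded safetensors shards and looking at the tensor dtypes; the path below is only an example, adjust it to the actual filename in your download:

```python
# Inspect the stored dtype of the released weights (expected: torch.bfloat16).
from safetensors import safe_open

path = "transformer/diffusion_pytorch_model-00001-of-00002.safetensors"  # example path
with safe_open(path, framework="pt") as f:
    key = next(iter(f.keys()))
    print(key, f.get_tensor(key).dtype)
```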
Thanks for the reply. I just tried fp32 with fp8_transformer enabled and got a valid result, although the quality is low.
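For anyone curious what an fp8 transformer option does conceptually, the sketch below is not the wrapper's actual implementation, just the general idea of weight-only fp8: store nn.Linear weights in float8_e4m3fn to save VRAM and upcast them to the activation dtype right before each matmul (needs PyTorch 2.1+). A plain cast like this has no per-tensor scaling, which is one reason quality drops:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Fp8WeightLinear(nn.Module):
    """Keeps the weight stored in fp8 and dequantizes it on the fly."""
    def __init__(self, linear: nn.Linear):
        super().__init__()
        self.register_buffer("weight_fp8", linear.weight.data.to(torch.float8_e4m3fn))
        self.bias = linear.bias  # bias stays in its original precision

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # upcast the fp8 weight to the activation dtype (fp32 in the run above)
        return F.linear(x, self.weight_fp8.to(x.dtype), self.bias)

def quantize_linears_to_fp8(module: nn.Module) -> nn.Module:
    """Recursively replace every nn.Linear with the fp8-weight version."""
    for name, child in module.named_children():
        if isinstance(child, nn.Linear):
            setattr(module, name, Fp8WeightLinear(child))
        else:
            quantize_linears_to_fp8(child)
    return module
```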
THUDM/CogVideoX1.5-5B-I2V says the model supports fp16. Since ComfyUI-CogVideoXWrapper currently can't select THUDM/CogVideoX1.5-5B-I2V, I'm using Kijai/CogVideoX-5b-1.5 instead. But when I run the model in fp16, I get a completely black video.