Linaqruf / kohya-trainer

Adapted from https://note.com/kohya_ss/n/nbf7ce8d80f29 for easier cloning
Apache License 2.0

bf16 mixed precision requires PyTorch >= 1.10 and a supported device. #246

Open KingWu opened 1 year ago

KingWu commented 1 year ago

save_precision = "bf16"
mixed_precision = "bf16"

Setting these two parameters to bf16 throws the following error:

bf16 mixed precision requires PyTorch >= 1.10 and a supported device.

Any idea?

SlZeroth commented 1 year ago

same

Linaqruf commented 1 year ago

T4 and V100 don't support that precision; please use fp16, or "no" (fp32).
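
A sketch of how you can probe bf16 support at runtime before choosing a `mixed_precision` value, assuming PyTorch is installed (`torch.cuda.is_bf16_supported()` is available in recent PyTorch releases; older versions may lack it). The `pick_mixed_precision` helper name is hypothetical, not part of kohya-trainer:

```python
def pick_mixed_precision():
    """Return a mixed_precision setting the current environment can handle.

    bf16 needs PyTorch >= 1.10 plus a supported device (Ampere or newer
    on NVIDIA GPUs); T4 (Turing) and V100 (Volta) do not support it.
    """
    try:
        import torch
    except ImportError:
        return "no"  # PyTorch not available; fall back to full precision
    if torch.cuda.is_available():
        # Guard the call for older PyTorch versions that lack this helper
        bf16_ok = getattr(torch.cuda, "is_bf16_supported", lambda: False)()
        return "bf16" if bf16_ok else "fp16"
    return "no"  # CPU-only: use fp32 ("no" in kohya-trainer's config)

print(pick_mixed_precision())
```

On a T4 or V100 this would return "fp16", matching the advice above.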