bmaltais / kohya_ss


Flux.dev - BF16 is set instead of FP16 #2745

Open Samael-1976 opened 2 weeks ago

Samael-1976 commented 2 weeks ago

I'm trying to train a Flux LoRA with my 2060 and its 12 GB of VRAM. Currently it is practically impossible, because even if you choose FP16 during the configuration phase, the launch command still uses bf16:

Executing command: I:\Kohya\kohya_ss\venv\Scripts\accelerate.EXE launch --dynamo_backend no --dynamo_mode default --mixed_precision bf16 --num_processes 1 --num_machines 1 --num_cpu_threads_per_process 2 I:/Kohya/kohya_ss/sd-scripts/flux_train.py --config_file G:/Lora/config_dreambooth-20240827-173150.toml
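For context, the TOML passed via --config_file also carries the training options. A hypothetical excerpt (assuming the GUI writes a mixed_precision key there, as in other kohya_ss configs) shows the value that would need to agree with the --mixed_precision flag accelerate is launched with:

```toml
# Hypothetical excerpt from config_dreambooth-20240827-173150.toml --
# if mixed_precision is stored here, it should match the launch flag
# (fp16 in this case, not bf16).
mixed_precision = "fp16"
```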

JimikoSK commented 1 week ago

Can you make a .bat file, add that command to it, and run it manually?

I:\Kohya\kohya_ss\venv\Scripts\accelerate.EXE launch --dynamo_backend no --dynamo_mode default --mixed_precision fp16 --num_processes 1 --num_machines 1 --num_cpu_threads_per_process 2 I:/Kohya/kohya_ss/sd-scripts/flux_train.py --config_file G:/Lora/config_dreambooth-20240827-173150.toml

That should kick off the training with fp16. Not in the GUI, but it'll still do the same training.
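A minimal sketch of such a .bat file, using the exact paths from this thread (adjust them to your own install):

```bat
@echo off
REM Launch Flux training manually with fp16 mixed precision (bypasses the GUI).
REM Paths below are the ones from this thread; adjust to your install.
I:\Kohya\kohya_ss\venv\Scripts\accelerate.EXE launch ^
 --dynamo_backend no --dynamo_mode default ^
 --mixed_precision fp16 --num_processes 1 --num_machines 1 ^
 --num_cpu_threads_per_process 2 ^
 I:/Kohya/kohya_ss/sd-scripts/flux_train.py ^
 --config_file G:/Lora/config_dreambooth-20240827-173150.toml
pause
```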

Samael-1976 commented 1 week ago

Yes, sure! I hope to do it today, tomorrow at the latest. Thank you!

bmaltais commented 1 week ago

Hmmm... I just tried it, and when I set Mixed precision to fp16, it does use fp16:

[screenshots showing fp16 in use]