bmaltais / kohya_ss

Apache License 2.0

Training Lora always using Standard Lora #2326

Closed rushuna86 closed 6 months ago

rushuna86 commented 6 months ago

When using the GUI to change the LoRA type, even when selecting other types like LoCon and adding conv values, the toml always comes out as a standard LoRA (networks.lora).

bmaltais commented 6 months ago

Let me take a look at this... it's not supposed to be like that...

bmaltais commented 6 months ago

I just tested on my system and it does work as expected... Are you sure you are using the "Lora" tab and not the "Dreambooth" tab?

rushuna86 commented 6 months ago

100% Lora Tab. Screenshot 2024-04-19 110533

```toml
bucket_no_upscale = true
bucket_reso_steps = 64
cache_latents = true
cache_latents_to_disk = true
caption_dropout_every_n_epochs = 0
caption_dropout_rate = 0
caption_extension = ".txt"
clip_skip = 2
debiased_estimation_loss = true
dynamo_backend = "no"
enable_bucket = true
epoch = 12
gradient_accumulation_steps = 1
huber_c = 0.1
huber_schedule = "snr"
keep_tokens = 1
learning_rate = 1.0
logging_dir = "D:/Training/log"
loss_type = "l2"
lr_scheduler = "cosine"
lr_scheduler_args = []
lr_scheduler_num_cycles = 1
lr_scheduler_power = 1
lr_warmup_steps = 0
max_bucket_reso = 2048
max_data_loader_n_workers = 0
max_grad_norm = 1
max_timestep = 1000
max_token_length = 225
max_train_steps = 1960
min_bucket_reso = 512
min_snr_gamma = 5
mixed_precision = "bf16"
multires_noise_discount = 0.3
network_alpha = 128
network_args = []
network_dim = 128
network_dropout = 0
network_module = "networks.lora"
noise_offset_type = "Multires"
optimizer_type = "DAdaptAdam"
optimizer_args = [ "decouple=True", "weight_decay=0.1", "betas=0.9,0.99", "use_bias_correction=True", "growth_rate=1.02",]
output_dir = "D:/Training/model"
output_name = "HildaPkm_V2"
pretrained_model_name_or_path = "D:/SD/WebUI/models/Stable-diffusion/nai-pruned.ckpt"
prior_loss_weight = 1
resolution = "768,768"
sample_every_n_epochs = 1
sample_prompts = "D:/Training/model\prompt.txt"
sample_sampler = "euler"
save_every_n_epochs = 1
save_model_as = "safetensors"
save_precision = "bf16"
scale_weight_norms = 0
sdpa = true
seed = 31337
shuffle_caption = true
text_encoder_lr = 1
train_batch_size = 3
train_data_dir = "D:/Training/images"
training_comment = "nai-dadpt"
unet_lr = 1
```

The output file also comes out the same size as a standard LoRA, without conv layers: 147,577 KB. With a network dim of 128 and a conv dim of 8, the output is normally 176,833 KB.
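For reference, a Kohya LoCon uses the same `networks.lora` module but passes the conv dimensions through `network_args`. In the config above, `network_args` is an empty list, which is why the result is a plain LoRA. A sketch of what the relevant lines would be expected to look like for conv dim 8 (the `conv_alpha` value here is an assumption, not taken from this config):

```toml
network_module = "networks.lora"
network_args = [ "conv_dim=8", "conv_alpha=1",]
```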

bmaltais commented 6 months ago

Can you share the .json config file? I need to find what in the json does not get properly put into the toml.

rushuna86 commented 6 months ago

Testing_V1_20240419-112440.json

bmaltais commented 6 months ago

OK... let me check

bmaltais commented 6 months ago

> When using the GUI to change the LoRA type, even when selecting other types like LoCon and adding conv values, the toml always comes out as a standard LoRA (networks.lora).

What is the network module supposed to be? Maybe the GUI never set it up properly. You are setting it as a Kohya LoCon, so for that the network module should normally be networks.lora. Are you trying to train a LyCORIS LoCon instead?
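The distinction matters here: a Kohya LoCon keeps `network_module = "networks.lora"`, while a LyCORIS LoCon swaps in the LyCORIS module and selects the algorithm via `network_args`. A rough sketch of the LyCORIS variant, assuming the `lycoris` package is installed (the dim/alpha values are placeholders):

```toml
network_module = "lycoris.kohya"
network_args = [ "algo=locon", "conv_dim=8", "conv_alpha=1",]
```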

rushuna86 commented 6 months ago

I'm setting it as a Kohya LoCon, but it's ignoring the conv dim settings.

bmaltais commented 6 months ago

Oh... the network args are empty... this is why... let me see if I can fix that quickly

rushuna86 commented 6 months ago

Thank you, just updated to v24.0.4 and it's working as intended.