bmaltais / kohya_ss

Apache License 2.0

subprocess.CalledProcessError: Command '['C:\\Users\\18078\\AppData\\Local\\Programs\\Python\\Python310\\python.exe', 'D:/kohya/kohya_ss/sd-scripts/train_network.py', '--config_file', 'D:\\kohya\\kohya_ss\\dataset\\huangquan\\output/config_lora-20240924-115523.toml']' returned non-zero exit status 1. #2852

Open AlexiosDyral opened 2 months ago

AlexiosDyral commented 2 months ago
[Screenshot attached: 微信图片_20240912213731 (WeChat image)]
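
For context: the CalledProcessError in the title is raised by the GUI's launcher, not by the training code itself. Below is a simplified sketch of what happens (my illustration, not the actual kohya_ss source); the real failure message from train_network.py is printed earlier in the console, above this traceback.

```python
# Simplified illustration (not the actual kohya_ss code): the GUI starts the
# trainer as a child process. With check=True, any non-zero exit status of
# train_network.py is re-raised as subprocess.CalledProcessError, so this
# traceback only says that the trainer failed; the root cause is logged above it.
import subprocess
import sys

subprocess.run(
    [sys.executable, "sd-scripts/train_network.py", "--config_file", "config.toml"],
    check=True,
)
```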
AlexiosDyral commented 2 months ago

```toml
bucket_no_upscale = true
bucket_reso_steps = 1
cache_latents = true
cache_latents_to_disk = true
caption_extension = ".none-use-foldername"
clip_skip = 1
dynamo_backend = "no"
enable_bucket = true
epoch = 4
gradient_accumulation_steps = 4
huber_c = 0.1
huber_schedule = "snr"
learning_rate = 1.0
logging_dir = "D:\kohya\kohya_ss\dataset\huangquan\log"
loss_type = "l2"
lr_scheduler = "cosine"
lr_scheduler_args = []
lr_scheduler_num_cycles = 1
lr_scheduler_power = 1
max_bucket_reso = 2048
max_data_loader_n_workers = 0
max_grad_norm = 1
max_timestep = 1000
max_token_length = 75
max_train_steps = 1600
min_bucket_reso = 256
min_snr_gamma = 10
mixed_precision = "bf16"
multires_noise_discount = 0.2
multires_noise_iterations = 8
network_alpha = 64
network_args = [ "preset=full", "conv_dim=64", "conv_alpha=64", "rank_dropout=0", "bypass_mode=False", "dora_wd=False", "module_dropout=0", "factor=-1", "use_cp=False", "use_scalar=False", "decompose_both=False", "rank_dropout_scale=False", "algo=lokr", "train_norm=False",]
network_dim = 64
network_module = "lycoris.kohya"
noise_offset_type = "Multires"
optimizer_args = []
optimizer_type = "Prodigy"
output_dir = "D:\kohya\kohya_ss\dataset\huangquan\output"
output_name = "last"
pretrained_model_name_or_path = "runwayml/stable-diffusion-v1-5"
prior_loss_weight = 1
resolution = "512,512"
sample_prompts = "D:\kohya\kohya_ss\dataset\huangquan\output\prompt.txt"
sample_sampler = "euler_a"
save_every_n_epochs = 1
save_model_as = "safetensors"
save_precision = "fp16"
text_encoder_lr = 1
train_batch_size = 1
train_data_dir = "D:\kohya\kohya_ss\dataset\huangquan\image"
unet_lr = 1
xformers = true
```
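
A quick way to rule out bad paths in a config like this (my own sketch, not part of kohya_ss): load the generated TOML and check that every file or directory it references actually exists. tomllib ships with Python 3.11+; on Python 3.10 install and import the tomli package instead.

```python
# Sketch only: verify the paths referenced by the generated training config.
import os
import tomllib  # Python 3.11+; on 3.10: pip install tomli, then "import tomli as tomllib"

CONFIG = r"D:\kohya\kohya_ss\dataset\huangquan\output\config_lora-20240924-115523.toml"

with open(CONFIG, "rb") as f:
    cfg = tomllib.load(f)

for key in ("logging_dir", "output_dir", "train_data_dir", "sample_prompts"):
    path = cfg.get(key, "")
    status = "OK" if path and os.path.exists(path) else "MISSING"
    print(f"{key}: {path} -> {status}")
```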

AlexiosDyral commented 2 months ago

{ "LoRA_type": "LyCORIS/LoKr", "LyCORIS_preset": "full", "adaptive_noise_scale": 0, "additional_parameters": "", "async_upload": false, "block_alphas": "", "block_dims": "", "block_lr_zero_threshold": "", "bucket_no_upscale": true, "bucket_reso_steps": 1, "bypass_mode": false, "cache_latents": true, "cache_latents_to_disk": true, "caption_dropout_every_n_epochs": 0, "caption_dropout_rate": 0, "caption_extension": ".none-use-foldername", "clip_skip": 1, "color_aug": false, "constrain": 0, "conv_alpha": 64, "conv_block_alphas": "", "conv_block_dims": "", "conv_dim": 64, "dataset_config": "", "debiased_estimation_loss": false, "decompose_both": false, "dim_from_weights": false, "dora_wd": false, "down_lr_weight": "", "dynamo_backend": "no", "dynamo_mode": "default", "dynamo_use_dynamic": false, "dynamo_use_fullgraph": false, "enable_bucket": true, "epoch": 4, "extra_accelerate_launch_args": "", "factor": -1, "flip_aug": false, "fp8_base": false, "full_bf16": false, "full_fp16": false, "gpu_ids": "", "gradient_accumulation_steps": 4, "gradient_checkpointing": false, "huber_c": 0.1, "huber_schedule": "snr", "huggingface_path_in_repo": "", "huggingface_repo_id": "", "huggingface_repo_type": "", "huggingface_repo_visibility": "", "huggingface_token": "", "ip_noise_gamma": 0, "ip_noise_gamma_random_strength": false, "keep_tokens": 0, "learning_rate": 1, "log_tracker_config": "", "log_tracker_name": "", "log_with": "", "logging_dir": "D:\kohya\kohya_ss\dataset\huangquan\log", "loss_type": "l2", "lr_scheduler": "cosine", "lr_scheduler_args": "", "lr_scheduler_num_cycles": 1, "lr_scheduler_power": 1, "lr_warmup": 0, "main_process_port": 0, "masked_loss": false, "max_bucket_reso": 2048, "max_data_loader_n_workers": 0, "max_grad_norm": 1, "max_resolution": "512,512", "max_timestep": 1000, "max_token_length": 75, "max_train_epochs": 0, "max_train_steps": 1600, "mem_eff_attn": false, "metadata_author": "", "metadata_description": "", "metadata_license": "", "metadata_tags": "", "metadata_title": "", "mid_lr_weight": "", "min_bucket_reso": 256, "min_snr_gamma": 10, "min_timestep": 0, "mixed_precision": "bf16", "model_list": "custom", "module_dropout": 0, "multi_gpu": false, "multires_noise_discount": 0.2, "multires_noise_iterations": 8, "network_alpha": 64, "network_dim": 64, "network_dropout": 0, "network_weights": "", "noise_offset": 0, "noise_offset_random_strength": false, "noise_offset_type": "Multires", "num_cpu_threads_per_process": 2, "num_machines": 1, "num_processes": 1, "optimizer": "Prodigy", "optimizer_args": "", "output_dir": "D:\kohya\kohya_ss\dataset\huangquan\output", "output_name": "last", "persistent_data_loader_workers": false, "pretrained_model_name_or_path": "runwayml/stable-diffusion-v1-5", "prior_loss_weight": 1, "random_crop": false, "rank_dropout": 0, "rank_dropout_scale": false, "reg_data_dir": "", "rescaled": false, "resume": "", "resume_from_huggingface": "", "sample_every_n_epochs": 0, "sample_every_n_steps": 0, "sample_prompts": "", "sample_sampler": "euler_a", "save_every_n_epochs": 1, "save_every_n_steps": 0, "save_last_n_steps": 0, "save_last_n_steps_state": 0, "save_model_as": "safetensors", "save_precision": "fp16", "save_state": false, "save_state_on_train_end": false, "save_state_to_huggingface": false, "scale_v_pred_loss_like_noise_pred": false, "scale_weight_norms": 0, "sdxl": false, "sdxl_cache_text_encoder_outputs": false, "sdxl_no_half_vae": false, "seed": 0, "shuffle_caption": false, "stop_text_encoder_training_pct": 0, "text_encoder_lr": 1, "train_batch_size": 
1, "train_data_dir": "D:\kohya\kohya_ss\dataset\huangquan\image", "train_norm": false, "train_on_input": false, "training_comment": "", "unet_lr": 1, "unit": 1, "up_lr_weight": "", "use_cp": false, "use_scalar": false, "use_tucker": false, "v2": false, "v_parameterization": false, "v_pred_like_loss": 0, "vae": "", "vae_batch_size": 0, "wandb_api_key": "", "wandb_run_name": "", "weighted_captions": false, "xformers": "xformers" }

AlexiosDyral commented 2 months ago

No matter how I run it, I get the same error. What should I do?

Lecho303 commented 2 months ago

optimizer_type = "Prodigy"

Try changing optimizer_type = "Prodigy" to "AdamW8bit" (8-bit AdamW).

Also, for the pretrained model, don't use the runwayml Hub model; use a local model instead, placed in the folder C:\Users\XXX\kohya_ss\models.

That may fix it; it fixed the same problem for me once.
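
If you try this, two things are worth checking up front (a hedged sketch; the checkpoint filename below is only an example, substitute whatever local SD 1.5 file you actually downloaded): the 8-bit AdamW optimizer needs a working bitsandbytes install, and the local checkpoint path entered in the GUI must actually exist.

```python
# Sanity-check sketch (not a kohya_ss tool). The checkpoint path is an example.
import os

LOCAL_MODEL = r"C:\Users\XXX\kohya_ss\models\v1-5-pruned-emaonly.safetensors"
print("local model found:", os.path.isfile(LOCAL_MODEL))

try:
    import bitsandbytes  # needed by the 8-bit AdamW optimizer
    print("bitsandbytes version:", bitsandbytes.__version__)
except Exception as exc:  # an import failure here usually means 8-bit AdamW will fail too
    print("bitsandbytes not usable:", exc)
```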

AlexiosDyral commented 1 month ago

I tried that, but it did not work.

AlexiosDyral commented 1 month ago

OSError: Can't load tokenizer for 'openai/clip-vit-large-patch14'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'openai/clip-vit-large-patch14' is the correct path to a directory containing all relevant files for a CLIPTokenizer tokenizer.
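
This OSError usually means train_network.py could not fetch the CLIP tokenizer from Hugging Face (no network access, a proxy, or a broken cache). One possible workaround, as a hedged sketch assuming the transformers package from the kohya_ss venv: pre-download the tokenizer once on a machine that can reach huggingface.co so it lands in the local cache, or save a copy you can point to later.

```python
# Hedged workaround sketch: cache the CLIP tokenizer locally so later runs of
# train_network.py can load it without contacting huggingface.co.
from transformers import CLIPTokenizer

tok = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")

# Optionally save a standalone copy (example path) that can be copied between machines.
tok.save_pretrained(r"D:\kohya\models\clip-vit-large-patch14")
```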

DavidYang347 commented 1 day ago

Did you solve this problem? I get the same error on Ubuntu, but the scripts run successfully on Windows 10. Thank you for your answer.

AlexiosDyral commented 14 hours ago

When I use a local model, it runs successfully.
