jwj7140 / Bert-VITS2-Korean

vits2 backbone with multilingual-bert (Korean support)
GNU Affero General Public License v3.0

initial_lr? #3

Open JWWPXX opened 7 months ago

JWWPXX commented 7 months ago

When I run train_ms.py, it fails with: `KeyError: "param 'initial_lr' is not specified in param_groups[0] when resuming an optimizer"`

Can you help me?

```
Loading setting from config: localhost
Loading setting from config: 10086
Loading setting from config: 1
Loading setting from config: 0
Loading setting from config: 0
Loading environment variables
MASTER_ADDR: localhost, MASTER_PORT: 10086, WORLD_SIZE: 1, RANK: 0, LOCAL_RANK: 0
02-03 20:49:07 INFO | data_utils.py:61 | Init dataset...
02-03 20:49:07 INFO | data_utils.py:76 | skipped: 0, total: 368
02-03 20:49:07 INFO | data_utils.py:61 | Init dataset...
02-03 20:49:07 INFO | data_utils.py:76 | skipped: 0, total: 4
Using noise scaled MAS for VITS2
Using duration discriminator for VITS2
100%|██████████| 368/368 [00:00<00:00, 46001.96it/s]
100%|██████████| 4/4 [00:00<?, ?it/s]
Use existed model, skip downloading.
PytorchStreamReader failed reading zip archive: failed finding central directory
Traceback (most recent call last):
  File "D:\AI Learning\Bert-VITS2-Korean(old)\train_ms.py", line 850, in <module>
    run()
  File "D:\AI Learning\Bert-VITS2-Korean(old)\train_ms.py", line 339, in run
    scheduler_g = torch.optim.lr_scheduler.ExponentialLR(
  File "D:\Anaconda3\envs\Bert-VITS2-Korean\lib\site-packages\torch\optim\lr_scheduler.py", line 591, in __init__
    super().__init__(optimizer, last_epoch, verbose)
  File "D:\Anaconda3\envs\Bert-VITS2-Korean\lib\site-packages\torch\optim\lr_scheduler.py", line 43, in __init__
    raise KeyError("param 'initial_lr' is not specified "
KeyError: "param 'initial_lr' is not specified in param_groups[0] when resuming an optimizer"
```

Process finished with exit code 1
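A note on what the traceback shows: `torch.optim.lr_scheduler.ExponentialLR` only demands an `initial_lr` key in each `param_group` when it is constructed with `last_epoch != -1`, i.e. when the training script believes it is resuming. Normally that key is written into the optimizer by a previous scheduler and restored from the checkpoint; here the line `PytorchStreamReader failed reading zip archive: failed finding central directory` suggests the checkpoint file is corrupt or incomplete, so the optimizer state was never actually loaded. The sketch below reproduces the error with a toy model and shows one possible workaround (backfilling `initial_lr` from the current `lr`); the model and hyperparameter values are made up for illustration and are not from train_ms.py.

```python
import torch

# Toy setup, not the project's real model/optimizer.
model = torch.nn.Linear(4, 4)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-4)

# Reproduce the error: last_epoch != -1 means "resuming", which requires
# 'initial_lr' to already be present in every param_group.
try:
    torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.999, last_epoch=10)
except KeyError as err:
    print("Reproduced:", err)

# Workaround sketch: backfill 'initial_lr' before building the scheduler.
# (The cleaner fix is to repair/re-download the checkpoint, or start fresh
# with last_epoch=-1, since the optimizer state was not really restored.)
for group in optimizer.param_groups:
    group.setdefault("initial_lr", group["lr"])

scheduler = torch.optim.lr_scheduler.ExponentialLR(
    optimizer, gamma=0.999, last_epoch=10
)
print("Scheduler built, current lr:", scheduler.get_last_lr())
```

Given the zip-archive error, though, the more likely root cause here is a truncated or corrupted checkpoint download rather than a code bug, so re-fetching the pretrained model and letting training start from `last_epoch=-1` may be the real fix.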