NVIDIA / DeepLearningExamples

State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enterprise-grade infrastructure.

Inference on Tacotron2 and WaveGlow for LJSpeech not working; I trained on a Databricks GPU (NVIDIA EULA) server and get the same error on a custom dataset too; I am trying this for the first time, so I have no idea what is wrong #986

Closed. jazz215 closed this issue 2 years ago.

jazz215 commented 3 years ago

My training script for Tacotron2:

python -m multiproc train.py -m Tacotron2 -o ./output/ -lr 1e-3 --epochs 1 -bs 32 --weight-decay 1e-6 --grad-clip-thresh 1.0 --cudnn-enabled --log-file nvlog.json --anneal-steps 500 1000 1500 --anneal-factor 0.1

My training script for WaveGlow:

python -m multiproc train.py -m WaveGlow -o ./output/ -lr 1e-4 --epochs 1 -bs 4 -wn-channels 256 --segment-length 8000 --weight-decay 0 --grad-clip-thresh 3.4028234663852886e+38 --cudnn-enabled --cudnn-benchmark --log-file nvlog.json

I tried the following inference scripts:

python inference.py --tacotron2 ./output/checkpoint_Tacotron2_last.pt --waveglow ./output/checkpoint_WaveGlow_last.pt --wn-channels 256 -o output/ --include-warmup -i phrases/phrase.txt --fp16

python inference.py --tacotron2 ./output/checkpoint_Tacotron2_0.pt --waveglow ./output/checkpoint_WaveGlow_0.pt --wn-channels 256 -o output/ --include-warmup -i phrases/phrase.txt --fp32

python inference.py --tacotron2 ./output/checkpoint_Tacotron2_0.pt --waveglow ./output/checkpoint_WaveGlow_0.pt -o output/ --include-warmup -i phrases/phrase.txt --fp32

python inference.py --tacotron2 output/checkpoint_Tacotron2_0.pt --waveglow output/checkpoint_WaveGlow_0.pt -o output/ --include-warmup -i phrases/phrase.txt.txt --logfile=output/nvlog_fp32.json

Traceback (most recent call last):
  File "inference.py", line 274, in <module>
    main()
  File "inference.py", line 209, in main
    waveglow = load_and_setup_model('WaveGlow', parser, args.waveglow,
  File "inference.py", line 125, in load_and_setup_model
    model.load_state_dict(state_dict)
  File "/databricks/conda/envs/databricks-ml-gpu/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1406, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for WaveGlow__forward_is_infer:
size mismatch for WN.0.in_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.0.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.0.in_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.0.in_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.0.in_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.0.in_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.0.in_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.0.in_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.0.in_layers.7.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.in_layers.7.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.in_layers.7.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.0.res_skip_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.res_skip_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.res_skip_layers.0.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.0.res_skip_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.res_skip_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.res_skip_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.0.res_skip_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.res_skip_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.res_skip_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.0.res_skip_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.res_skip_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.res_skip_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.0.res_skip_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.res_skip_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.res_skip_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). 
size mismatch for WN.0.res_skip_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.res_skip_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.res_skip_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.0.res_skip_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.res_skip_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.res_skip_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.0.res_skip_layers.7.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]). size mismatch for WN.0.res_skip_layers.7.weight_g: copying a param with shape torch.Size([512, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1, 1]). size mismatch for WN.0.res_skip_layers.7.weight_v: copying a param with shape torch.Size([512, 512, 1]) from checkpoint, the shape in current model is torch.Size([256, 256, 1]). size mismatch for WN.0.cond_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.0.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.0.cond_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.1.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.0.cond_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.2.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.0.cond_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.3.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). 
size mismatch for WN.0.cond_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.4.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.0.cond_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.5.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.0.cond_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.6.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.0.cond_layers.7.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.0.cond_layers.7.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.0.cond_layers.7.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.0.start.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]). size mismatch for WN.0.start.weight_g: copying a param with shape torch.Size([512, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1, 1]). size mismatch for WN.0.start.weight_v: copying a param with shape torch.Size([512, 4, 1]) from checkpoint, the shape in current model is torch.Size([256, 4, 1]). size mismatch for WN.0.end.weight: copying a param with shape torch.Size([8, 512, 1]) from checkpoint, the shape in current model is torch.Size([8, 256, 1]). size mismatch for WN.1.in_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.in_layers.0.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.in_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). 
size mismatch for WN.1.in_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.in_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.in_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.in_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.in_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.in_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.in_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.in_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.in_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.in_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.in_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.in_layers.7.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.in_layers.7.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.in_layers.7.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]). size mismatch for WN.1.res_skip_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.res_skip_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). 
size mismatch for WN.1.res_skip_layers.0.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.1.res_skip_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.res_skip_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.res_skip_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.1.res_skip_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.res_skip_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.res_skip_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.1.res_skip_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.res_skip_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.res_skip_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.1.res_skip_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.res_skip_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.res_skip_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.1.res_skip_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.res_skip_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.res_skip_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.1.res_skip_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.res_skip_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.res_skip_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]). size mismatch for WN.1.res_skip_layers.7.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]). 
size mismatch for WN.1.res_skip_layers.7.weight_g: copying a param with shape torch.Size([512, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1, 1]). size mismatch for WN.1.res_skip_layers.7.weight_v: copying a param with shape torch.Size([512, 512, 1]) from checkpoint, the shape in current model is torch.Size([256, 256, 1]). size mismatch for WN.1.cond_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.cond_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.cond_layers.0.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.1.cond_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.cond_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.cond_layers.1.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.1.cond_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.cond_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.cond_layers.2.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.1.cond_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]). size mismatch for WN.1.cond_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]). size mismatch for WN.1.cond_layers.3.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]). size mismatch for WN.1.cond_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).

WARNING: skipped 105503 bytes of output

size mismatch for WN.10.in_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.10.in_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.in_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.in_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.10.in_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.in_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.in_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.10.in_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.in_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.in_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.10.in_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.in_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.in_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.10.in_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.in_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.in_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.10.in_layers.7.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.in_layers.7.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.in_layers.7.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.10.res_skip_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.res_skip_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.res_skip_layers.0.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.10.res_skip_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.res_skip_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.res_skip_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.10.res_skip_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.res_skip_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.res_skip_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.10.res_skip_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.res_skip_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.res_skip_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.10.res_skip_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.res_skip_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.res_skip_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.10.res_skip_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.res_skip_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.res_skip_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.10.res_skip_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.res_skip_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.res_skip_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.10.res_skip_layers.7.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
size mismatch for WN.10.res_skip_layers.7.weight_g: copying a param with shape torch.Size([512, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1, 1]).
size mismatch for WN.10.res_skip_layers.7.weight_v: copying a param with shape torch.Size([512, 512, 1]) from checkpoint, the shape in current model is torch.Size([256, 256, 1]).
size mismatch for WN.10.cond_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.0.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.cond_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.1.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.cond_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.2.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.cond_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.3.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.cond_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.4.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.cond_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.5.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.cond_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.6.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.cond_layers.7.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.10.cond_layers.7.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.10.cond_layers.7.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.10.start.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
size mismatch for WN.10.start.weight_g: copying a param with shape torch.Size([512, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1, 1]).
size mismatch for WN.10.start.weight_v: copying a param with shape torch.Size([512, 2, 1]) from checkpoint, the shape in current model is torch.Size([256, 2, 1]).
size mismatch for WN.10.end.weight: copying a param with shape torch.Size([4, 512, 1]) from checkpoint, the shape in current model is torch.Size([4, 256, 1]).
size mismatch for WN.11.in_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.0.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.in_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.in_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.in_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.in_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.in_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.in_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.in_layers.7.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.in_layers.7.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.in_layers.7.weight_v: copying a param with shape torch.Size([1024, 512, 3]) from checkpoint, the shape in current model is torch.Size([512, 256, 3]).
size mismatch for WN.11.res_skip_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.res_skip_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.res_skip_layers.0.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.11.res_skip_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.res_skip_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.res_skip_layers.1.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.11.res_skip_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.res_skip_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.res_skip_layers.2.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.11.res_skip_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.res_skip_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.res_skip_layers.3.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.11.res_skip_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.res_skip_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.res_skip_layers.4.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.11.res_skip_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.res_skip_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.res_skip_layers.5.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.11.res_skip_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.res_skip_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.res_skip_layers.6.weight_v: copying a param with shape torch.Size([1024, 512, 1]) from checkpoint, the shape in current model is torch.Size([512, 256, 1]).
size mismatch for WN.11.res_skip_layers.7.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
size mismatch for WN.11.res_skip_layers.7.weight_g: copying a param with shape torch.Size([512, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1, 1]).
size mismatch for WN.11.res_skip_layers.7.weight_v: copying a param with shape torch.Size([512, 512, 1]) from checkpoint, the shape in current model is torch.Size([256, 256, 1]).
size mismatch for WN.11.cond_layers.0.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.0.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.0.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.cond_layers.1.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.1.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.1.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.cond_layers.2.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.2.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.2.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.cond_layers.3.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.3.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.3.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.cond_layers.4.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.4.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.4.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.cond_layers.5.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.5.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.5.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.cond_layers.6.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.6.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.6.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.cond_layers.7.bias: copying a param with shape torch.Size([1024]) from checkpoint, the shape in current model is torch.Size([512]).
size mismatch for WN.11.cond_layers.7.weight_g: copying a param with shape torch.Size([1024, 1, 1]) from checkpoint, the shape in current model is torch.Size([512, 1, 1]).
size mismatch for WN.11.cond_layers.7.weight_v: copying a param with shape torch.Size([1024, 640, 1]) from checkpoint, the shape in current model is torch.Size([512, 640, 1]).
size mismatch for WN.11.start.bias: copying a param with shape torch.Size([512]) from checkpoint, the shape in current model is torch.Size([256]).
size mismatch for WN.11.start.weight_g: copying a param with shape torch.Size([512, 1, 1]) from checkpoint, the shape in current model is torch.Size([256, 1, 1]).
size mismatch for WN.11.start.weight_v: copying a param with shape torch.Size([512, 2, 1]) from checkpoint, the shape in current model is torch.Size([256, 2, 1]).
size mismatch for WN.11.end.weight: copying a param with shape torch.Size([4, 512, 1]) from checkpoint, the shape in current model is torch.Size([4, 256, 1]).

pedrohlopes commented 3 years ago

I'm facing the same issue when trying to run inference from custom-trained models. The problem doesn't happen when running from the pre-trained models, though.

pedrohlopes commented 3 years ago

It seems the default --wn-channels value for WaveGlow training is 512, while the inference command sets it to 256, so the freshly built model becomes incompatible with the saved checkpoint. You can either change the model default to 256 in waveglow/arg_parser.py or run inference with the --wn-channels 512 option. That solved it for me.
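For example, with the checkpoint paths and flags from the original post, the corrected inference call would look something like this (illustrative only, not verified on your setup):

python inference.py --tacotron2 ./output/checkpoint_Tacotron2_last.pt --waveglow ./output/checkpoint_WaveGlow_last.pt --wn-channels 512 -o output/ --include-warmup -i phrases/phrase.txt --fp32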

jazz215 commented 3 years ago

Yeah, that was it. I had to train the WaveGlow part with the same number of channels as in the inference script.
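For anyone who hits this later and is not sure which value a given checkpoint expects, here is a minimal diagnostic sketch. It assumes the checkpoint was written by this repo's train.py and stores the weights under a 'state_dict' key; the parameter name 'WN.0.start.bias' is taken from the error output above.

import torch

# Load the WaveGlow checkpoint on the CPU and pull out its state dict.
ckpt = torch.load('./output/checkpoint_WaveGlow_last.pt', map_location='cpu')
state_dict = ckpt['state_dict'] if isinstance(ckpt, dict) and 'state_dict' in ckpt else ckpt

# 'WN.0.start.bias' has one element per WN channel (compare the shapes in the
# error above: 512 in the checkpoint vs. 256 in the freshly built model), so its
# length is the --wn-channels value the model was trained with.
wn_channels = state_dict['WN.0.start.bias'].shape[0]
print('pass --wn-channels', wn_channels, 'to inference.py for this checkpoint')

If it prints 512, running inference with --wn-channels 512 (or retraining WaveGlow with 256) should make the shapes line up.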
