Closed annaklyueva closed 2 years ago
Please try the latest version.
@erogol I was following this pipeline (https://github.com/Edresson/YourTTS/issues/8). It mentions that for training and fine-tuning we have to use this branch: https://github.com/Edresson/Coqui-TTS/tree/multilingual-torchaudio-SE/
If that is not correct, could you please clarify which steps I should follow?
I had the same issue, but with Tacotron2 models.
I don't see any errors in the logs. Without an error message or a traceback it is hard for us to help. Can you try to gather more information about how the training actually ends?
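If the process is dying silently (e.g. an OOM kill or a crash inside a data-loader worker), enabling Python's standard-library `faulthandler` before launching the trainer can at least surface a traceback on a hard crash. A minimal sketch — the commented-out `TTS.bin.train_tts` launch line is only an illustration of where training would start, not a verified invocation:

```python
import faulthandler
import sys

# Dump a Python traceback for every thread if the interpreter dies on a
# fatal signal (SIGSEGV, SIGABRT, ...) instead of exiting silently.
faulthandler.enable(file=sys.stderr, all_threads=True)

# ...launch training here, e.g. (illustrative, check your entry point):
# from TTS.bin.train_tts import main
# main()

print(faulthandler.is_enabled())  # True once enabled
```

Running the training script under `python -u ... 2>&1 | tee train.log` alongside this also preserves any last output the process writes before it stops.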
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions. You might also check our discussion channels.
Describe the bug
Good day!
I'm trying to train a YourTTS model. It seems like I've done everything correctly; however, my training stopped after the first 656 steps. What might be the problem?
To Reproduce
Expected behavior
The training should last for 1000 epochs, but it ran for only 3 epochs and then stopped.
Logs
Environment
Additional context
```json
{ "model": "vits", "run_name": "vits_tts-rus", "run_description": "", "epochs": 1000, "batch_size": 32, "eval_batch_size": 32, "mixed_precision": false, "scheduler_after_epoch": true, "run_eval": true, "test_delay_epochs": -1, "print_eval": true, "dashboard_logger": "tensorboard", "print_step": 1, "plot_step": 100, "model_param_stats": false, "project_name": null, "log_model_step": 10, "wandb_entity": null, "save_step": 20, "checkpoint": true, "keep_all_best": false, "keep_after": 20, "num_loader_workers": 4, "num_eval_loader_workers": 4, "use_noise_augment": false, "use_language_weighted_sampler": true, "output_path": "YourTTS_ru", "distributed_backend": "nccl", "distributed_url": "tcp://localhost:54321", "audio": { "fft_size": 1024, "win_length": 1024, "hop_length": 256, "frame_shift_ms": null, "frame_length_ms": null, "stft_pad_mode": "reflect", "sample_rate": 16000, "resample": true, "preemphasis": 0.0, "ref_level_db": 20, "do_sound_norm": true, "log_func": "np.log", "do_trim_silence": true, "trim_db": 45, "power": 1.5, "griffin_lim_iters": 60, "num_mels": 80, "mel_fmin": 0.0, "mel_fmax": null, "spec_gain": 1, "do_amp_to_db_linear": false, "do_amp_to_db_mel": true, "signal_norm": false, "min_level_db": -100, "symmetric_norm": true, "max_norm": 4.0, "clip_norm": true, "stats_path": null }, "use_phonemes": false, "use_espeak_phonemes": false, "phoneme_language": "pt-br", "compute_input_seq_cache": false, "text_cleaner": "multilingual_cleaners", "enable_eos_bos_chars": false, "test_sentences_file": "", "phoneme_cachepath": null, "characters": { "pad": "", "eos": "&", "bos": "*", "characters": 
"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyzЁЙЦУКЕНГШЩЗХЪФЫВАПРОЛДЖЭЯЧСМИТЬБЮёйцукенгшщзхфывапролджэъячсмитьбю«»–—\u00af\u00b7\u00df\u00e0\u00e1\u00e2\u00e3\u00e4\u00e6\u00e7\u00e8\u00e9\u00ea\u00eb\u00ec\u00ed\u00ee\u00ef\u00f1\u00f2\u00f3\u00f4\u00f5\u00f6\u00f9\u00fa\u00fb\u00fc\u00ff\u0101\u0105\u0107\u0113\u0119\u011b\u012b\u0131\u0142\u0144\u014d\u0151\u0153\u015b\u016b\u0171\u017a\u017c\u01ce\u01d0\u01d2\u01d4\u0430\u0431\u0432\u0433\u0434\u0435\u0436\u0437\u0438\u0439\u043a\u043b\u043c\u043d\u043e\u043f\u0440\u0441\u0442\u0443\u0444\u0445\u0446\u0447\u0448\u0449\u044a\u044b\u044c\u044d\u044e\u044f\u0451\u0454\u0456\u0457\u0491\u2013!'(),-.:;? ", "punctuations": "!'(),-.:;? «»–—", "phonemes": "iy\u0268\u0289\u026fu\u026a\u028f\u028ae\u00f8\u0258\u0259\u0275\u0264o\u025b\u0153\u025c\u025e\u028c\u0254\u00e6\u0250a\u0276\u0251\u0252\u1d7b\u0298\u0253\u01c0\u0257\u01c3\u0284\u01c2\u0260\u01c1\u029bpbtd\u0288\u0256c\u025fk\u0261q\u0262\u0294\u0274\u014b\u0272\u0273n\u0271m\u0299r\u0280\u2c71\u027e\u027d\u0278\u03b2fv\u03b8\u00f0sz\u0283\u0292\u0282\u0290\u00e7\u029dx\u0263\u03c7\u0281\u0127\u0295h\u0266\u026c\u026e\u028b\u0279\u027bj\u0270l\u026d\u028e\u029f\u02c8\u02cc\u02d0\u02d1\u028dw\u0265\u029c\u02a2\u02a1\u0255\u0291\u027a\u0267\u025a\u02de\u026b'\u0303' ", "unique": true }, "batch_group_size": 0, "loss_masking": null, "min_seq_len": 90, "max_seq_len": 270, "compute_f0": false, "compute_linear_spec": true, "add_blank": true, "datasets": [
}
```
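As a quick sanity check on a config like the one above, loading it and asserting the fields that control run length can rule out a simple misconfiguration before launching. This is only a sketch over a hand-copied subset of the reported values, not a Coqui TTS API; note in particular that a tight `min_seq_len`/`max_seq_len` window can filter out most utterances and shrink an epoch to very few steps:

```python
import json

# Minimal subset of the reported config, standing in for the real config.json.
cfg = json.loads("""
{
  "epochs": 1000,
  "batch_size": 32,
  "min_seq_len": 90,
  "max_seq_len": 270,
  "audio": {"sample_rate": 16000}
}
""")

# Basic sanity checks on the run-length-related fields.
assert cfg["epochs"] == 1000
assert 0 < cfg["min_seq_len"] < cfg["max_seq_len"]
assert cfg["audio"]["sample_rate"] == 16000
print("config looks sane")
```

In practice one would point `json.load` at the actual `config.json` and also count how many dataset samples fall inside the `[min_seq_len, max_seq_len]` window.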