hdmjdp closed this issue 5 years ago.
Please share your config.json file.
```
127086: -5.529374123
127087: -5.540636063
127088: -5.553906918
127089: -5.329674721
127090: -5.541586876
127091: -5.408481598
127092: -5.414450645
127093: -5.248669624
127094: -5.427709103
127095: -2.714895010
127096: 95719013234155368078343523008512.000000000
127097: 52885717261894352461863363543040.000000000
127098: 60365312293630392414670281506816.000000000
127099: 49856149157940091750049686487040.000000000
127100: 42010945944132917183711427952640.000000000
127101: 34616195744269071135643738832896.000000000
127102: 44373148310033674922953705259008.000000000
127103: 35036077442085984911344338468864.000000000
127104: 39258381431078595683956526415872.000000000
127105: 26697526108403915554358585982976.000000000
127106: 27712641776321205838292566671360.000000000
127107: 25422699324510453714080826916864.000000000
127108: 29501189498001708200069968166912.000000000
127109: 22853707779312974425246708793344.000000000
127110: 26433441098812378269660720660480.000000000
127111: 24986896073649215272149346942976.000000000
127112: 18961502194291957763417216909312.000000000
127113: 18775145070272543061507085172736.000000000
127114: 14259090241000871618878917050368.000000000
127115: 28707413640546021142474591830016.000000000
127116: 15301785133641030438175469731840.000000000
127117: 20859995674637312575488271581184.000000000
127118: 23445994388265130467552156188672.000000000
127119: 16647481566931941069932854247424.000000000
127120: 12664278926542711655494053789696.000000000
127121: 25767644546472735085758089527296.000000000
127122: 10197990465424773573567924142080.000000000
127123: 19634062686592344797552329097216.000000000
127124: 11341334477176948344804639506432.000000000
127125: 12670222005871937172516909350912.000000000
127126: 9401790711700759184477261922304.000000000
127127: 7546769599365836630214053986304.000000000
127128: 9317671839783243864557444136960.000000000
127129: 10236624107842198278104009408512.000000000
127130: 8901066330937485733920298237952.000000000
127131: 9088880815032445871300538269696.000000000
127132: 10927190341299925599167195381760.000000000
127133: 13174872373234409131318731341824.000000000
127134: 10466855567707067102022577684480.000000000
127135: 5866469700036537972054430842880.000000000
127136: 9870349203181546196582818381824.000000000
127137: 5678254457032375585102775713792.000000000
127138: 11480977498600634160774949896192.000000000
127139: 8249163358579763675807977832448.000000000
127140: 8682212071961010185164889784320.000000000
127141: 6736010751996203261972405288960.000000000
127142: 6138184240475663181473866842112.000000000
127143: 8614818688757863259877131943936.000000000
127144: 5044054389914180349310265196544.000000000
127145: 5770338336712421889340145139712.000000000
127146: 9085351960564990768739570941952.000000000
127147: 5530654285863986279904549273600.000000000
127148: 5280103805285944576132106944512.000000000
127149: 6407647156575756144055412588544.000000000
127150: 5080223032585410824959124570112.000000000
127151: 6670546814401161284847478505472.000000000
127152: 5618088636841794720335998746624.000000000
127153: 6185713159073812327477440151552.000000000
127154: 4928174618863749491712566755328.000000000
127155: 8882835729577697125965729103872.000000000
127156: 4942385541873319457661237854208.000000000
127157: 9100974304468960814249711501312.000000000
127158: 5209860983925965969879566647296.000000000
127159: 8056382607217826641778044829696.000000000
127160: 9182347705850131311313771560960.000000000
127161: 5500377947637557506853207801856.000000000
```
The loss explodes.
@rafaelvalle

```json
{
    "train_config": {
        "output_directory": "checkpoints",
        "epochs": 100000,
        "learning_rate": 1e-4,
        "sigma": 1.0,
        "iters_per_checkpoint": 2000,
        "batch_size": 1,
        "seed": 1234,
        "checkpoint_path": ""
    },
    "data_config": {
        "wav_dataroot": "/home/hdm/Documents/tts/data/wav/hb_cen_lily_sent-24K",
        "mel_dataroot": "/home/hdm/Documents/wavenet/msc",
        "segment_length": 18000,
        "sampling_rate": 24000,
        "filter_length": 2048,
        "hop_length": 120,
        "win_length": 2048,
        "mel_fmin": 0.0,
        "mel_fmax": 8000.0
    },
    "dist_config": {
        "dist_backend": "nccl",
        "dist_url": "tcp://localhost:54321"
    },
    "waveglow_config": {
        "n_mel_channels": 80,
        "n_flows": 12,
        "n_group": 8,
        "n_early_every": 4,
        "n_early_size": 2,
        "WN_config": {
            "n_layers": 8,
            "n_channels": 512,
            "kernel_size": 3
        }
    }
}
```
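As a quick sanity check of the `data_config` above (a sketch, not from the thread): a `segment_length` of 18000 samples at a 24 kHz `sampling_rate` is 0.75 s of audio per training segment, and it divides evenly by `hop_length` = 120, giving 150 mel frames per segment.

```python
# Sanity-check the posted data_config values (illustrative only).
segment_length = 18000
sampling_rate = 24000
hop_length = 120

seconds_per_segment = segment_length / sampling_rate   # 0.75 s of audio
assert segment_length % hop_length == 0                # segment aligns with mel hops
frames_per_segment = segment_length // hop_length      # 150 mel frames

print(seconds_per_segment, frames_per_segment)         # 0.75 150
```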
The loss could have such a spike if your dataset has an audio sample that is all "silence".
@maozhiqiang let's address https://github.com/NVIDIA/waveglow/issues/38 here as well.
@rafaelvalle Thanks. Do you mean that an 18000-sample segment could be all "silence" (my seqlen = 18000)?
I mean there can be an audio slice that has only silence.
Closing due to inactivity.
```
step 78540: loss -4.065555573
step 78560: loss -4.490265846
step 78580: loss -4.454271317
step 78600: loss -4.322495461
step 78620: loss 4191.683593750
step 78640: loss -2.408614397
step 78660: loss -2.908127546
```