bfs18 / nsynth_wavenet

parallel wavenet based on nsynth

Error when training wavenet #23

Closed wada-s closed 6 years ago

wada-s commented 6 years ago

```
python3 -u train_wavenet.py \
  --config config_jsons/wavenet_mol.json \
  --train_path data/train/TFR \
  --log_root logdir \
```

My setup is TensorFlow 1.8.0, CUDA 9.0, Python 3.5.

I'm getting this error:

```
WARNING:tensorflow:From /work/wada/nsynth_wavenet/wavenet/masked.py:420: UniformUnitScaling.__init__ (from tensorflow.python.ops.init_ops) is deprecated and will be removed in a future version.
Instructions for updating:
Use tf.initializers.variance_scaling instead with distribution=uniform to get equivalent behavior.
INFO:tensorflow:using config form config_jsons/wavenet_mol.json
INFO:tensorflow:Saving to logdir/ns_wn-n_MU-WN-TS-n_IN-n_DO-tanh-MOL-09_12
WARNING:tensorflow:From /work/wada/nsynth_wavenet/wavenet/masked.py:127: calling reduce_sum (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead
WARNING:tensorflow:From /work/wada/nsynth_wavenet/wavenet/loss_func.py:10: calling reduce_max (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead
Traceback (most recent call last):
  File "train_wavenet.py", line 188, in <module>
    train(args)
  File "train_wavenet.py", line 153, in train
    data_dep_init_fn = _data_dep_init()
  File "train_wavenet.py", line 60, in _data_dep_init
    wn.train_path, batch_size=args.total_batch_size, seq_len=wn.wave_length)
  File "/work/wada/nsynth_wavenet/auxilaries/reader.py", line 123, in get_init_batch
    first_n_serialized_example.append(serialized_examples.next())
StopIteration
```
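The `StopIteration` means the record iterator in `get_init_batch` ran out of serialized examples before a full init batch was collected, so either `data/train/TFR` points to a missing or empty TFRecord file, or `total_batch_size` asks for more examples than the file contains. A minimal sketch of a guard that surfaces this as a readable error (the helper `first_n` is hypothetical, not part of the repo; in `reader.py` the iterable would presumably be `tf.python_io.tf_record_iterator(train_path)`):

```python
import itertools

def first_n(serialized_examples, n):
    """Take up to n items from an iterator without raising StopIteration.

    Mirrors the loop in get_init_batch that appends the first n serialized
    examples, but fails with a descriptive error when the source runs dry
    (the failure mode in the traceback above).
    """
    items = list(itertools.islice(serialized_examples, n))
    if len(items) < n:
        raise ValueError(
            "only %d serialized examples found, but %d were requested -- "
            "check that the TFRecord path exists and is non-empty" % (len(items), n))
    return items
```

With a check like this, an empty or too-small TFRecord file would produce an explicit message instead of a bare `StopIteration` out of `reader.py`.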