Open jbrea opened 6 years ago
Hi,

The problem stems from the fact that the provided dataset was created with an older version of the program. I'm sorry for that. If you regenerate the dataset and train a new model, it should work. I'll upload a new version of the code with correct datasets and models soon.

Best
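For context on why a dataset pickled by an older code layout breaks like this: pickle stores the dotted module path of every object's class, so once the code no longer has that module (here, a top-level `metadata` module), `pickle.load` raises `ModuleNotFoundError`. The sketch below is not DeepBach's actual code; the `TickMetadata` class name and the remapping target are hypothetical, purely to illustrate the failure mode and one possible workaround.

```python
import io
import pickle
import sys
import types

# Simulate the old code layout: a module named `metadata` containing a class.
# `TickMetadata` is a made-up name for illustration, not DeepBach's class.
old_mod = types.ModuleType("metadata")

class TickMetadata:
    pass

TickMetadata.__module__ = "metadata"  # pretend the class was defined there
old_mod.TickMetadata = TickMetadata
sys.modules["metadata"] = old_mod

# Pickling records the path "metadata.TickMetadata" inside the byte stream.
data = pickle.dumps(TickMetadata())

# Simulate upgrading the code: the `metadata` module no longer exists.
del sys.modules["metadata"]
try:
    pickle.loads(data)
except ModuleNotFoundError as e:
    err_msg = str(e)
print(err_msg)  # No module named 'metadata'

# One possible workaround (if regenerating the dataset were not an option):
# a custom Unpickler that remaps the stale module path to wherever the
# class lives in the current code.
class RenamingUnpickler(pickle.Unpickler):
    def find_class(self, module, name):
        if module == "metadata":
            module = __name__  # new location of the class
        return super().find_class(module, name)

obj = RenamingUnpickler(io.BytesIO(data)).load()
print(type(obj).__name__)  # TickMetadata
```

Regenerating the dataset, as suggested above, sidesteps all of this because the new pickle records the current module paths.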
2018-05-09 18:23 GMT+02:00 jbrea notifications@github.com:
After cloning the current master branch, installing all dependencies, downloading the pretrained models with download_pretrained_data.sh (which btw. throws the error that the target directories are not empty), and moving the data with

```
mv deepbach_ressources/datasets/raw_dataset/bach_dataset.pickle DeepBach/datasets/raw_dataset
```

I get the following error when trying to generate a sample:

```
$ python deepBach.py -l 100 -o output.mid
/usr/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.
  from ._conv import register_converters as _register_converters
Using TensorFlow backend.
Namespace(batch_size_train=128, dataset='', ext='', length=100, midi_file=None, name='deepbach', num_dense=200, num_iterations=20000, num_units_lstm=[200, 200], output_file='output.mid', overwrite=False, parallel=1, reharmonization=None, steps_per_epoch=500, timesteps=16, train=0, validation_steps=20)
Traceback (most recent call last):
  File "deepBach.py", line 183, in <module>
    main()
  File "deepBach.py", line 102, in main
    'rb'))
ModuleNotFoundError: No module named 'metadata'
```

The error does not occur if I remove bach_dataset.pickle again.
— View it on GitHub: https://github.com/Ghadjeres/DeepBach/issues/47
Hi. Thanks for your response. Regenerating the data and training a new model does indeed work.