Skuldur / Classical-Piano-Composer

MIT License

RuntimeError: Unable to create attribute (Object header message is too large) #8

Open MoffD opened 5 years ago

MoffD commented 5 years ago

Running the files unmodified in Python 3.6.4 on windows I get the following trace:

Traceback (most recent call last):
  File "lstm.py", line 122, in <module>
    train_network()
  File "lstm.py", line 26, in train_network
    train(model, network_input, network_output)
  File "lstm.py", line 119, in train
    model.fit(network_input, network_output, epochs=200, batch_size=64, callbacks=callbacks_list)
  File "D:\Programs\Python\Python36\lib\site-packages\keras\engine\training.py", line 1035, in fit
    validation_steps=validation_steps)
  File "D:\Programs\Python\Python36\lib\site-packages\keras\engine\training_arrays.py", line 217, in fit_loop
    callbacks.on_epoch_end(epoch, epoch_logs)
  File "D:\Programs\Python\Python36\lib\site-packages\keras\callbacks.py", line 79, in on_epoch_end
    callback.on_epoch_end(epoch, logs)
  File "D:\Programs\Python\Python36\lib\site-packages\keras\callbacks.py", line 446, in on_epoch_end
    self.model.save(filepath, overwrite=True)
  File "D:\Programs\Python\Python36\lib\site-packages\keras\engine\network.py", line 1081, in save
    save_model(self, filepath, overwrite, include_optimizer)
  File "D:\Programs\Python\Python36\lib\site-packages\keras\engine\saving.py", line 381, in save_model
    _serialize_model(model, f, include_optimizer)
  File "D:\Programs\Python\Python36\lib\site-packages\keras\engine\saving.py", line 113, in _serialize_model
    layer_group[name] = val
  File "D:\Programs\Python\Python36\lib\site-packages\keras\utils\io_utils.py", line 256, in __setitem__
    self.data.attrs[attr] = val
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "D:\Programs\Python\Python36\lib\site-packages\h5py\_hl\attrs.py", line 95, in __setitem__
    self.create(name, data=value, dtype=base.guess_dtype(value))
  File "D:\Programs\Python\Python36\lib\site-packages\h5py\_hl\attrs.py", line 188, in create
    attr = h5a.create(self._id, self._e(tempname), htype, space)
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py\h5a.pyx", line 47, in h5py.h5a.create
RuntimeError: Unable to create attribute (object header message is too large)

running pip3 show for the libraries:

Name: music21
Version: 5.3.0
---
Name: Keras
Version: 2.2.2
---
Name: tensorflow-gpu
Version: 1.10.0
---
Name: h5py
Version: 2.8.0

Can you tell me what versions you are using? This seems to be a known problem with Keras (see https://github.com/keras-team/keras/issues/6766), but I assume there is a combination of versions that works. Thanks!
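In case it helps others hitting this: one workaround that gets suggested for this class of error is to checkpoint only the weights rather than the full model, since the failure happens while the whole model is being serialized to HDF5. A rough sketch (the filepath pattern and monitor here are illustrative, not necessarily what lstm.py uses):

from keras.callbacks import ModelCheckpoint

# save_weights_only=True skips the full-model serialization path shown in the
# traceback above; whether that avoids the error can depend on the Keras version.
checkpoint = ModelCheckpoint(
    "weights-{epoch:02d}-{loss:.4f}.hdf5",
    monitor='loss',
    save_best_only=True,
    save_weights_only=True,
    mode='min'
)
callbacks_list = [checkpoint]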

Anurag27031994 commented 5 years ago

Use get_weights and set_weights to save and load the model weights, respectively.

Open this link to read the same code with proper formatting:

https://drive.google.com/open?id=1xzrqP7ExTmJiZqVt0A_G6AT69EbIjEI9tUDLD1twqj8

##############################################################################

Assuming this is your model architecture (you may use whatever architecture you want, big or small):

from keras.models import Sequential
from keras.layers import Conv2D, Activation, Flatten, Dense

def mymodel():
    inputShape = (28, 28, 3)
    model = Sequential()
    model.add(Conv2D(20, 5, padding="same", input_shape=inputShape))
    model.add(Activation('relu'))
    model.add(Flatten())
    model.add(Dense(500))
    model.add(Activation('relu'))
    model.add(Dense(2, activation="softmax"))
    return model

model = mymodel()
model.fit(....)  # parameters to start training your model

################################################################################

Once your model has been trained, you will want to save it on your PC.

Use get_weights() to get the model weights:

weigh = model.get_weights()

Now use pickle to save your model weights instead of .h5.

For heavy model architectures, saving the full model to .h5 can fail, as in the "object header message is too large" error above.

pklfile= "D:/modelweights.pkl" try: fpkl= open(pklfile, 'wb') #Python 3
pickle.dump(weigh, fpkl, protocol= pickle.HIGHEST_PROTOCOL) fpkl.close() except: fpkl= open(pklfile, 'w') #Python 2
pickle.dump(weigh, fpkl, protocol= pickle.HIGHEST_PROTOCOL) fpkl.close()

################################################################################

Later, you may want to load your model back.

Use pickle to load the model weights:

pklfile= "D:/modelweights.pkl" try: f= open(pklfile) #Python 2

weigh= pickle.load(f);                
f.close();

except:

f= open(pklfile, 'rb')     #Python 3                 
weigh= pickle.load(f);                
f.close();

restoredmodel = mymodel()  # rebuild the same architecture

Use set_weights() to load the saved weights back into the model architecture:

restoredmodel.set_weights(weigh)

################################################################################

Now you can do your testing, evaluation, and predictions:

y_pred = restoredmodel.predict(X)  # X: your test inputs
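To tie this back to the original lstm.py error: the traceback above fails inside ModelCheckpoint when it saves the whole model at the end of each epoch. A minimal sketch of a custom callback that pickles only the weights instead (the class name and filepath pattern are illustrative, not part of the repo):

import pickle
from keras.callbacks import Callback

class PickleWeightsCheckpoint(Callback):
    # Illustrative callback: pickle the model weights at the end of each
    # epoch instead of writing a full .h5 model file.
    def __init__(self, filepath):
        super(PickleWeightsCheckpoint, self).__init__()
        self.filepath = filepath

    def on_epoch_end(self, epoch, logs=None):
        path = self.filepath.format(epoch=epoch)
        with open(path, 'wb') as f:
            pickle.dump(self.model.get_weights(), f, protocol=pickle.HIGHEST_PROTOCOL)

# Hypothetical usage in lstm.py's train(), replacing the ModelCheckpoint:
# callbacks_list = [PickleWeightsCheckpoint("weights-epoch-{epoch:02d}.pkl")]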

Anurag27031994 commented 5 years ago

Open this drive link to read the same code as above, properly formatted:

https://drive.google.com/open?id=1xzrqP7ExTmJiZqVt0A_G6AT69EbIjEI9tUDLD1twqj8