MarcBS / keras

A fork of Keras with several new functionalities: a Caffe2Keras converter, multimodal layers, etc.
https://github.com/MarcBS/keras

No model found in config file when trying to load converted model #25


dchouren commented 7 years ago

I used caffe2keras to convert the VGG16-hybrid1365 caffe model to an h5 file. Conversion went fine and I used the caffemodel and prototxt found here: https://github.com/metalbubble/places365.

However, when I try to load it with load_model, I get ValueError: No model found in config file.

>>> import keras.models
Using Theano backend.
>>> x = keras.models.load_model('keras/keras/caffe/models/Keras_model_weights.h5')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/tigress/dchouren/git_sources/keras/keras/models.py", line 140, in load_model
    raise ValueError('No model found in config file.')
ValueError: No model found in config file.

Theano and Keras are up to date.

Any idea why the model is not converted properly?

The only other related issue I could find was this, which didn't shed any light. https://github.com/fchollet/deep-learning-models/issues/1


Converted with

python caffe2keras.py -load_path /tigress/dchouren/thesis/src/keras/keras/caffe/models/ -prototxt deploy_vgg16_hybrid1365.prototxt -caffemodel vgg16_hybrid1365.caffemodel
.
.
.
LOADING WEIGHTS
Finished converting model.
Storing model...
Finished storing the converted model to /tigress/dchouren/thesis/src/keras/keras/caffe/models/
dchouren commented 7 years ago

Not sure if this is relevant, but I did some basic inspection with h5py and got this:


>>> import h5py
>>> x = h5py.File('models/Keras_model_weights.h5', 'r')
>>> [y for y in x]
['conv1_1', 'conv1_1_zeropadding', 'conv1_2', 'conv1_2_zeropadding', 'conv2_1', 'conv2_1_zeropadding', 'conv2_2', 'conv2_2_zeropadding', 'conv3_1', 'conv3_1_zeropadding', 'conv3_2', 'conv3_2_zeropadding', 'conv3_3', 'conv3_3_zeropadding', 'conv4_1', 'conv4_1_zeropadding', 'conv4_2', 'conv4_2_zeropadding', 'conv4_3', 'conv4_3_zeropadding', 'conv5_1', 'conv5_1_zeropadding', 'conv5_2', 'conv5_2_zeropadding', 'conv5_3', 'conv5_3_zeropadding', 'data', 'drop6', 'drop7', 'fc6', 'fc6_flatten', 'fc7', 'fc8a', 'pool1', 'pool2', 'pool3', 'pool4', 'pool5', 'prob', 'relu1_1', 'relu1_2', 'relu2_1', 'relu2_2', 'relu3_1', 'relu3_2', 'relu3_3', 'relu4_1', 'relu4_2', 'relu4_3', 'relu5_1', 'relu5_2', 'relu5_3', 'relu6', 'relu7']
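For reference, load_model needs the architecture to be stored alongside the weights; in the HDF5 format it looks for a top-level 'model_config' file attribute, and a weights-only file has none, which is exactly what triggers this error. A minimal sketch that reproduces the check (the file name here is made up for the demo, not the converter's actual output):

```python
import h5py

# Build a tiny weights-style HDF5 file: layer groups only,
# no 'model_config' attribute describing the architecture.
with h5py.File('demo_weights.h5', 'w') as f:
    f.create_group('conv1_1')
    f.create_group('fc8a')

# A file saved with model.save() would have f.attrs['model_config'];
# a weights-only file does not, so load_model rejects it.
with h5py.File('demo_weights.h5', 'r') as f:
    print('model_config' in f.attrs)  # False -> load_model will fail
```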
MarcBS commented 7 years ago

Hi @dchouren , are you trying to load the model with the original Keras version? Or with this fork?

dchouren commented 7 years ago

I've tried both, with the same result.

dchouren commented 7 years ago

@MarcBS So it seems that what's being created is just a weights file. Is that correct? I can create a model, say VGG16, then call model.load_weights('...'), and that works. I would suggest changing the output message from 'Finished storing the converted model...' to indicate that the file isn't a full .h5 model but only the layer weights, so there's no confusion.
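The pattern that works, in other words, is: rebuild the same architecture in Keras yourself, then load only the weights into it. A toy sketch of that pattern (the layer names, sizes, and file name here are invented for illustration, not the real VGG16-hybrid1365 definition):

```python
from keras.models import Sequential
from keras.layers import Dense

# Hypothetical stand-in for the converted architecture: in practice
# this would be rebuilt to match the deploy prototxt exactly.
def build_model():
    m = Sequential()
    m.add(Dense(4, input_shape=(8,), name='fc6'))
    m.add(Dense(2, name='fc8a'))
    return m

src = build_model()
src.save_weights('toy.weights.h5')   # weights-only file, like the converter's output

dst = build_model()                  # same architecture, freshly initialized
dst.load_weights('toy.weights.h5')   # works; load_model on this file would not

# The restored weights match the originals layer by layer.
assert all((a == b).all() for a, b in zip(src.get_weights(), dst.get_weights()))
```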