keras-team / tf-keras

The TensorFlow-specific implementation of the Keras API, which was the default Keras from 2019 to 2023.
Apache License 2.0

I can't download the NASNetMobile model with the include_top=False option #142

Closed jkoniecznyy closed 9 months ago

jkoniecznyy commented 10 months ago

System information.

- Google Colab
- Python 3.10.12
- tensorflow 2.12.0
- keras 2.12.0

Describe the problem. I can't download the NASNetMobile model with the `include_top=False` option. `NASNetMobile(weights='imagenet', include_top=True)` works fine, but `NASNetMobile(weights='imagenet', include_top=False)` raises an error:

ValueError: Layer count mismatch when loading weights from file. Model expected 382 layers, found 388 saved layers.

Standalone code to reproduce the issue.

from tensorflow.keras.applications.nasnet import NASNetMobile

model = NASNetMobile(weights='imagenet', include_top=False)

Colab notebook: https://colab.research.google.com/drive/11apqiS93G4ZjRDrzQUU4MAc0UXkWj3Cx?usp=sharing

Source code / logs.

Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/nasnet/NASNet-mobile-no-top.h5
19993432/19993432 [==============================] - 0s 0us/step

ValueError                                Traceback (most recent call last)
in <cell line: 1>()
----> 1 model = NASNetMobile(weights='imagenet', include_top=False)  # error

3 frames
/usr/local/lib/python3.10/dist-packages/keras/applications/nasnet.py in NASNetMobile(input_shape, include_top, weights, input_tensor, pooling, classes, classifier_activation)
    425     backend that does not support separable convolutions.
    426     """
--> 427     return NASNet(
    428         input_shape,
    429         penultimate_filters=1056,

/usr/local/lib/python3.10/dist-packages/keras/applications/nasnet.py in NASNet(input_shape, penultimate_filters, num_blocks, stem_block_filters, skip_reduction, filter_multiplier, include_top, weights, input_tensor, pooling, classes, default_size, classifier_activation)
    323             file_hash="1ed92395b5b598bdda52abe5c0dbfd63",
    324         )
--> 325         model.load_weights(weights_path)
    326     elif default_size == 331:  # large version
    327         if include_top:

/usr/local/lib/python3.10/dist-packages/keras/utils/traceback_utils.py in error_handler(*args, **kwargs)
     68     # To get the full stack trace, call:
     69     # tf.debugging.disable_traceback_filtering()
---> 70     raise e.with_traceback(filtered_tb) from None
     71 finally:
     72     del filtered_tb

/usr/local/lib/python3.10/dist-packages/keras/saving/legacy/hdf5_format.py in load_weights_from_hdf5_group(f, model)
    806     layer_names = filtered_layer_names
    807     if len(layer_names) != len(filtered_layers):
--> 808         raise ValueError(
    809             "Layer count mismatch when loading weights from file. "
    810             f"Model expected {len(filtered_layers)} layers, found "

ValueError: Layer count mismatch when loading weights from file. Model expected 382 layers, found 388 saved layers.


AnimeshMaheshwari22 commented 10 months ago

Hi. I would like to contribute a fix for this.

Frightera commented 10 months ago

Hi @AnimeshMaheshwari22,

The problem arises because input_shape is not specified when include_top=False. I guess you can add a check for that.

See: https://github.com/keras-team/keras/blob/b3ffea6602dbbb481e82312baa24fe657de83e11/keras/applications/nasnet.py#L382-L388

yashsinghcodes commented 10 months ago

Hi @Frightera,

So should we pass a default (example) value if no input_shape is provided when include_top=False?

AnimeshMaheshwari22 commented 10 months ago

Hi @Frightera, is this related to the input size issue? Isn't this an issue with the number of model layers?

Frightera commented 10 months ago

> So should we pass a default (example) value if no input_shape is provided when include_top=False?

Yes, that's right. Specifying the input_shape when include_top=False should fix it.

> Hi @Frightera, is this related to the input size issue? Isn't this an issue with the number of model layers?

As far as I remember, the NASNet implementation was a little different, so that error is a little misleading. There should be a check that forces the user to specify the input shape in that case.

AnimeshMaheshwari22 commented 10 months ago

Shall I add the check for this?

tilakrayal commented 10 months ago

@jkoniecznyy, I tried to execute the code with the input_shape provided alongside include_top=False, and it ran without any error/issue. Kindly find the gist of it here.

model = NASNetMobile(weights='imagenet', include_top=False, input_shape=(224,224,3))

Related:
- https://github.com/keras-team/keras/issues/9812
- https://github.com/keras-team/keras/pull/9865
- https://github.com/keras-team/keras/pull/9891

Thank you!

jkoniecznyy commented 10 months ago

Oh, I didn't expect this to be about the input_shape; thank you very much for solving my problem. I think it would be helpful to standardize the behavior of all models when no specific input_shape is given: currently MobileNet shows a warning; Xception, DenseNet, and EfficientNet just load; and NASNetMobile raises an error about the number of layers (even on version 2.13.0). Below you can find my gist: https://colab.research.google.com/drive/1UZoMLrNp_y_2sjfkXUHlVzWxC42KCnQ5?usp=sharing

yashsinghcodes commented 10 months ago

Yeah, I will write a pull request to fall back to a default value if no input shape is provided.
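That fallback could be sketched roughly as below. This is an assumption about the shape of the fix, not the actual patch: `resolve_input_shape` is a hypothetical helper, and the 224 default stands in for whatever default size the model documents.

```python
def resolve_input_shape(input_shape, include_top, default_size=224):
    """Hypothetical fallback: when the caller omits input_shape with
    include_top=False, substitute the model's default spatial size so
    the no-top weights match the constructed graph."""
    if input_shape is None and not include_top:
        return (default_size, default_size, 3)
    return input_shape
```

Whether to default silently or raise (as discussed above) is a design choice; defaulting matches what some other applications models already do, while raising makes the requirement explicit.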