keras-team / keras

Deep Learning for humans
http://keras.io/

Unable to load model from .h5 file #6937

Closed tejareddy-b closed 7 years ago

tejareddy-b commented 7 years ago
# Unable to load model
Using TensorFlow backend.
Traceback (most recent call last):
  File "ocv.py", line 7, in <module>
    model = load_model('bottleneck_fc_model.h5')
  File "/usr/local/lib/python2.7/dist-packages/keras/models.py", line 230, in load_model
    raise ValueError('No model found in config file.')
ValueError: No model found in config file.

I did not find a clear answer in other issues.

Dref360 commented 7 years ago

Did you save with model.save or model.save_weights? Also, please provide a working example so that we can see whether it's a bug.

tejareddy-b commented 7 years ago
Thank you.
I was saving only the weights but trying to load the full model.
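
For anyone hitting the same mismatch, here is a minimal sketch of the two matched save/load pairs (the tiny model and file names below are placeholders, just to keep the example self-contained):

from keras.models import Sequential, load_model
from keras.layers import Dense

# placeholder model so the example runs on its own
model = Sequential([Dense(10, activation='softmax', input_shape=(784,))])
model.compile(optimizer='adam', loss='categorical_crossentropy')

# full model (architecture + weights) saved with model.save(); restore with load_model()
model.save('full_model.h5')
restored_model = load_model('full_model.h5')

# weights only, saved with model.save_weights(); these can only be restored with
# load_weights() on an already-built model of the same architecture. Calling
# load_model() on this file raises "ValueError: No model found in config file."
model.save_weights('weights_only.h5')
model.load_weights('weights_only.h5')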
sureddi-rajesh commented 7 years ago

I am using a pre-trained model (AlexNet), and in this case I also end up with the same error.

I downloaded the alexnet_weights from https://github.com/heuritech/convnets-keras and then tried this:

from keras.models import load_model
base_model = load_model('alexnet_weights.h5')

I ended up with

ValueError: No model found in config file.

Please help me get rid of it.

gururao001 commented 7 years ago

It seems like you are only using the weights of the model. In that case you can't use the load_model method; you have to define the architecture of your model first and then use model.load_weights('alexnet_weights.h5'). Take a look at this for an example: https://stackoverflow.com/questions/35074549/how-to-load-a-model-from-an-hdf5-file-in-keras
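
A minimal sketch of that approach (the layers below are placeholders, not the real AlexNet definition; the rebuilt architecture has to match the saved weights layer-for-layer or load_weights will fail):

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

# 1) rebuild the architecture in code (replace this with the actual AlexNet
#    definition from the repository the weights came from)
model = Sequential([
    Conv2D(96, (11, 11), strides=4, activation='relu', input_shape=(227, 227, 3)),
    MaxPooling2D((3, 3), strides=2),
    Flatten(),
    Dense(1000, activation='softmax'),
])

# 2) load only the weights into it; load_model() cannot be used because the
#    .h5 file contains no model config, only weight tensors
model.load_weights('alexnet_weights.h5')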

ajaysg-zz commented 6 years ago

pretr_model = load_model('/home/ajay_sg/Desktop/arnekt_pdf/alaj_RPN_2230/models/vgg16_weights.h5')

It is throwing an error: ValueError: No model found in config file.

qingfengmingyue commented 6 years ago

Using model.save() will work.

mdt01 commented 4 years ago

model = load_model('F:/Yagnesh_Project/deep_learning_flask_integration/mode_files/three_clas.h5')

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    model = load_model('F:/Yagnesh_Project/deep_learning_flask_integration/mode_files/three_clas.h5')
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\saving\save.py", line 184, in load_model
    return hdf5_format.load_model_from_hdf5(filepath, custom_objects, compile)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\saving\hdf5_format.py", line 178, in load_model_from_hdf5
    custom_objects=custom_objects)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\saving\model_config.py", line 55, in model_from_config
    return deserialize(config, custom_objects=custom_objects)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\layers\serialization.py", line 109, in deserialize
    printable_module_name='layer')
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\utils\generic_utils.py", line 373, in deserialize_keras_object
    list(custom_objects.items())))
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\engine\sequential.py", line 398, in from_config
    custom_objects=custom_objects)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\layers\serialization.py", line 109, in deserialize
    printable_module_name='layer')
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\utils\generic_utils.py", line 375, in deserialize_keras_object
    return cls.from_config(cls_config)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 655, in from_config
    return cls(**config)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\layers\convolutional.py", line 599, in __init__
    **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\layers\convolutional.py", line 125, in __init__
    **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\training\tracking\base.py", line 456, in _method_wrapper
    result = method(self, *args, **kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\engine\base_layer.py", line 294, in __init__
    generic_utils.validate_kwargs(kwargs, allowed_kwargs)
  File "C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\utils\generic_utils.py", line 792, in validate_kwargs
    raise TypeError(error_message, kwarg)
TypeError: ('Keyword argument not understood:', 'groups')

mdt01 commented 4 years ago

Can anyone guide me regarding the above issue while loading the .h5 model?

I used Google Colab to save this model. On Google Colab I am able to load it, but if I download the model and try to load it locally, it gives the above error.

Nakul24-1 commented 4 years ago

> Can anyone guide me regarding the above issue while loading the .h5 model?
>
> I used Google Colab to save this model. On Google Colab I am able to load it, but if I download the model and try to load it locally, it gives the above error.

I have the same issue. Did you find the solution?

prateek1621 commented 4 years ago

> Can anyone guide me regarding the above issue while loading the .h5 model?
>
> I used Google Colab to save this model. On Google Colab I am able to load it, but if I download the model and try to load it locally, it gives the above error.
>
> I have the same issue. Did you find the solution?

I have the same issue as well. If you get the answer, can you please let me know too?

Nakul24-1 commented 4 years ago

> Can anyone guide me regarding the above issue while loading the .h5 model?
>
> I used Google Colab to save this model. On Google Colab I am able to load it, but if I download the model and try to load it locally, it gives the above error.
>
> I have the same issue. Did you find the solution?
>
> I have the same issue as well. If you get the answer, can you please let me know too?

The only solution I found was saving the model directly to Google Drive and using it from there in Colab. Even then, I'm not 100 percent sure it works in all cases.

layne-sadler commented 3 years ago

For me the issue seems to be Python 3.8; this all worked in 3.7.

I can't even use load_model on .h5 files on disk anymore in 3.8.

ismael-elatifi commented 3 years ago

I have this issue (ValueError: No model found in config file.) with TF 2.4.1, tf.keras.callbacks.ModelCheckpoint and a custom network.
The reason for the issue is that the model was saved with model.save_weights despite save_weights_only=False having been passed. I inspected the TensorFlow code: in one specific case (which applies to me), ModelCheckpoint forces save_weights_only to True even when the user passed save_weights_only=False when constructing the ModelCheckpoint object.
Here is the code that causes the issue in the ModelCheckpoint class (tensorflow\python\keras\callbacks.py):

  def set_model(self, model):
    self.model = model
    # Use name matching rather than `isinstance` to avoid circular dependencies.
    if (not self.save_weights_only and
        not model._is_graph_network and  # pylint: disable=protected-access
        model.__class__.__name__ != 'Sequential'):
      self.save_weights_only = True

We should show a warning informing the user that save_weights_only has been forced to True (and why). It would save people the hours of debugging I went through.
This is related to https://github.com/tensorflow/tensorflow/issues/37620 and https://github.com/tensorflow/tensorflow/issues/42741
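
A minimal sketch of the practical consequence, assuming a subclassed model (the class, shapes, and checkpoint path below are made up): because the checkpoint then contains only weights, rebuild the model in code and call load_weights instead of load_model.

import tensorflow as tf

class SmallModel(tf.keras.Model):
    # placeholder subclassed model; substitute your own architecture
    def __init__(self):
        super().__init__()
        self.dense1 = tf.keras.layers.Dense(64, activation='relu')
        self.dense2 = tf.keras.layers.Dense(10)

    def call(self, inputs):
        return self.dense2(self.dense1(inputs))

# ModelCheckpoint silently switched to weights-only saving for this kind of model,
# so load_model('checkpoint.h5') raises "No model found in config file."
model = SmallModel()
model.build(input_shape=(None, 784))   # create the variables before loading
model.load_weights('checkpoint.h5')    # hypothetical checkpoint written by ModelCheckpoint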

ravinderkhatri commented 3 years ago

I am using the code below and am able to save my model:

checkpoint_cb = tf.keras.callbacks.ModelCheckpoint(filepath='cv_model.h5', 
                                                    monitor = 'accuracy',
                                                    save_weights_only = False,
                                                    save_best_only=True)

but when I call model = tf.keras.models.load_model('cv_model.h5') it gives me an error: KeyError: "Unable to open object (object 'model_weights' doesn't exist)". I tried using the full path as well and the result is the same. Using TensorFlow 2.7.

krishyadav007 commented 2 years ago

I am also facing the same issue

.local/lib/python3.8/site-packages/keras/saving/hdf5_format.py", line 177, in load_model_from_hdf5
    raise ValueError(f'No model config found in the file at {filepath}.')

Any hopes?

effaaaaaaa commented 1 year ago

Hi, I have a problem with my code:

history = classifier.fit(training_set, epochs = 50, validation_data = test_set)

classifier.save('model1.h5')

InvalidArgumentError                      Traceback (most recent call last)

----> 1 history = classifier.fit(training_set,
      2                          epochs = 50,
      3                          validation_data = test_set)
      4
      5 classifier.save('model.h5')  # creates a HDF5 file 'my_model.h5'

C:\ProgramData\Anaconda3\lib\site-packages\keras\utils\traceback_utils.py in error_handler(*args, **kwargs)
     68     # To get the full stack trace, call:
     69     # `tf.debugging.disable_traceback_filtering()`
---> 70     raise e.with_traceback(filtered_tb) from None
     71 finally:
     72     del filtered_tb

C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\eager\execute.py in quick_execute(op_name, num_outputs, inputs, attrs, ctx, name)
     50 try:
     51     ctx.ensure_initialized()
---> 52     tensors = pywrap_tfe.TFE_Py_Execute(ctx._handle, device_name, op_name,
     53                                         inputs, attrs, num_outputs)
     54 except core._NotOkStatusException as e:

InvalidArgumentError: Graph execution error:
Alirezanltv commented 1 year ago

model = load_model('model2019_fullHead.h5')

Here is the error: ValueError: No model config found in the file at <tensorflow.python.platform.gfile.GFile object at 0x7f724affbf10>. What shall I do?

baseplate77 commented 1 year ago

same with me

muhammad-faizan-122 commented 1 year ago

The following code is working on my end. To save a trained model as .h5:

from tensorflow.keras.models import save_model
save_model(trained_model, "path_to_save_h5_model")

To load the .h5 trained model:

from tensorflow.keras.models import load_model
trained_model = load_model("path_of_h5_model")
roheetv commented 8 months ago

I am facing a problem loading the h5 file even though both of them exist in the same folder.

C:\Users\RC PRASAD\AppData\Local\Programs\Python\Python310\lib\site-packages\scipy\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4)
  warnings.warn(f"A NumPy version >={np_minversion} and <{np_maxversion}"
Traceback (most recent call last):
  File "C:\Users\RC PRASAD\Desktop\python ws\project\app.py", line 9, in <module>
    model = tf.keras.models.load_model('project/ad_deploy.h5')
  File "C:\Users\RC PRASAD\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\saving\saving_api.py", line 212, in load_model
    return legacy_sm_saving_lib.load_model(
  File "C:\Users\RC PRASAD\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\utils\traceback_utils.py", line 70, in error_handler
    raise e.with_traceback(filtered_tb) from None
  File "C:\Users\RC PRASAD\AppData\Local\Programs\Python\Python310\lib\site-packages\keras\saving\legacy\save.py", line 230, in load_model
    raise IOError(
OSError: No file or directory found at project/ad_deploy.h5

This is the error. Kindly help me, please.

GhaouiYoussef commented 6 months ago

I didn't have a problem loading it on Kaggle, but I did encounter one on Colab. Solution: !pip install keras==3.1.1 (or above, I suppose).

bayuafriyadi5 commented 3 months ago

I have a problem when deploying my service: it deploys successfully, but when accessing the URL it says Service Unavailable.

These are the logs:

The request failed because either the HTTP response was malformed or connection to the instance had an error

"Traceback (most recent call last): File "/usr/local/lib/python3.11/site-packages/keras/src/ops/operation.py", line 234, in from_config return cls(**config)"

Traceback (most recent call last): File "/usr/local/lib/python3.11/site-packages/gunicorn/arbiter.py", line 609, in spawn_worker worker.init_process() File "/usr/local/lib/python3.11/site-packages/gunicorn/workers/gthread.py", line 95, in init_process super().init_process() File "/usr/local/lib/python3.11/site-packages/gunicorn/workers/base.py", line 134, in init_process self.load_wsgi() File "/usr/local/lib/python3.11/site-packages/gunicorn/workers/base.py", line 146, in load_wsgi self.wsgi = self.app.wsgi()

and this is the code:

# import the required modules
import os
from flask import Flask, request, jsonify

# import each prediction file and function
from prediction.kaca import klasifikasiKaca, kaca
from prediction.kain import klasifikasiKain, kain
from prediction.metal import klasifikasiMetal, metal
from prediction.plastik import klasifikasiPlastik, plastik

# initialize flask
app = Flask(__name__)

# index/homepage endpoint
@app.route('/', methods=["GET"])
def index():
    return 'Welcome to T2T API Homepage : --port 8080, (this 16th deploying attempt).'

# create an endpoint for each machine learning prediction file
@app.route('/kaca', methods=['POST'])
def kaca_endpoint():
    return kaca()

@app.route('/kain', methods=['POST'])
def kain_endpoint():
    return kain()

@app.route('/metal', methods=['POST'])
def metal_endpoint():
    return metal()

@app.route('/plastik', methods=['POST'])
def plastik_endpoint():
    return plastik()

if __name__ == '__main__':
    port = int(os.environ.get('PORT', 8080))
    app.run(port=port, host="0.0.0.0", debug=True)

FROM python:3.11-slim

ENV PYTHONUNBUFFERED True
ENV APP_HOME /app

WORKDIR $APP_HOME
COPY . ./

RUN pip install -r requirements.txt

CMD exec gunicorn --bind :8080 --workers 1 --threads 8 --timeout 0 main:app

EXPOSE 8080

from google.cloud import storage
import os
import io
import matplotlib.pyplot as plt
import matplotlib.image as mpimg
import tensorflow as tf
import tempfile
import psutil
from flask import Flask, request, jsonify
from werkzeug.utils import secure_filename
from PIL import Image
from urllib.parse import quote

# initialize the Google Cloud Storage client
service_account = 'credential/pragmatic-star-431611-e8-d1f679e6a94c.json'
client = storage.Client.from_service_account_json(service_account)

# initialize flask
app = Flask(__name__)

# load the model from the model file in the bucket
bucket_name = 'bucket-bayu-3'
model_plastik_path = 'models/model_plastik.h5'
bucket = client.get_bucket(bucket_name)
blob = bucket.blob(model_plastik_path)
model_plastik_path_local = '/tmp/model_plastik.h5'
blob.download_to_filename(model_plastik_path_local)
loaded_model_plastik = tf.keras.models.load_model(model_plastik_path_local)

# function to classify a plastic image and give recommendations
def klasifikasiPlastik(image_path):

# load and preprocess the image

image = tf.keras.preprocessing.image.load_img(image_path, target_size=(224, 224))
image_array = tf.keras.preprocessing.image.img_to_array(image)
image_array = tf.expand_dims(image_array, 0)
image_array = image_array / 255.0  # normalization

# classify the image using the model
predictions = loaded_model_plastik.predict(image_array)
predicted_class = tf.argmax(predictions[0])

# answer and description based on the classification
answer = ""
description = ""
folder_path = ""
if predicted_class == 0:
    answer = "Botol Plastik."
    description = '''Botol plastik merupakan salah satu jenis sampah yang sangat umum ditemui dalam kehidupan sehari-hari.
    \tBotol plastik umumnya terbuat dari polietilen tereftalat (PET), yang merupakan jenis plastik yang dapat didaur ulang.
    \tDaur ulang botol plastik memiliki manfaat besar bagi lingkungan karena mengurangi penggunaan bahan baku baru, 
    \tmengurangi limbah plastik yang mencemari lingkungan, dan menghemat energi yang diperlukan dalam produksi plastik baru. 
    \tOleh karena itu, penting bagi kita semua untuk mempraktikkan daur ulang botol plastik dengan membuangnya ke tempat yang benar 
    \tdan mendukung program daur ulang yang ada di lingkungan kita.\n'''
    folder_path = 'Recomendation/Plastik/Botol'  # folder path where the bottle images are stored
elif predicted_class == 1:
    answer = "Kemasan Plastik."
    description = '''Sampah kemasan plastik merupakan jenis sampah yang terkait dengan penggunaan berbagai macam kemasan plastik dalam kehidupan sehari-hari.
    \tKemasan plastik umumnya digunakan untuk mengemas makanan, minuman, produk rumah tangga, produk kecantikan, dan berbagai produk lainnya.
    \tSampah kemasan plastik dapat berdampak negatif terhadap lingkungan dan ekosistem. Jika tidak dikelola dengan baik, kemasan plastik bisa mencemari lautan,
    \tsungai, dan lahan. Plastik yang terbuang sembarangan dapat merusak habitat alami dan menyebabkan keracunan pada hewan laut yang memakan atau terperangkap olehnya. 
    \tSelain itu, pembakaran atau pembuangan plastik yang tidak tepat juga dapat menghasilkan polusi udara dan tanah yang berbahaya.\n
    \tUpaya kolektif dari individu, pemerintah, dan industri dalam mengurangi penggunaan kemasan plastik sekali pakai, meningkatkan daur ulang, 
    \tserta meningkatkan inovasi dalam pengemasan yang ramah lingkungan dapat membantu mengurangi masalah sampah kemasan plastik dan menjaga keberlanjutan lingkungan kita.\n'''
    folder_path = 'Recomendation/Plastik/Kemasan'  # folder path where the packaging images are stored
elif predicted_class == 2:
    answer = "Sedotan."
    description = '''Sampah sedotan plastik adalah jenis sampah yang terdiri dari sedotan atau pipet kecil yang terbuat dari plastik. 
    \tSedotan plastik umumnya digunakan untuk minuman seperti jus, soda, atau minuman ringan lainnya.\n
    \tMeskipun sedotan plastik terlihat kecil dan tidak signifikan, mereka dapat menjadi masalah lingkungan yang serius. 
    \tIni karena sedotan plastik sulit terurai dan cenderung berakhir sebagai sampah plastik yang mencemari lautan dan ekosistem.\n 
    \tDengan kesadaran dan tindakan kolektif, penggunaan dan pembuangan sedotan plastik sekali pakai dapat dikurangi secara signifikan. 
    \tMenggantinya dengan alternatif yang lebih berkelanjutan dan mengelola sampah sedotan plastik dengan benar adalah langkah penting 
    \tdalam menjaga keberlanjutan lingkungan kita.\n'''
    folder_path = 'Recomendation/Plastik/Sedotan'  # folder path where the straw images are stored
elif predicted_class == 3:
    answer = "Sendok."
    description = '''Sampah sendok plastik adalah jenis sampah yang terdiri dari sendok kecil yang terbuat dari plastik. 
    \tSendok plastik sering digunakan dalam berbagai acara atau makanan makanan siap saji yang dikonsumsi di luar rumah.\n 
    \tSampah sendok plastik memiliki dampak negatif pada ekosistem laut dan lingkungan secara keseluruhan. Satwa laut dapat tersangkut 
    \tatau memakan sendok plastik, yang dapat menyebabkan luka atau kematian. Selain itu, pembuangan sendok plastik yang tidak tepat juga 
    \tdapat menyebabkan pencemaran tanah dan air.\n
    \tDengan mengadopsi alternatif yang lebih ramah lingkungan dan mengelola sendok plastik dengan benar, kita dapat mengurangi dampak negatifnya 
    \tterhadap lingkungan dan menjaga keberlanjutan sumber daya alam kita.\n'''
    folder_path = 'Recomendation/Plastik/Sendok'  # folder path where the spoon images are stored

# return the response
return answer, description, folder_path

# plastik endpoint
@app.route('/plastik', methods=['POST'])
def plastik():

# check whether an image file was uploaded

if 'image' not in request.files:
    return jsonify({'error': 'No image uploaded'}), 400

# get the image from the request body (form data, e.g. sent from Postman)
image = request.files['image']
# Save the uploaded image to a temporary file
temp_file = tempfile.NamedTemporaryFile(delete=False)
image.save(temp_file.name)
# classification
answer, description, folder_path = klasifikasiPlastik(temp_file.name)
os.remove(temp_file.name)
# get the bucket object
bucket = client.get_bucket(bucket_name)
# get the list of files in the folder
blobs = bucket.list_blobs(prefix=folder_path)

# return the images that match the classification
file_urls = []
for blob in blobs:
    # use the urllib.parse module to URL-encode the blob name so it can be accessed
    file_url = f"https://storage.googleapis.com/{bucket.name}/{quote(blob.name)}"
    file_urls.append(file_url)
if len(file_urls) > 0:
    return jsonify({'answer': answer, 'description': description, 'file_urls': file_urls}), 200
else:
    return jsonify({'answer': answer, 'description': description, 'file_urls': []}), 200

return jsonify({'error': 'No folder found for the classification'}), 500