taipingeric / yolo-v4-tf.keras

A simple tf.keras implementation of YOLO v4
MIT License

Can't load custom model properly #9

Closed Miremax closed 3 years ago

Miremax commented 3 years ago

Good afternoon :) I trained the model like you said in train.ipynb. Then I predict, and it predicts well:

img shape: (324, 432, 3) # of bboxes: 2

Then I save the model with `model.save_model('auto_model3.h5')`. But when I load it back with `model = Yolov4(weight_path='auto_model3.h5', class_name_path=class_name_path)`, it does not predict at all! It doesn't show any boxes, and I can't understand why or what I am doing wrong:

img shape: (324, 432, 3) # of bboxes: 0

The model size looks ok: auto_model3.h5 is 251 181 KB.

Any advice? Or can you show the code where you train the model, then save it, then load it back and it works? Thank you in advance ))
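Roughly the full sequence I run is below (a minimal sketch; the module name `models`, the class-name file, and `img.jpg` are placeholders for my local setup, and the training itself is done as in train.ipynb):

```python
# Sketch of the failing round trip; paths and file names are placeholders.
from models import Yolov4

model = Yolov4(weight_path=None, class_name_path='class_names/custom_classes.txt')
# ... training as in train.ipynb ...

model.predict('img.jpg')            # works: img shape (324, 432, 3), 2 bboxes

model.save_model('auto_model3.h5')  # save the trained model

# reload by passing the saved .h5 file as weight_path to the constructor
model = Yolov4(weight_path='auto_model3.h5',
               class_name_path='class_names/custom_classes.txt')
model.predict('img.jpg')            # no boxes: # of bboxes: 0
```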

gajdosech2 commented 3 years ago

Hello there, a similar issue was closed recently: https://github.com/taipingeric/yolo-v4-tf.keras/issues/8. To sum it up, you should use the `load_model` method: create an instance of Yolov4 and call that method on the instance, as shown in this script in my fork of this repo: https://github.com/gajdosech2/yolo-v4-tf.keras/blob/master/inference.py. From the code at https://github.com/taipingeric/yolo-v4-tf.keras/blob/23f89c7c3734db81aa3bc62fefae73ad0d9353d6/utils.py#L12 it seems that the `weight_path` parameter of the constructor only serves to load the official YOLO pretrained weights from the darknet backend (github.com/AlexeyAB/darknet), i.e. not your custom weights trained here.
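Something like this (a minimal sketch; the file paths are placeholders, see inference.py in my fork for the full version):

```python
# Sketch of the intended loading flow; paths here are placeholders.
from models import Yolov4

# build the model architecture without loading any darknet weights
model = Yolov4(weight_path=None, class_name_path='class_names/custom_classes.txt')

# load the custom-trained Keras model saved earlier with save_model()
model.load_model('auto_model3.h5')

model.predict('img.jpg')  # bboxes should show up again
```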

Miremax commented 3 years ago

Oh, it really worked) It seems we have to create a Yolov4 instance first, and only then call its load_model method, not the plain tf.keras.models.load_model. Thank you)