OlafenwaMoses / ImageAI

A python library built to empower developers to build applications and systems with self-contained Computer Vision capabilities
https://www.genxr.co/#products
MIT License

OSError: cannot identify image file #429

Open · Overdoze47 opened this issue 4 years ago

Overdoze47 commented 4 years ago

Hi, I have a problem with my custom model training. I keep getting the following error message: it is looking for an image that is no longer in the directory. How can I bypass these error messages, and how/where does it store the information that these images must be present?

Code:

```python
from imageai.Prediction.Custom import ModelTraining
from PIL import Image

model_trainer = ModelTraining()
model_trainer.setModelTypeAsResNet()
model_trainer.setDataDirectory(r"C:\Images\Lebensmittel")
model_trainer.trainModel(num_objects=17, num_experiments=100, enhance_data=True,
                         batch_size=32, show_network_summary=True, save_full_model=True)
```

Output:

```
WARNING:tensorflow:From C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\backend.py:1557: calling reduce_mean (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead
WARNING:tensorflow:From C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\backend.py:3086: calling reduce_sum (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead


Layer (type) Output Shape Param # Connected to

input_1 (InputLayer) (None, 224, 224, 3) 0


conv2d_1 (Conv2D) (None, 112, 112, 64) 9472 input_1[0][0]


batch_normalization_1 (BatchNor (None, 112, 112, 64) 256 conv2d_1[0][0]


activation_1 (Activation) (None, 112, 112, 64) 0 batch_normalization_1[0][0]


max_pooling2d_1 (MaxPooling2D) (None, 55, 55, 64) 0 activation_1[0][0]


conv2d_3 (Conv2D) (None, 55, 55, 64) 4160 max_pooling2d_1[0][0]


batch_normalization_3 (BatchNor (None, 55, 55, 64) 256 conv2d_3[0][0]


activation_2 (Activation) (None, 55, 55, 64) 0 batch_normalization_3[0][0]


conv2d_4 (Conv2D) (None, 55, 55, 64) 36928 activation_2[0][0]


batch_normalization_4 (BatchNor (None, 55, 55, 64) 256 conv2d_4[0][0]


activation_3 (Activation) (None, 55, 55, 64) 0 batch_normalization_4[0][0]


conv2d_5 (Conv2D) (None, 55, 55, 256) 16640 activation_3[0][0]


conv2d_2 (Conv2D) (None, 55, 55, 256) 16640 max_pooling2d_1[0][0]


batch_normalization_5 (BatchNor (None, 55, 55, 256) 1024 conv2d_5[0][0]


batch_normalization_2 (BatchNor (None, 55, 55, 256) 1024 conv2d_2[0][0]


add_1 (Add) (None, 55, 55, 256) 0 batch_normalization_5[0][0]
batch_normalization_2[0][0]


activation_4 (Activation) (None, 55, 55, 256) 0 add_1[0][0]


conv2d_6 (Conv2D) (None, 55, 55, 64) 16448 activation_4[0][0]


batch_normalization_6 (BatchNor (None, 55, 55, 64) 256 conv2d_6[0][0]


activation_5 (Activation) (None, 55, 55, 64) 0 batch_normalization_6[0][0]


conv2d_7 (Conv2D) (None, 55, 55, 64) 36928 activation_5[0][0]


batch_normalization_7 (BatchNor (None, 55, 55, 64) 256 conv2d_7[0][0]


activation_6 (Activation) (None, 55, 55, 64) 0 batch_normalization_7[0][0]


conv2d_8 (Conv2D) (None, 55, 55, 256) 16640 activation_6[0][0]


batch_normalization_8 (BatchNor (None, 55, 55, 256) 1024 conv2d_8[0][0]


add_2 (Add) (None, 55, 55, 256) 0 batch_normalization_8[0][0]
activation_4[0][0]


activation_7 (Activation) (None, 55, 55, 256) 0 add_2[0][0]


conv2d_9 (Conv2D) (None, 55, 55, 64) 16448 activation_7[0][0]


batch_normalization_9 (BatchNor (None, 55, 55, 64) 256 conv2d_9[0][0]


activation_8 (Activation) (None, 55, 55, 64) 0 batch_normalization_9[0][0]


conv2d_10 (Conv2D) (None, 55, 55, 64) 36928 activation_8[0][0]


batch_normalization_10 (BatchNo (None, 55, 55, 64) 256 conv2d_10[0][0]


activation_9 (Activation) (None, 55, 55, 64) 0 batch_normalization_10[0][0]


conv2d_11 (Conv2D) (None, 55, 55, 256) 16640 activation_9[0][0]


batch_normalization_11 (BatchNo (None, 55, 55, 256) 1024 conv2d_11[0][0]


add_3 (Add) (None, 55, 55, 256) 0 batch_normalization_11[0][0]
activation_7[0][0]


activation_10 (Activation) (None, 55, 55, 256) 0 add_3[0][0]


conv2d_13 (Conv2D) (None, 28, 28, 128) 32896 activation_10[0][0]


batch_normalization_13 (BatchNo (None, 28, 28, 128) 512 conv2d_13[0][0]


activation_11 (Activation) (None, 28, 28, 128) 0 batch_normalization_13[0][0]


conv2d_14 (Conv2D) (None, 28, 28, 128) 147584 activation_11[0][0]


batch_normalization_14 (BatchNo (None, 28, 28, 128) 512 conv2d_14[0][0]


activation_12 (Activation) (None, 28, 28, 128) 0 batch_normalization_14[0][0]


conv2d_15 (Conv2D) (None, 28, 28, 512) 66048 activation_12[0][0]


conv2d_12 (Conv2D) (None, 28, 28, 512) 131584 activation_10[0][0]


batch_normalization_15 (BatchNo (None, 28, 28, 512) 2048 conv2d_15[0][0]


batch_normalization_12 (BatchNo (None, 28, 28, 512) 2048 conv2d_12[0][0]


add_4 (Add) (None, 28, 28, 512) 0 batch_normalization_15[0][0]
batch_normalization_12[0][0]


activation_13 (Activation) (None, 28, 28, 512) 0 add_4[0][0]


conv2d_16 (Conv2D) (None, 28, 28, 128) 65664 activation_13[0][0]


batch_normalization_16 (BatchNo (None, 28, 28, 128) 512 conv2d_16[0][0]


activation_14 (Activation) (None, 28, 28, 128) 0 batch_normalization_16[0][0]


conv2d_17 (Conv2D) (None, 28, 28, 128) 147584 activation_14[0][0]


batch_normalization_17 (BatchNo (None, 28, 28, 128) 512 conv2d_17[0][0]


activation_15 (Activation) (None, 28, 28, 128) 0 batch_normalization_17[0][0]


conv2d_18 (Conv2D) (None, 28, 28, 512) 66048 activation_15[0][0]


batch_normalization_18 (BatchNo (None, 28, 28, 512) 2048 conv2d_18[0][0]


add_5 (Add) (None, 28, 28, 512) 0 batch_normalization_18[0][0]
activation_13[0][0]


activation_16 (Activation) (None, 28, 28, 512) 0 add_5[0][0]


conv2d_19 (Conv2D) (None, 28, 28, 128) 65664 activation_16[0][0]


batch_normalization_19 (BatchNo (None, 28, 28, 128) 512 conv2d_19[0][0]


activation_17 (Activation) (None, 28, 28, 128) 0 batch_normalization_19[0][0]


conv2d_20 (Conv2D) (None, 28, 28, 128) 147584 activation_17[0][0]


batch_normalization_20 (BatchNo (None, 28, 28, 128) 512 conv2d_20[0][0]


activation_18 (Activation) (None, 28, 28, 128) 0 batch_normalization_20[0][0]


conv2d_21 (Conv2D) (None, 28, 28, 512) 66048 activation_18[0][0]


batch_normalization_21 (BatchNo (None, 28, 28, 512) 2048 conv2d_21[0][0]


add_6 (Add) (None, 28, 28, 512) 0 batch_normalization_21[0][0]
activation_16[0][0]


activation_19 (Activation) (None, 28, 28, 512) 0 add_6[0][0]


conv2d_22 (Conv2D) (None, 28, 28, 128) 65664 activation_19[0][0]


batch_normalization_22 (BatchNo (None, 28, 28, 128) 512 conv2d_22[0][0]


activation_20 (Activation) (None, 28, 28, 128) 0 batch_normalization_22[0][0]


conv2d_23 (Conv2D) (None, 28, 28, 128) 147584 activation_20[0][0]


batch_normalization_23 (BatchNo (None, 28, 28, 128) 512 conv2d_23[0][0]


activation_21 (Activation) (None, 28, 28, 128) 0 batch_normalization_23[0][0]


conv2d_24 (Conv2D) (None, 28, 28, 512) 66048 activation_21[0][0]


batch_normalization_24 (BatchNo (None, 28, 28, 512) 2048 conv2d_24[0][0]


add_7 (Add) (None, 28, 28, 512) 0 batch_normalization_24[0][0]
activation_19[0][0]


activation_22 (Activation) (None, 28, 28, 512) 0 add_7[0][0]


conv2d_26 (Conv2D) (None, 14, 14, 256) 131328 activation_22[0][0]


batch_normalization_26 (BatchNo (None, 14, 14, 256) 1024 conv2d_26[0][0]


activation_23 (Activation) (None, 14, 14, 256) 0 batch_normalization_26[0][0]


conv2d_27 (Conv2D) (None, 14, 14, 256) 590080 activation_23[0][0]


batch_normalization_27 (BatchNo (None, 14, 14, 256) 1024 conv2d_27[0][0]


activation_24 (Activation) (None, 14, 14, 256) 0 batch_normalization_27[0][0]


conv2d_28 (Conv2D) (None, 14, 14, 1024) 263168 activation_24[0][0]


conv2d_25 (Conv2D) (None, 14, 14, 1024) 525312 activation_22[0][0]


batch_normalization_28 (BatchNo (None, 14, 14, 1024) 4096 conv2d_28[0][0]


batch_normalization_25 (BatchNo (None, 14, 14, 1024) 4096 conv2d_25[0][0]


add_8 (Add) (None, 14, 14, 1024) 0 batch_normalization_28[0][0]
batch_normalization_25[0][0]


activation_25 (Activation) (None, 14, 14, 1024) 0 add_8[0][0]


conv2d_29 (Conv2D) (None, 14, 14, 256) 262400 activation_25[0][0]


batch_normalization_29 (BatchNo (None, 14, 14, 256) 1024 conv2d_29[0][0]


activation_26 (Activation) (None, 14, 14, 256) 0 batch_normalization_29[0][0]


conv2d_30 (Conv2D) (None, 14, 14, 256) 590080 activation_26[0][0]


batch_normalization_30 (BatchNo (None, 14, 14, 256) 1024 conv2d_30[0][0]


activation_27 (Activation) (None, 14, 14, 256) 0 batch_normalization_30[0][0]


conv2d_31 (Conv2D) (None, 14, 14, 1024) 263168 activation_27[0][0]


batch_normalization_31 (BatchNo (None, 14, 14, 1024) 4096 conv2d_31[0][0]


add_9 (Add) (None, 14, 14, 1024) 0 batch_normalization_31[0][0]
activation_25[0][0]


activation_28 (Activation) (None, 14, 14, 1024) 0 add_9[0][0]


conv2d_32 (Conv2D) (None, 14, 14, 256) 262400 activation_28[0][0]


batch_normalization_32 (BatchNo (None, 14, 14, 256) 1024 conv2d_32[0][0]


activation_29 (Activation) (None, 14, 14, 256) 0 batch_normalization_32[0][0]


conv2d_33 (Conv2D) (None, 14, 14, 256) 590080 activation_29[0][0]


batch_normalization_33 (BatchNo (None, 14, 14, 256) 1024 conv2d_33[0][0]


activation_30 (Activation) (None, 14, 14, 256) 0 batch_normalization_33[0][0]


conv2d_34 (Conv2D) (None, 14, 14, 1024) 263168 activation_30[0][0]


batch_normalization_34 (BatchNo (None, 14, 14, 1024) 4096 conv2d_34[0][0]


add_10 (Add) (None, 14, 14, 1024) 0 batch_normalization_34[0][0]
activation_28[0][0]


activation_31 (Activation) (None, 14, 14, 1024) 0 add_10[0][0]


conv2d_35 (Conv2D) (None, 14, 14, 256) 262400 activation_31[0][0]


batch_normalization_35 (BatchNo (None, 14, 14, 256) 1024 conv2d_35[0][0]


activation_32 (Activation) (None, 14, 14, 256) 0 batch_normalization_35[0][0]


conv2d_36 (Conv2D) (None, 14, 14, 256) 590080 activation_32[0][0]


batch_normalization_36 (BatchNo (None, 14, 14, 256) 1024 conv2d_36[0][0]


activation_33 (Activation) (None, 14, 14, 256) 0 batch_normalization_36[0][0]


conv2d_37 (Conv2D) (None, 14, 14, 1024) 263168 activation_33[0][0]


batch_normalization_37 (BatchNo (None, 14, 14, 1024) 4096 conv2d_37[0][0]


add_11 (Add) (None, 14, 14, 1024) 0 batch_normalization_37[0][0]
activation_31[0][0]


activation_34 (Activation) (None, 14, 14, 1024) 0 add_11[0][0]


conv2d_38 (Conv2D) (None, 14, 14, 256) 262400 activation_34[0][0]


batch_normalization_38 (BatchNo (None, 14, 14, 256) 1024 conv2d_38[0][0]


activation_35 (Activation) (None, 14, 14, 256) 0 batch_normalization_38[0][0]


conv2d_39 (Conv2D) (None, 14, 14, 256) 590080 activation_35[0][0]


batch_normalization_39 (BatchNo (None, 14, 14, 256) 1024 conv2d_39[0][0]


activation_36 (Activation) (None, 14, 14, 256) 0 batch_normalization_39[0][0]


conv2d_40 (Conv2D) (None, 14, 14, 1024) 263168 activation_36[0][0]


batch_normalization_40 (BatchNo (None, 14, 14, 1024) 4096 conv2d_40[0][0]


add_12 (Add) (None, 14, 14, 1024) 0 batch_normalization_40[0][0]
activation_34[0][0]


activation_37 (Activation) (None, 14, 14, 1024) 0 add_12[0][0]


conv2d_41 (Conv2D) (None, 14, 14, 256) 262400 activation_37[0][0]


batch_normalization_41 (BatchNo (None, 14, 14, 256) 1024 conv2d_41[0][0]


activation_38 (Activation) (None, 14, 14, 256) 0 batch_normalization_41[0][0]


conv2d_42 (Conv2D) (None, 14, 14, 256) 590080 activation_38[0][0]


batch_normalization_42 (BatchNo (None, 14, 14, 256) 1024 conv2d_42[0][0]


activation_39 (Activation) (None, 14, 14, 256) 0 batch_normalization_42[0][0]


conv2d_43 (Conv2D) (None, 14, 14, 1024) 263168 activation_39[0][0]


batch_normalization_43 (BatchNo (None, 14, 14, 1024) 4096 conv2d_43[0][0]


add_13 (Add) (None, 14, 14, 1024) 0 batch_normalization_43[0][0]
activation_37[0][0]


activation_40 (Activation) (None, 14, 14, 1024) 0 add_13[0][0]


conv2d_45 (Conv2D) (None, 7, 7, 512) 524800 activation_40[0][0]


batch_normalization_45 (BatchNo (None, 7, 7, 512) 2048 conv2d_45[0][0]


activation_41 (Activation) (None, 7, 7, 512) 0 batch_normalization_45[0][0]


conv2d_46 (Conv2D) (None, 7, 7, 512) 2359808 activation_41[0][0]


batch_normalization_46 (BatchNo (None, 7, 7, 512) 2048 conv2d_46[0][0]


activation_42 (Activation) (None, 7, 7, 512) 0 batch_normalization_46[0][0]


conv2d_47 (Conv2D) (None, 7, 7, 2048) 1050624 activation_42[0][0]


conv2d_44 (Conv2D) (None, 7, 7, 2048) 2099200 activation_40[0][0]


batch_normalization_47 (BatchNo (None, 7, 7, 2048) 8192 conv2d_47[0][0]


batch_normalization_44 (BatchNo (None, 7, 7, 2048) 8192 conv2d_44[0][0]


add_14 (Add) (None, 7, 7, 2048) 0 batch_normalization_47[0][0]
batch_normalization_44[0][0]


activation_43 (Activation) (None, 7, 7, 2048) 0 add_14[0][0]


conv2d_48 (Conv2D) (None, 7, 7, 512) 1049088 activation_43[0][0]


batch_normalization_48 (BatchNo (None, 7, 7, 512) 2048 conv2d_48[0][0]


activation_44 (Activation) (None, 7, 7, 512) 0 batch_normalization_48[0][0]


conv2d_49 (Conv2D) (None, 7, 7, 512) 2359808 activation_44[0][0]


batch_normalization_49 (BatchNo (None, 7, 7, 512) 2048 conv2d_49[0][0]


activation_45 (Activation) (None, 7, 7, 512) 0 batch_normalization_49[0][0]


conv2d_50 (Conv2D) (None, 7, 7, 2048) 1050624 activation_45[0][0]


batch_normalization_50 (BatchNo (None, 7, 7, 2048) 8192 conv2d_50[0][0]


add_15 (Add) (None, 7, 7, 2048) 0 batch_normalization_50[0][0]
activation_43[0][0]


activation_46 (Activation) (None, 7, 7, 2048) 0 add_15[0][0]


conv2d_51 (Conv2D) (None, 7, 7, 512) 1049088 activation_46[0][0]


batch_normalization_51 (BatchNo (None, 7, 7, 512) 2048 conv2d_51[0][0]


activation_47 (Activation) (None, 7, 7, 512) 0 batch_normalization_51[0][0]


conv2d_52 (Conv2D) (None, 7, 7, 512) 2359808 activation_47[0][0]


batch_normalization_52 (BatchNo (None, 7, 7, 512) 2048 conv2d_52[0][0]


activation_48 (Activation) (None, 7, 7, 512) 0 batch_normalization_52[0][0]


conv2d_53 (Conv2D) (None, 7, 7, 2048) 1050624 activation_48[0][0]


batch_normalization_53 (BatchNo (None, 7, 7, 2048) 8192 conv2d_53[0][0]


add_16 (Add) (None, 7, 7, 2048) 0 batch_normalization_53[0][0]
activation_46[0][0]


activation_49 (Activation) (None, 7, 7, 2048) 0 add_16[0][0]


global_avg_pooling (GlobalAvera (None, 2048) 0 activation_49[0][0]


dense_1 (Dense) (None, 17) 34833 global_avg_pooling[0][0]


activation_50 (Activation) (None, 17) 0 dense_1[0][0]

Total params: 23,622,545
Trainable params: 23,569,425
Non-trainable params: 53,120


Using Enhanced Data Generation
Found 17709 images belonging to 17 classes.
Found 14 images belonging to 17 classes.
JSON Mapping for the model classes saved to C:\Images\Lebensmittel\json\model_class.json
Number of experiments (Epochs) : 100
Epoch 1/100

Warning (from warnings module):
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\PIL\Image.py", line 872
    'to RGBA images')
UserWarning: Palette images with Transparency expressed in bytes should be converted to RGBA images

1/553 [..............................] 1/553 [..............................] - ETA: 3:57:11 - loss: 3.7262 - acc: 0.0312 2/553 [..............................] 2/553 [..............................] - ETA: 3:40:41 - loss: 4.1303 - acc: 0.0469 3/553 [..............................] 3/553 [..............................] - ETA: 3:35:22 - loss: 4.4315 - acc: 0.0312 4/553 [..............................] 4/553 [..............................] - ETA: 3:32:23 - loss: 4.1377 - acc: 0.0625 5/553 [..............................] 5/553 [..............................] - ETA: 3:30:21 - loss: 4.0024 - acc: 0.0625 6/553 [..............................] 6/553 [..............................] - ETA: 3:28:41 - loss: 4.0945 - acc: 0.0677 7/553 [..............................] 7/553 [..............................] - ETA: 3:27:39 - loss: 4.1872 - acc: 0.0670 8/553 [..............................] 8/553 [..............................] - ETA: 3:26:45 - loss: 4.1478 - acc: 0.0703 9/553 [..............................] 9/553 [..............................] - ETA: 3:25:58 - loss: 4.1029 - acc: 0.0660 10/553 [..............................] 10/553 [..............................] - ETA: 3:25:13 - loss: 4.0398 - acc: 0.0781 11/553 [..............................] 11/553 [..............................] - ETA: 3:24:34 - loss: 3.9646 - acc: 0.0938 12/553 [..............................] 12/553 [..............................] - ETA: 3:23:53 - loss: 3.9061 - acc: 0.1016 13/553 [..............................] 13/553 [..............................] - ETA: 3:23:22 - loss: 3.8396 - acc: 0.1058 14/553 [..............................] 14/553 [..............................] - ETA: 3:23:11 - loss: 3.8066 - acc: 0.1049 15/553 [..............................] 15/553 [..............................] - ETA: 3:22:40 - loss: 3.7849 - acc: 0.1083 16/553 [..............................] 16/553 [..............................] - ETA: 3:22:11 - loss: 3.8373 - acc: 0.1074 17/553 [..............................] 17/553 [..............................] - ETA: 3:21:36 - loss: 3.8707 - acc: 0.1066 18/553 [..............................] 18/553 [..............................] - ETA: 3:21:08 - loss: 3.9117 - acc: 0.1076 19/553 [>.............................] 19/553 [>.............................] - ETA: 3:20:49 - loss: 3.9067 - acc: 0.1086 20/553 [>.............................] 20/553 [>.............................] - ETA: 3:20:25 - loss: 3.8696 - acc: 0.1109 21/553 [>.............................] 21/553 [>.............................] - ETA: 3:19:59 - loss: 3.8614 - acc: 0.1176 22/553 [>.............................] 22/553 [>.............................] - ETA: 3:19:36 - loss: 3.8433 - acc: 0.1151 23/553 [>.............................] 23/553 [>.............................] - ETA: 3:19:07 - loss: 3.8473 - acc: 0.1114 24/553 [>.............................] 24/553 [>.............................] - ETA: 3:18:44 - loss: 3.8672 - acc: 0.1068 25/553 [>.............................] 25/553 [>.............................] - ETA: 3:18:22 - loss: 3.8924 - acc: 0.1062 26/553 [>.............................] 26/553 [>.............................] - ETA: 3:17:58 - loss: 3.9009 - acc: 0.1046 27/553 [>.............................] 27/553 [>.............................] - ETA: 3:17:35 - loss: 3.8898 - acc: 0.1007 28/553 [>.............................] 28/553 [>.............................] - ETA: 3:17:10 - loss: 3.8820 - acc: 0.0982 29/553 [>.............................] 29/553 [>.............................] 
- ETA: 3:16:51 - loss: 3.8991 - acc: 0.0981 30/553 [>.............................] 30/553 [>.............................] - ETA: 3:16:32 - loss: 3.8525 - acc: 0.0990 31/553 [>.............................] 31/553 [>.............................] - ETA: 3:16:10 - loss: 3.8414 - acc: 0.0978 32/553 [>.............................] 32/553 [>.............................] - ETA: 3:15:48 - loss: 3.8234 - acc: 0.0996 33/553 [>.............................] 33/553 [>.............................] - ETA: 3:15:24 - loss: 3.8328 - acc: 0.0985 34/553 [>.............................] 34/553 [>.............................] - ETA: 3:15:00 - loss: 3.8087 - acc: 0.0993 35/553 [>.............................] 35/553 [>.............................] - ETA: 3:14:38 - loss: 3.7820 - acc: 0.0982 36/553 [>.............................] 36/553 [>.............................] - ETA: 3:14:16 - loss: 3.7831 - acc: 0.0972 37/553 [=>............................] 37/553 [=>............................] - ETA: 3:13:55 - loss: 3.7822 - acc: 0.0954 38/553 [=>............................] 38/553 [=>............................] - ETA: 3:13:32 - loss: 3.7811 - acc: 0.0979 39/553 [=>............................] 39/553 [=>............................] - ETA: 3:13:09 - loss: 3.7867 - acc: 0.0970 40/553 [=>............................] 40/553 [=>............................] - ETA: 3:12:48 - loss: 3.7895 - acc: 0.0992 41/553 [=>............................] 41/553 [=>............................] - ETA: 3:12:28 - loss: 3.7694 - acc: 0.0998 42/553 [=>............................] 42/553 [=>............................] - ETA: 3:12:05 - loss: 3.7535 - acc: 0.1004 43/553 [=>............................] 43/553 [=>............................] - ETA: 3:11:43 - loss: 3.7565 - acc: 0.1003 44/553 [=>............................] 44/553 [=>............................] - ETA: 3:11:20 - loss: 3.7545 - acc: 0.1001 45/553 [=>............................] 45/553 [=>............................] - ETA: 3:10:59 - loss: 3.7627 - acc: 0.1000 46/553 [=>............................] 46/553 [=>............................] - ETA: 3:10:38 - loss: 3.7687 - acc: 0.0985 47/553 [=>............................] 47/553 [=>............................] - ETA: 3:10:16 - loss: 3.7732 - acc: 0.1004 48/553 [=>............................] 48/553 [=>............................] - ETA: 3:09:54 - loss: 3.7784 - acc: 0.0983 49/553 [=>............................] 49/553 [=>............................] - ETA: 3:09:32 - loss: 3.7661 - acc: 0.0995 50/553 [=>............................] 50/553 [=>............................] - ETA: 3:09:11 - loss: 3.7698 - acc: 0.0981 51/553 [=>............................] 51/553 [=>............................] - ETA: 3:08:50 - loss: 3.7500 - acc: 0.0980 52/553 [=>............................] 52/553 [=>............................] - ETA: 3:08:29 - loss: 3.7386 - acc: 0.0986 53/553 [=>............................] 53/553 [=>............................] 
 - ETA: 3:08:09 - loss: 3.7483 - acc: 0.0985
Traceback (most recent call last):
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\utils\data_utils.py", line 560, in get
    inputs = self.queue.get(block=True).get()
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\multiprocessing\pool.py", line 608, in get
    raise self._value
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\multiprocessing\pool.py", line 119, in worker
    result = (True, func(*args, **kwds))
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\utils\data_utils.py", line 402, in get_index
    return _SHARED_SEQUENCES[uid][i]
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\preprocessing\image.py", line 835, in __getitem__
    return self._get_batches_of_transformed_samples(index_array)
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\preprocessing\image.py", line 1205, in _get_batches_of_transformed_samples
    interpolation=self.interpolation)
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\preprocessing\image.py", line 386, in load_img
    img = pil_image.open(path)
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\PIL\Image.py", line 2349, in open
    % (filename if filename else fp))
OSError: cannot identify image file 'C:\Images\Lebensmittel\train\Birne\Birne897.jpg'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\Users\christoph.bo\Desktop\ImageAI-master\naiboo.py", line 7, in <module>
    model_trainer.trainModel(num_objects=17, num_experiments=100, enhance_data=True, batch_size=32, show_network_summary=True, save_full_model=True)
  File "C:\Users\christoph.bo\Desktop\ImageAI-master\imageai\Prediction\Custom\__init__.py", line 342, in trainModel
    validation_steps=int(num_test / batch_size), callbacks=[checkpoint, lr_scheduler, tensorboard])
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\engine\training.py", line 2143, in fit_generator
    generator_output = next(output_generator)
  File "C:\Users\christoph.bo\AppData\Local\Programs\Python\Python35\lib\site-packages\tensorflow\python\keras_impl\keras\utils\data_utils.py", line 566, in get
    six.raise_from(StopIteration(e), e)
  File "<string>", line 3, in raise_from
StopIteration: cannot identify image file 'C:\Images\Lebensmittel\train\Birne\Birne897.jpg'
```
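Note that `OSError: cannot identify image file` typically means the file is present but cannot be decoded as an image (corrupt, truncated, or not really a JPEG), rather than missing. A minimal sketch for locating such files before training, assuming the dataset path from the post above:

```python
# Minimal sketch: walk the training folder and report every file that Pillow
# cannot open (e.g. a truncated Birne897.jpg). Path assumed from the post above.
import os
from PIL import Image

data_dir = r"C:\Images\Lebensmittel\train"

for root, _, files in os.walk(data_dir):
    for name in files:
        path = os.path.join(root, name)
        try:
            Image.open(path).verify()  # raises if the file is not a readable image
        except Exception as err:
            print("Unreadable image:", path, "-", err)
            # os.remove(path)  # uncomment to delete broken files before training
```

Deleting or re-exporting whatever this reports should let the generator get past the error.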

rola93 commented 4 years ago

> It is looking for an image that is no longer in the directory

There is a cache directory you must remove; it's literally called `cache`, and it's in the training directory.
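If that stale cache is what keeps referencing the deleted images, something like the sketch below should clear it. The folder name and location are assumptions taken from this comment, so check what actually sits in your data directory first.

```python
# Hedged sketch: remove a stale "cache" folder from the data directory.
# The folder name and location are assumptions based on the comment above.
import os
import shutil

cache_dir = os.path.join(r"C:\Images\Lebensmittel", "cache")  # assumed location

if os.path.isdir(cache_dir):
    shutil.rmtree(cache_dir)
    print("Removed", cache_dir)
else:
    print("No cache directory found at", cache_dir)
```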