IBM / tfjs-web-app

A TensorFlow.js Progressive Web App for Offline Visual Recognition
Apache License 2.0

Warning: Each child in a list should have a unique "key" prop. When I run inference in the browser. #17

Closed KudzayiKing closed 3 years ago

KudzayiKing commented 3 years ago

Hi! I made a plant leaf disease classifier using transfer learning with MobileNet. I saved my model as .h5 and then converted it to TensorFlow.js. In the app I changed classes.js to fit the softmax output of my model, which has 38 classes. When I run the app it does not show the classes, it just shows the probabilities, but interestingly when I use classes.js with the ImageNet classes, the classes appear in the app. Where am I going wrong? I have attached files below. @pvaneck @xgqfrms @autumnblue @joshpitzalis @jdhiro @keithort @Diaver

app error 1
app error 2

classify.js:

```js
const MODEL_PATH = '/model/model.json';
const IMAGE_SIZE = 224;
const CANVAS_SIZE = 224;
const TOPK_PREDICTIONS = 5;

const INDEXEDDB_DB = 'tensorflowjs';
const INDEXEDDB_STORE = 'model_info_store';
const INDEXEDDB_KEY = 'web-model';
```

classes.js:

```js
/* eslint-disable */
// These classes should correspond to the softmax output of your model.
export const MODEL_CLASSES = {
  0: 'Apple___Apple_scab', 1: 'Apple___Black_rot', 2: 'Apple___Cedar_apple_rust',
  3: 'Apple___healthy', 4: 'Blueberry___healthy',
  5: 'Cherry_(including_sour)___Powdery_mildew', 6: 'Cherry_(including_sour)___healthy',
  7: 'Corn_(maize)___Cercospora_leaf_spot Gray_leaf_spot', 8: 'Corn_(maize)___Common_rust',
  9: 'Corn_(maize)___Northern_Leaf_Blight', 10: 'Corn_(maize)___healthy',
  11: 'Grape___Black_rot', 12: 'Grape___Esca_(Black_Measles)',
  13: 'Grape___Leaf_blight_(Isariopsis_Leaf_Spot)', 14: 'Grape___healthy',
  15: 'Orange___Haunglongbing_(Citrus_greening)', 16: 'Peach___Bacterial_spot',
  17: 'Peach___healthy', 18: 'Pepper,_bell___Bacterial_spot', 19: 'Pepper,_bell___healthy',
  20: 'Potato___Early_blight', 21: 'Potato___Late_blight', 22: 'Potato___healthy',
  23: 'Raspberry___healthy', 24: 'Soybean___healthy', 25: 'Squash___Powdery_mildew',
  26: 'Strawberry___Leaf_scorch', 27: 'Strawberry___healthy', 28: 'Tomato___Bacterial_spot',
  29: 'Tomato___Early_blight', 30: 'Tomato___Late_blight', 31: 'Tomato___Leaf_Mold',
  32: 'Tomato___Septoria_leaf_spot', 33: 'Tomato___Spider_mites Two-spotted_spider_mite',
  34: 'Tomato___Target_Spot', 35: 'Tomato___Tomato_Yellow_Leaf_Curl_Virus',
  36: 'Tomato___Tomato_mosaic_virus', 37: 'Tomato___healthy',
};
```

pvaneck commented 3 years ago

In the getTopKClasses function, can you add a console.log(values.length);? The error you see typically indicates that the class corresponding to the index is undefined. This generally happens when there is a mismatch between the length of the softmax output array and the number of possible classes. So the value you should see from console.log(values.length); should be 38, assuming the app is loading the correct model and the model was trained so that its output shape is 38.
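For context, here is a rough sketch of where that log goes. This is a paraphrase, not the exact getTopKClasses in this repo, and it assumes the function receives the prediction tensor and maps indices through MODEL_CLASSES (import path assumed):

```js
import { MODEL_CLASSES } from './classes';

// Sketch only: the real implementation in src/ may differ in details.
async function getTopKClasses(logits, topK) {
  const values = await logits.data();
  console.log(values.length); // should print 38 for the plant-leaf model

  const valuesAndIndices = [];
  for (let i = 0; i < values.length; i++) {
    valuesAndIndices.push({ value: values[i], index: i });
  }
  valuesAndIndices.sort((a, b) => b.value - a.value);

  // If values.length is 1000 but MODEL_CLASSES only has 38 entries,
  // most of these className lookups come back undefined.
  return valuesAndIndices.slice(0, topK).map((v) => ({
    className: MODEL_CLASSES[v.index],
    probability: v.value,
  }));
}
```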

KudzayiKing commented 3 years ago

@pvaneck Thank you for the quick reply; the console shows 1000.

pvaneck commented 3 years ago

Sounds like you are still using the original MobileNet model, since the model's prediction output is giving you 1000 class probabilities. Double-check that the updated model files are in the public/model directory. It might also be worthwhile to double-check the output of your Keras model in Python to ensure that the output size is 38.

KudzayiKing commented 3 years ago

@pvaneck I ran model.summary() on the model.h5; the output is 38, and yes, I updated the files in public/model and scr/public/model/classes.js. I also saved one version of the model with ImageDataGenerator class_mode="categorical" and loss="CategoricalCrossentropy", and another one with class_mode="sparse" and loss="SparseCategoricalCrossentropy". Nothing changed; neither version showed the classes.

app error 4

KudzayiKing commented 3 years ago

@pvaneck Another thing I noticed: inside {this.state.predictions.map((category) => {, adding console.log('category:', category) shows this in the console: app error 5
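That undefined category is likely also what triggers the "unique key prop" warning in the issue title. Illustratively (the app's actual JSX differs, and the key choice here is an assumption):

```jsx
{this.state.predictions.map((category) => (
  // If category.className is undefined for every prediction, each child ends up
  // with the same missing key, so React warns about non-unique keys and no
  // class name text is rendered.
  <li key={category.className}>
    {category.className}: {category.probability}
  </li>
))}
```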

pvaneck commented 3 years ago

Can you also do a this.model.summary() somewhere and see what the output shape is? You can just put it near where you put console.log(values.length);. If the model is correct and was converted properly, the output shape should be [null, 38]. A shape of [null, 1000] seems to indicate that the original MobileNet with the 1000 ImageNet classes is being used.
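A minimal sketch of where that call could go, assuming the model is loaded with tf.loadLayersModel (the app may load it from IndexedDB instead, in which case put the summary after whichever load succeeds):

```js
// Hypothetical placement inside the Classify component's model-loading code.
this.model = await tf.loadLayersModel(MODEL_PATH);
this.model.summary(); // the last layer's output shape should be [null, 38]
```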

KudzayiKing commented 3 years ago

@pvaneck Yes, your suspicion is right! It shows [null,1000] in the console. Guess I need to get back to saving the model. Is this caused by tfjs-converter, or do I need to specifically save the model with signatures specified during export? https://www.tensorflow.org/guide/saved_model#specifying_signatures_during_export MODEL OUTPUT

pvaneck commented 3 years ago

You were exporting your model in the h5 format, right? Those signatures only apply to the TF SavedModel format. Converting an h5 file is outlined here: https://www.tensorflow.org/js/tutorials/conversion/import_keras (just make sure you use the correct plant-leaf model when converting via tensorflowjs_converter --input_format=keras ./path/to/plant-leaf-model.h5 ./my-model).

KudzayiKing commented 3 years ago

@pvaneck Yes, I am exporting to h5 and then converting it to TF.js directly using tfjs.converters.save_keras_model(model, tfjs_target_dir), and I am also saving the h5 and then converting it elsewhere. I have uploaded the h5 file to cross-check the output shape; what I am getting is [null,38] MODEL OUTPUT 2

And this is the h5 I have converted and put in my public/model. The console still shows [null,1000] after running Classify. console output

So I can't understand where things are going wrong.

KudzayiKing commented 3 years ago

@pvaneck I was just going through some test images; a couple of images actually have their classes appearing in the app, as in the picture below. Why am I getting the feeling that the converter is not converting the model properly? Maybe I should use a previous version of the converter? console output4

pvaneck commented 3 years ago

Can you show the Python code that constructs and saves your h5 model? That model.summary() output in the console looks like only the base MobileNet model layers.

I tried reconstructing a model similar to yours:

```python
import tensorflow as tf
import tensorflowjs as tfjs

base_model = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3),
    include_top=False, weights='imagenet',
    classifier_activation='softmax'
)
base_model.trainable = False

input_1 = tf.keras.layers.Input(shape=(224, 224, 3))
x = base_model(input_1)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
x = tf.keras.layers.Dropout(0)(x)
x = tf.keras.layers.Dense(38, activation='softmax')(x)

model = tf.keras.models.Model(inputs=input_1, outputs=x)
model.summary()
tfjs.converters.save_keras_model(model, './tfjs-output')
```

This gives a model summary just like yours:

Model: "model"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
input_2 (InputLayer)         [(None, 224, 224, 3)]     0         
_________________________________________________________________
mobilenet_1.00_224 (Function (None, 7, 7, 1024)        3228864   
_________________________________________________________________
global_average_pooling2d (Gl (None, 1024)              0         
_________________________________________________________________
dropout (Dropout)            (None, 1024)              0         
_________________________________________________________________
dense (Dense)                (None, 38)                38950     
=================================================================
Total params: 3,267,814
Trainable params: 38,950
Non-trainable params: 3,228,864

The converted model loaded in TensorFlow.js in the browser looks to be the same, as expected:

```
_________________________________________________________________
tfjs@2.4.0:17 Layer (type)                 Output shape              Param #
tfjs@2.4.0:17 =================================================================
tfjs@2.4.0:17 input_2 (InputLayer)         [null,224,224,3]          0
tfjs@2.4.0:17 _________________________________________________________________
tfjs@2.4.0:17 mobilenet_1.00_224 (Function [null,7,7,1024]           3228864
tfjs@2.4.0:17 _________________________________________________________________
tfjs@2.4.0:17 global_average_pooling2d (Gl [null,1024]               0
tfjs@2.4.0:17 _________________________________________________________________
tfjs@2.4.0:17 dropout (Dropout)            [null,1024]               0
tfjs@2.4.0:17 _________________________________________________________________
tfjs@2.4.0:17 dense (Dense)                [null,38]                 38950
tfjs@2.4.0:17 =================================================================
tfjs@2.4.0:17 Total params: 3267814
tfjs@2.4.0:17 Trainable params: 38950
tfjs@2.4.0:17 Non-trainable params: 3228864
tfjs@2.4.0:17 _________________________________________________________________
```

So, I would double check your model saving to ensure you are not just saving the base model.

Sidenote: in trying this, it seems we need a minimum @tensorflow/tfjs version of 2.4.0 (for Functional layer support) but also a version below 2.8.0 (just released, and it seems to introduce a bug that breaks the webcam component of the app). So I will probably submit a PR updating the package.json.

My tests were with tensorflowjs==2.8.0 (Python) and @tensorflow/tfjs==2.7.0 (JavaScript).

KudzayiKing commented 3 years ago

@pvaneck Thanks, here is the full PYTHON CODE, take a look while I cross-check the versions! Let me know if there are any errors in the Python code!

pvaneck commented 3 years ago

@KudzayiKing The code looks fine and should export a working model. Hmm, now that I think about it, perhaps the app keeps using an old model you previously loaded and is not picking up any updated models you put in public/model. By default, the app keeps fetching the saved model from IndexedDB in the browser (this is what gives it offline functionality). To force the app to load the latest model, either update model_info.txt with a newer date (date > model_info.txt) or just delete the tensorflowjs IndexedDB database in the browser. Reloading should then pull in the new model.

image
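For reference, one quick way to clear the cached model is from the browser devtools console; the database name matches the INDEXEDDB_DB constant shown in classify.js above:

```js
// Run on the app's page in the devtools console. Deletes the cached TF.js model
// so the next page load fetches the model files from public/model again.
indexedDB.deleteDatabase('tensorflowjs');
```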

KudzayiKing commented 3 years ago

@pvaneck Yes, it worked! Everything is now working well! I just deleted the tensorflowjs database. Thanks for the support. Please check your email!