
mobilenet_v1_1.0_224_quant.tflite vs mobilenet_v1_1.0_224.tflite for inference #34

Open lingjiekong opened 4 years ago

lingjiekong commented 4 years ago

In "Identify objects in images using custom machine learning models with ML Kit for Firebase" tutorial https://codelabs.developers.google.com/codelabs/mlkit-android-custom-model/index.html?index=..%2F..index#1

There is a step to unpack the downloaded zip file. This unpacks a root folder (mobilenet_v1_1.0_224_quant) containing the TensorFlow Lite custom model used in the codelab (mobilenet_v1_1.0_224_quant.tflite).

It looks like mobilenet_v1_1.0_224_quant.tflite runs inference with no problem. However, if I download mobilenet_v1_1.0_224.tflite from the MobileNet v1 page (https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet_v1.md), inference gets stuck at val inferenceOutput = it.result?.getOutput<Array<ByteArray>>(0)!!

Is there a reason why mobilenet_v1_1.0_224.tflite does not work for inference with the current code base, and how can the code base be updated to get mobilenet_v1_1.0_224.tflite working?

lingjiekong commented 4 years ago

Based on this old issue (https://github.com/googlecodelabs/mlkit-android/issues/5), as well as this Stack Overflow question (https://stackoverflow.com/questions/50923996/changes-required-for-using-non-quantized-tflite-files-in-mainactivity-java), I made the change below to allocate 4 bytes per float when using mobilenet_v1_1.0_224.tflite.

private fun convertBitmapToByteBuffer(bitmap: Bitmap): ByteBuffer {
    // Allocate 4 bytes per channel value, since the float model expects
    // 32-bit floats rather than the single bytes used by the quantized model.
    val imgData = ByteBuffer.allocateDirect(
        4 * DIM_BATCH_SIZE * DIM_IMG_SIZE_X * DIM_IMG_SIZE_Y * DIM_PIXEL_SIZE).apply {
        order(ByteOrder.nativeOrder())
        rewind()
    }
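    // (Sketch, not part of the original snippet: for a float model, the rest
    // of the method would also have to write each pixel as normalized 32-bit
    // floats instead of single bytes. IMAGE_MEAN and IMAGE_STD are assumed
    // constants here; float MobileNet v1 is commonly normalized with 127.5f.)
    val pixels = IntArray(DIM_IMG_SIZE_X * DIM_IMG_SIZE_Y)
    bitmap.getPixels(pixels, 0, bitmap.width, 0, 0, bitmap.width, bitmap.height)
    for (pixel in pixels) {
        imgData.putFloat(((pixel shr 16 and 0xFF) - IMAGE_MEAN) / IMAGE_STD) // R
        imgData.putFloat(((pixel shr 8 and 0xFF) - IMAGE_MEAN) / IMAGE_STD)  // G
        imgData.putFloat(((pixel and 0xFF) - IMAGE_MEAN) / IMAGE_STD)        // B
    }
    return imgData
}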

However, I am still running into the same issue: inference gets stuck at

val inferenceOutput = it.result?.getOutput<Array<ByteArray>>(0)!!

Is there anything else that I need to change in order to use mobilenet_v1_1.0_224.tflite?
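With a float model, the output is no longer a byte array, so the input and output would also need to be registered as FLOAT32 and the result read as Array<FloatArray>. Below is a minimal sketch under those assumptions (the interpreter setup and the 1001-class MobileNet output shape are assumptions, not taken from the codelab code):

// Sketch only: float-model I/O for the Firebase ML custom model interpreter
// (classes from com.google.firebase.ml.custom in firebase-ml-model-interpreter).
// `interpreter` (FirebaseModelInterpreter) and `bitmap` are assumed to be set
// up as in the codelab; the 1001-class shape matches standard MobileNet v1.
val inputOutputOptions = FirebaseModelInputOutputOptions.Builder()
    .setInputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 224, 224, 3))
    .setOutputFormat(0, FirebaseModelDataType.FLOAT32, intArrayOf(1, 1001))
    .build()

val inputs = FirebaseModelInputs.Builder()
    .add(convertBitmapToByteBuffer(bitmap))
    .build()

interpreter.run(inputs, inputOutputOptions)
    .addOnCompleteListener {
        // A float model returns its probabilities as floats, not bytes.
        val inferenceOutput = it.result?.getOutput<Array<FloatArray>>(0)!!
        val probabilities = inferenceOutput[0]
    }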