tensorflow / tflite-support

TFLite Support is a toolkit that helps users develop ML and deploy TFLite models onto mobile / IoT devices.
Apache License 2.0

What is the meaning of score (e.g. 8.109188)? #450

Closed ben-xD closed 3 years ago

ben-xD commented 3 years ago

I am trying to understand what score means; it's not a probability, as you can see in the debugger in my IDE.

For example, shoe shop has a score of 8.109188. What does that mean? Thank you.

(Screenshot attached: 2021-04-12 at 12.23.29)

ben-xD commented 3 years ago

I noticed from here

  /**
   * Constructs a {@link Category} object.
   *
   * @param label the label of this category object
   * @param displayName the display name of the label, which may be translated for different
   *     locales. For example, a label, "apple", may be translated into Spanish for display purposes,
   *     so that the displayName is "manzana".
   * @param score the probability score of this label category
   * @param index the index of the label in the corresponding label file
   */

I guess it's meant to be a probability, but a value like 8.109188 is far too large to be one.

Note also that the released code is outdated (0.1.0, the latest release, doesn't have index yet).
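
For context, here is a minimal sketch of where these `Category` scores come from, assuming the standard Task Library Java API on Android; `context`, `bitmap`, and the model file name are placeholders:

```java
import android.content.Context;
import android.graphics.Bitmap;
import android.util.Log;
import java.io.IOException;
import java.util.List;
import org.tensorflow.lite.support.image.TensorImage;
import org.tensorflow.lite.support.label.Category;
import org.tensorflow.lite.task.vision.classifier.Classifications;
import org.tensorflow.lite.task.vision.classifier.ImageClassifier;

/** Minimal sketch: logging Category scores from the Task Library ImageClassifier. */
public final class ScoreLogger {
  public static void logCategoryScores(Context context, Bitmap bitmap) throws IOException {
    // Placeholder file name for the quantized MobileNet model from TF Hub.
    ImageClassifier classifier =
        ImageClassifier.createFromFile(context, "mobilenet_v2_1.0_224_quantized_metadata.tflite");
    List<Classifications> results = classifier.classify(TensorImage.fromBitmap(bitmap));
    for (Classifications classifications : results) {
      for (Category category : classifications.getCategories()) {
        // For this model, getScore() returns the raw model output (where values
        // like 8.109188 come from), not a [0, 1] probability.
        Log.d("Classifier", category.getLabel() + ": " + category.getScore());
      }
    }
  }
}
```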

ben-xD commented 3 years ago

Oh, I see what's happening: it looks like it doesn't fully support quantized models, e.g. this one: https://tfhub.dev/tensorflow/lite-model/mobilenet_v2_1.0_224_quantized/1/metadata/1

It looks like this library doesn't support quantized models: the output of a quantized model is 0-255, and it fails to convert this into probabilities. I am using the TensorFlow Lite Task Library (ImageClassifier).

lu-wang-g commented 3 years ago

The Task Library should support quantized models very well, but this model looks weird. The output range is [-5.7, 0.01], where it should be something like [0, 1]. I'll ask internally what's going on here.

ben-xD commented 3 years ago

Thanks @lu-wang-g

I'm curious where you got the [-5.7, 0.01] range; I think the max value is 19.48, not 0.01?

I did a small bit of analysis. When looking at the model in Netron, the output quantization shows: `-5.735767364501953 ≤ 0.09889253973960876 * (q - 58) ≤ 19.481830596923828`. I learnt that the syntax is `q_min ≤ q_scale * (q - q_zero_point) ≤ q_max`. I'm not able to see the googleplex screenshot link you added: https://screenshot.googleplex.com/BoG2GnB2oLnP2P2.png

Because q can be 0 to 255, the dequantized outputs range from about -5.74 to 19.48, so they can't be probabilities.
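
As a quick sanity check, here is a tiny sketch that plugs the scale and zero point reported by Netron into that formula (the values are copied from the quantization parameters above; nothing here is Task Library API):

```java
/** Minimal sketch: the dequantized output range implied by Netron's quantization params. */
public final class DequantRange {
  public static void main(String[] args) {
    float scale = 0.09889253973960876f; // q_scale from Netron
    int zeroPoint = 58;                 // q_zero_point from Netron
    // real = q_scale * (q - q_zero_point), with q in [0, 255] for a uint8 output.
    float min = scale * (0 - zeroPoint);   // ≈ -5.7358
    float max = scale * (255 - zeroPoint); // ≈ 19.4818
    System.out.println("output range: [" + min + ", " + max + "]");
  }
}
```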

lu-wang-g commented 3 years ago

You're right about the value range. My screenshot also came from Netron, but it only snipped part of the screen.

lu-wang-g commented 3 years ago

The output of https://tfhub.dev/tensorflow/lite-model/mobilenet_v2_1.0_224_quantized/1/metadata/1 is not a probability but the logits before softmax, so the range is not [0, 1]. You can apply softmax to the result as needed, or just use it as is to indicate the relative likelihood of a class. I've created an internal bug to track this, and we'll update the model's documentation.
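
For anyone else hitting this: a minimal sketch of applying softmax to the returned scores yourself (the extra logit values alongside 8.109188 are made up for illustration; this is not part of the Task Library API):

```java
import java.util.Arrays;

/** Minimal sketch: converting logits returned by the classifier into probabilities. */
public final class SoftmaxExample {
  static float[] softmax(float[] logits) {
    // Subtract the max logit for numerical stability before exponentiating.
    float max = Float.NEGATIVE_INFINITY;
    for (float l : logits) {
      max = Math.max(max, l);
    }
    double sum = 0.0;
    double[] exps = new double[logits.length];
    for (int i = 0; i < logits.length; i++) {
      exps[i] = Math.exp(logits[i] - max);
      sum += exps[i];
    }
    float[] probs = new float[logits.length];
    for (int i = 0; i < logits.length; i++) {
      probs[i] = (float) (exps[i] / sum);
    }
    return probs;
  }

  public static void main(String[] args) {
    float[] logits = {8.109188f, 2.3f, -1.0f}; // hypothetical logit values
    System.out.println(Arrays.toString(softmax(logits)));
  }
}
```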