tensorflow / models

Models and examples built with TensorFlow

Inference on quantized model #9810

Open ChowderII opened 3 years ago

ChowderII commented 3 years ago

Prerequisites

Please answer the following questions for yourself before submitting an issue.

1. The entire URL of the file you are using

https://github.com/tensorflow/models/tree/master/research/...

2. Describe the bug

The visualization tool does not generate bounding boxes.

3. Steps to reproduce

    # Imports needed by this snippet; helpers such as load_image_into_numpy_array
    # and get_unscaled_image, and the model_dir / path_to_label / data_dir
    # variables, are defined elsewhere in my script.
    import os
    import time

    import matplotlib
    import matplotlib.pyplot as plt
    import numpy as np
    import tensorflow as tf

    from object_detection.utils import label_map_util
    from object_detection.utils import visualization_utils as viz_utils

    # Load the TFLite model and allocate tensors.
    path = os.path.join(model_dir, 'quantized', "model_int8.tflite").replace('\\', '/')
    interpreter = tf.lite.Interpreter(model_path=path)
    interpreter.allocate_tensors()

    # Build the category index used by the visualization utility.
    category_index = label_map_util.create_category_index_from_labelmap(path_to_label, use_display_name=True)

    # Get input and output tensors.
    input_details = interpreter.get_input_details()
    output_details = interpreter.get_output_details()

    # Test the model on input data.
    IMAGE_PATHS = []
    for (root, dirnames, filenames) in os.walk(data_dir):
        for image in filenames:
            if image.split('.')[-1] == "jpg":
                IMAGE_PATHS.append(os.path.join(root, image))

    for image_path in IMAGE_PATHS:
        image_np = load_image_into_numpy_array(image_path)
        # Add a batch dimension; the model expects a [1, height, width, 3] tensor.
        input_tensor = np.expand_dims(image_np, 0)
        interpreter.set_tensor(input_details[0]['index'], input_tensor)

        print('Running inference on {} ... '.format(image_path), end='')
        start_time = time.time()
        interpreter.invoke()
        end_time = time.time()
        elapsed_time = end_time - start_time
        print('Done! Took {} seconds'.format(elapsed_time))

        # The function `get_tensor()` returns a copy of the tensor data.
        # Use `tensor()` in order to get a pointer to the tensor.
        detections_boxes = interpreter.get_tensor(output_details[0]['index']).squeeze()
        detections_classes = interpreter.get_tensor(output_details[1]['index']).squeeze()
        detections_scores = interpreter.get_tensor(output_details[2]['index']).squeeze()

        matplotlib.use('TkAgg')
        label_id_offset = 1
        image_np_with_detections = get_unscaled_image(image_path)
        viz_utils.visualize_boxes_and_labels_on_image_array(
            image_np_with_detections,
            detections_boxes,
            detections_classes.astype(np.int32) + label_id_offset,
            detections_scores,
            category_index,
            use_normalized_coordinates=True,
            max_boxes_to_draw=100,
            min_score_thresh=.60,
            agnostic_mode=False)
        plt.figure()
        plt.imshow(image_np_with_detections)
    plt.show()
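
For context, the input and output details returned by the interpreter also carry each tensor's dtype and quantization parameters. A minimal sketch for printing them, assuming the `interpreter` created in the snippet above and standard per-tensor quantization:

    # Print dtype and (scale, zero_point) for every input/output tensor, which
    # helps confirm whether the model really is int8 end to end.
    for detail in interpreter.get_input_details() + interpreter.get_output_details():
        scale, zero_point = detail['quantization']
        print('{}: dtype={}, scale={}, zero_point={}'.format(
            detail['name'], detail['dtype'], scale, zero_point))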

4. Expected behavior

The pictures appear and my output tensors are not empty, but the pictures do not have any bounding boxes drawn on them, even when I set min_score_thresh very low. I'm not sure what I am doing wrong. This script used to work, but I had to reinstall everything due to some version issues.

5. Additional context

Here are the contents of the three `detections_*` variables:

    detections_boxes:
    [[-19 -30  23  49]
     [-19 -33  60  48]
     [ 43  31  49  49]
     [ 45 -13  48   0]
     [ 46 -30  49 -18]
     [  6  22  48  47]
     [ 45  34  49  48]
     [ 45  -4  49   7]
     [ 45  15  48  25]
     [ 44  -6  48   5]]

    detections_classes:
    [-128 -128 127 127 127 127 127 127 127 127]

    detections_scores:
    [ 62  53 -78 -85 -85 -91 -91 -91 -91 -96]
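
These look like raw int8 values read straight from the output tensors. For reference, a minimal sketch of mapping such a tensor back to real values with the scale and zero point reported by the interpreter, assuming standard per-tensor quantization and the variables from the snippet above (`dequantize_output` is a hypothetical helper, not part of my script):

    # real_value = scale * (quantized_value - zero_point)
    def dequantize_output(interpreter, output_detail):
        raw = interpreter.get_tensor(output_detail['index']).squeeze()
        scale, zero_point = output_detail['quantization']
        return scale * (raw.astype(np.float32) - zero_point)

    detections_scores = dequantize_output(interpreter, output_details[2])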

6. System information

judahkshitij commented 2 years ago

Hello @ChowderII, have you found a fix for this issue? I am facing a similar issue where I am getting wrong results (on COCO 2017 validation set images) after converting some models from the TF2 detection model zoo to TFLite format. The models I have tried are:

I followed the guide on running models on mobile devices but am still getting wrong results from the TFLite models (the original models give correct results). If you got this working, any help is appreciated. Thanks.
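
For reference, the conversion flow I followed is roughly the two-step path sketched below (paths and file names are illustrative, not my actual setup):

    # Step 1: export a TFLite-friendly SavedModel with the Object Detection API
    # script, run from the command line:
    #   python export_tflite_graph_tf2.py \
    #     --pipeline_config_path=pipeline.config \
    #     --trained_checkpoint_dir=checkpoint/ \
    #     --output_directory=tflite_export/
    # Step 2: convert that SavedModel to a .tflite flatbuffer.
    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_saved_model('tflite_export/saved_model')
    tflite_model = converter.convert()
    with open('model.tflite', 'wb') as f:
        f.write(tflite_model)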