google-coral / pycoral

Python API for ML inferencing and transfer-learning on Coral devices
https://coral.ai
Apache License 2.0
347 stars 144 forks

Runtime error There is at least 1 reference to internal data in the interpreter in the form of a numpy array or slice. #76

Closed scho0ck closed 2 years ago

scho0ck commented 2 years ago

Description

I'm getting the following error from my Python Flask app, which exposes the interpreter as a web-based service.

RuntimeError: There is at least 1 reference to internal data in the interpreter in the form of a numpy array or slice. Be sure to only hold the function returned from tensor() if you are using raw data access.

This issue occurs when I'm sending images from multiple sources. Interestingly enough, the predictions are still valid, but the error is raised and I have no idea how to solve it.
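For context, the message refers to the raw data access pattern of `Interpreter.tensor()` in `tflite_runtime`; a minimal sketch of what it warns about (hypothetical `interpreter` and `input_data`, not the app code below) would be:

```python
import numpy as np

# Sketch only: assumes `interpreter` is an already-allocated tflite_runtime Interpreter.
input_details = interpreter.get_input_details()[0]
input_data = np.zeros(input_details["shape"], dtype=input_details["dtype"])

# tensor() returns a *function*; it is safe to keep this around.
input_fn = interpreter.tensor(input_details["index"])

view = input_fn()         # a numpy view into the interpreter's internal buffer
# interpreter.invoke()    # would raise the RuntimeError above while `view` is alive
del view                  # dropping the view makes invoke() safe again

input_fn()[:] = input_data  # safe pattern: write through a temporary view, keep nothing
interpreter.invoke()
```

Nothing in the handler below deliberately stores such a view, which is presumably why the error only shows up under concurrent requests (see the replies).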

I'm running the following code:

```python
# Start the server:
#   python3 coral-app.py
# Submit a request via cURL:
#   curl -X POST -F image=@images/test-image3.jpg 'http://localhost:5000/v1/vision/detection'

import argparse
import io
import os
import logging
import time

import flask
from PIL import Image
from pycoral.adapters import detect, common
from pycoral.utils import dataset, edgetpu

app = flask.Flask(__name__)

LOGFORMAT = "%(asctime)s %(levelname)s %(name)s %(threadName)s : %(message)s"
logging.basicConfig(filename="coral.log", level=logging.DEBUG, format=LOGFORMAT)
stderrLogger = logging.StreamHandler()
stderrLogger.setFormatter(logging.Formatter(logging.BASIC_FORMAT))
logging.getLogger().addHandler(stderrLogger)

DEFAULT_MODELS_DIRECTORY = "models"
DEFAULT_MODEL = "ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite"
DEFAULT_LABELS = "coco_labels.txt"

ROOT_URL = "/v1/vision/detection"

@app.route("/")
def info():
    info_str = "Flask app exposing tensorflow lite model {}".format(MODEL)
    return info_str

@app.route(ROOT_URL, methods=["POST"])
def predict():
    data = {"success": False}
    if flask.request.method == "POST":
        if flask.request.files.get("image"):
            image_file = flask.request.files["image"]
            image_bytes = image_file.read()
            image = Image.open(io.BytesIO(image_bytes))

            _, scale = common.set_resized_input(
                interpreter, image.size, lambda size: image.resize(size, Image.ANTIALIAS))
            #start inference
            start = time.perf_counter()
            interpreter.invoke()
            inference_time = time.perf_counter() - start
            objs = detect.get_objects(interpreter, threshold, scale)
            app.logger.debug('Detection time %.2f ms' % (inference_time * 1000))

            if not objs:
                app.logger.info('No detections in image')
            if objs:
                data["success"] = True
                preds = []

                for obj in objs:
                    preds.append(
                        {
                            "confidence": float(obj.score),
                            "label": labels[obj.id],
                            "y_min": int(obj.bbox[1]),
                            "x_min": int(obj.bbox[0]),
                            "y_max": int(obj.bbox[3]),
                            "x_max": int(obj.bbox[2]),
                        }
                    )
                data["predictions"] = preds

    # return the data dictionary as a JSON response
    return flask.jsonify(data)

if __name__ == "__main__":
    parser = argparse.ArgumentParser(description="Flask app exposing coral USB stick")
    parser.add_argument(
        "--models_directory",
        default=DEFAULT_MODELS_DIRECTORY,
        help="the directory containing the model & labels files",
    )
    parser.add_argument(
        "--model",
        default=DEFAULT_MODEL,
        help="model file",
    )
    parser.add_argument(
        '--threshold', type=float, default=0.4,
        help='Classification score threshold')
    parser.add_argument("--labels", default=DEFAULT_LABELS, 
        help="labels file of model"
    )
    parser.add_argument("--port", default=5000, type=int, 
        help="port number"
    )
    args = parser.parse_args()

    global MODEL
    MODEL = args.model
    model_file = os.path.join(args.models_directory, args.model)
    assert os.path.isfile(model_file)

    labels_file = os.path.join(args.models_directory, args.labels)
    assert os.path.isfile(labels_file)

    global labels
    labels = dataset.read_label_file(labels_file)

    global threshold
    threshold = args.threshold

    global interpreter
    interpreter = edgetpu.make_interpreter(model_file)
    interpreter.allocate_tensors()
    app.logger.info("Initialised interpreter with model : {}".format(model_file))
    app.logger.info('Note: The first inference is slow because it includes loading the model into Edge TPU memory')

    app.run(host="0.0.0.0", debug=True, port=args.port, use_reloader=False)
```
### Issue Type

Support

### Operating System

Linux

### Coral Device

USB Accelerator

### Other Devices

Raspberry Pi 4

### Programming Language

Python 3.6

### Relevant Log Output

```shell
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 2091, in __call__
    return self.wsgi_app(environ, start_response)
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 2076, in wsgi_app
    response = self.handle_exception(e)
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 2073, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1518, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1516, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.6/dist-packages/flask/app.py", line 1502, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**req.view_args)
  File "/app/coral-app.py", line 52, in predict
    interpreter.invoke()
  File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 832, in invoke
    self._ensure_safe()
  File "/usr/lib/python3/dist-packages/tflite_runtime/interpreter.py", line 436, in _ensure_safe
    data access.""")
RuntimeError: There is at least 1 reference to internal data in the interpreter in the form of a numpy array or slice. Be sure to only hold the function returned from tensor() if you are using raw data access.
```
hjonnala commented 2 years ago

@scho0ck You have to make sure all the requests are processed one by one. Can you please try the solution (threading.Semaphore()) provided here: https://stackoverflow.com/questions/42325105/flask-processing-requests-1-by-1
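A minimal sketch of that approach adapted to the handler above (the lock name and its placement are illustrative, not taken from the linked answer):

```python
import threading

# Binary semaphore acting as a mutex: only one request may use the
# Edge TPU interpreter at a time.
interpreter_lock = threading.Semaphore(1)

@app.route(ROOT_URL, methods=["POST"])
def predict():
    data = {"success": False}
    if flask.request.method == "POST" and flask.request.files.get("image"):
        image = Image.open(io.BytesIO(flask.request.files["image"].read()))

        # Serialize all interpreter access: concurrent Flask threads queue here
        # instead of touching the interpreter while another request still holds
        # references to its internal tensors.
        with interpreter_lock:
            _, scale = common.set_resized_input(
                interpreter, image.size,
                lambda size: image.resize(size, Image.ANTIALIAS))
            interpreter.invoke()
            objs = detect.get_objects(interpreter, threshold, scale)

        # ... build data["predictions"] from objs exactly as in the original handler ...
    return flask.jsonify(data)
```

Any mutual exclusion primitive works here; a `threading.Lock()` would do the same job. The key point is that `set_resized_input()`, `invoke()` and `get_objects()` run as one critical section.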

scho0ck commented 2 years ago

@hjonnala Thanks a lot, this solved my issue.

google-coral-bot[bot] commented 2 years ago

Are you satisfied with the resolution of your issue?

ZhalaBaghirova commented 4 months ago

@hjonnala @scho0ck It solved my issue as well, in Django. Thanks!!!