google-coral / pycoral

Python API for ML inferencing and transfer-learning on Coral devices
https://coral.ai
Apache License 2.0

Error ValueError: Failed to load delegate from edgetpu.dll And No module named 'pycoral.pybind' #48

Closed staebchen0 closed 2 years ago

staebchen0 commented 2 years ago

Description

I have been trying for days to get the Google Coral USB Accelerator running under Windows 10.

Under Visual Studio 2019 I ran install.bat. Result:

Installing UsbDk
Installing Windows drivers
Microsoft PnP Utility

Adding driver package:  coral.inf
Driver package added successfully. (Already present on the system)
Published name:         oem75.inf

Adding driver package:  Coral_USB_Accelerator.inf
Driver package added successfully. (Already present on the system)
Published name:         oem76.inf

Adding driver package:  Coral_USB_Accelerator_(DFU).inf
Driver package added successfully. (Already present on the system)
Published name:         oem77.inf
Driver package installed on device: USB\VID_1A6E&PID_089A\5&32865703&0&17
Driver package installed on device: USB\VID_1A6E&PID_089A\5&32865703&0&18
The driver package on the device is up to date: USB\VID_1A6E&PID_089A\5&32865703&0&19

Total driver packages:    3
Driver packages added:    3
Installing performance counters
Info: Provider {aaa5bf9e-c44b-4177-af65-d3a06ba45fe7}, defined in C:\Users\anja-\Anja_Programme\AnjaCoral\envCoralPy8\edgetpu_runtime\third_party\coral_accelerator_windows\coral.man, is already installed in the system repository.
Info: The performance counters were successfully installed in C:\Users\anja-\Anja_Programme\AnjaCoral\envCoralPy8\edgetpu_runtime\third_party\coral_accelerator_windows\coral.man.
Copying edgetpu and libusb to System32
        1 file(s) copied.
        1 file(s) copied.
Install complete
Press any key . . .

Then I installed the following packages:

Edge TPU Python API: pip install https://dl.google.com/coral/edgetpu_api/edgetpu-2.14.0-cp37-cp37m-win_amd64.whl

tflite runtime: pip install https://github.com/google-coral/pycoral/releases/download/v2.0.0/tflite_runtime-2.5.0.post1-cp37-cp37m-win_amd64.whl

pycoral: pip install https://github.com/google-coral/pycoral/releases/download/v2.0.0/pycoral-2.0.0-cp37-cp37m-win_amd64.whl

Then I ran the classify_image.py example.

Result:
$ python examples/classify_image.py --model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite --labels test_data/inat_bird_labels.txt --input test_data/parrot.jpg
Traceback (most recent call last):
  File "C:\Users\anja-\Anja_Programme\AnjaCoral\envCoralPy8\lib\site-packages\tflite_runtime\interpreter.py", line 160, in load_delegate
    delegate = Delegate(library, options)
  File "C:\Users\anja-\Anja_Programme\AnjaCoral\envCoralPy8\lib\site-packages\tflite_runtime\interpreter.py", line 119, in __init__
    raise ValueError(capture.message)
ValueError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "examples/classify_image.py", line 121, in <module>
    main()
  File "examples/classify_image.py", line 71, in main
    interpreter = make_interpreter(*args.model.split('@'))
  File "C:\Users\anja-\Anja_Programme\AnjaCoral\envCoralPy8\lib\site-packages\pycoral\utils\edgetpu.py", line 87, in make_interpreter
    delegates = [load_edgetpu_delegate({'device': device} if device else {})]
  File "C:\Users\anja-\Anja_Programme\AnjaCoral\envCoralPy8\lib\site-packages\pycoral\utils\edgetpu.py", line 52, in load_edgetpu_delegate
    return tflite.load_delegate(_EDGETPU_SHARED_LIB, options or {})
  File "C:\Users\anja-\Anja_Programme\AnjaCoral\envCoralPy8\lib\site-packages\tflite_runtime\interpreter.py", line 163, in load_delegate
    library, str(e)))
ValueError: Failed to load delegate from edgetpu.dll

(envCoralPy8)

The edgetpu.dll is present in C:\Windows\System32.

What's going on here? Do you have a tip?
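
For completeness, the failing step can be reproduced on its own with a minimal check (a sketch that only tries to load the Edge TPU delegate, nothing else):

import tflite_runtime.interpreter as tflite

# On Windows the delegate library is edgetpu.dll, resolved via the normal DLL
# search path (System32 in this setup). If the runtime cannot be loaded, this
# raises the same ValueError: Failed to load delegate from edgetpu.dll.
delegate = tflite.load_delegate('edgetpu.dll')
print('delegate loaded:', delegate)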

I also tried to run the test script in Visual Studio 2019:

# Lint as: python3
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
"""Utilities for using the TensorFlow Lite Interpreter with Edge TPU."""

import contextlib
import ctypes
import ctypes.util

import numpy as np

# pylint:disable=unused-import
from pycoral.pybind._pywrap_coral import GetRuntimeVersion as get_runtime_version
from pycoral.pybind._pywrap_coral import InvokeWithBytes as invoke_with_bytes
from pycoral.pybind._pywrap_coral import InvokeWithDmaBuffer as invoke_with_dmabuffer
from pycoral.pybind._pywrap_coral import InvokeWithMemBuffer as invoke_with_membuffer
from pycoral.pybind._pywrap_coral import ListEdgeTpus as list_edge_tpus
from pycoral.pybind._pywrap_coral import SetVerbosity as set_verbosity
from pycoral.pybind._pywrap_coral import SupportsDmabuf as supports_dmabuf
import platform
import tflite_runtime.interpreter as tflite

_EDGETPU_SHARED_LIB = {
  'Linux': 'libedgetpu.so.1',
  'Darwin': 'libedgetpu.1.dylib',
  'Windows': 'edgetpu.dll'
}[platform.system()]

def load_edgetpu_delegate(options=None):
  """Loads the Edge TPU delegate with the given options.

  Args:
    options (dict): Options that are passed to the Edge TPU delegate, via
      ``tf.lite.load_delegate``. The only option you should use is
      "device", which defines the Edge TPU to use. Supported values are the same
      as `device` in :func:`make_interpreter`.
  Returns:
    The Edge TPU delegate object.
  """
  return tflite.load_delegate(_EDGETPU_SHARED_LIB, options or {})

def make_interpreter(model_path_or_content, device=None, delegate=None):
  """Creates a new ``tf.lite.Interpreter`` instance using the given model.

  **Note:** If you have multiple Edge TPUs, you should always specify the
  ``device`` argument.

  Args:
     model_path_or_content (str or bytes): `str` object is interpreted as
       model path, `bytes` object is interpreted as model content.
     device (str): The Edge TPU device you want:

       + None      -- use any Edge TPU (this is the default)
       + ":<N>"    -- use N-th Edge TPU (this corresponds to the enumerated
         index position from :func:`list_edge_tpus`)
       + "usb"     -- use any USB Edge TPU
       + "usb:<N>" -- use N-th USB Edge TPU
       + "pci"     -- use any PCIe Edge TPU
       + "pci:<N>" -- use N-th PCIe Edge TPU

       If left as None, you cannot reliably predict which device you'll get.
       So if you have multiple Edge TPUs and want to run a specific model on
       each one, then you must specify the device.
     delegate: A pre-loaded Edge TPU delegate object, as provided by
       :func:`load_edgetpu_delegate`. If provided, the `device` argument
       is ignored.

  Returns:
     New ``tf.lite.Interpreter`` instance.
  """
  if delegate:
    delegates = [delegate]
  else:
    delegates = [load_edgetpu_delegate({'device': device} if device else {})]
  if isinstance(model_path_or_content, bytes):
    return tflite.Interpreter(
        model_content=model_path_or_content, experimental_delegates=delegates)
  else:
    return tflite.Interpreter(
        model_path=model_path_or_content, experimental_delegates=delegates)

# ctypes definition of GstMapInfo. This is a stable API, guaranteed to be
# ABI compatible for any past and future GStreamer 1.0 releases.
# Used to get the underlying memory pointer without any copies, and without
# native library linking against libgstreamer.
class _GstMapInfo(ctypes.Structure):
  _fields_ = [
      ('memory', ctypes.c_void_p),  # GstMemory *memory
      ('flags', ctypes.c_int),  # GstMapFlags flags
      ('data', ctypes.c_void_p),  # guint8 *data
      ('size', ctypes.c_size_t),  # gsize size
      ('maxsize', ctypes.c_size_t),  # gsize maxsize
      ('user_data', ctypes.c_void_p * 4),  # gpointer user_data[4]
      ('_gst_reserved', ctypes.c_void_p * 4)
  ]  # GST_PADDING

# Try to import GStreamer but don't fail if it's not available. If not available
# we're probably not getting GStreamer buffers as input anyway.
_libgst = None
try:
  # pylint:disable=g-import-not-at-top
  import gi
  gi.require_version('Gst', '1.0')
  gi.require_version('GstAllocators', '1.0')
  # pylint:disable=g-multiple-import
  from gi.repository import Gst, GstAllocators
  _libgst = ctypes.CDLL(ctypes.util.find_library('gstreamer-1.0'))
  _libgst.gst_buffer_map.argtypes = [
      ctypes.c_void_p,
      ctypes.POINTER(_GstMapInfo), ctypes.c_int
  ]
  _libgst.gst_buffer_map.restype = ctypes.c_int
  _libgst.gst_buffer_unmap.argtypes = [
      ctypes.c_void_p, ctypes.POINTER(_GstMapInfo)
  ]
  _libgst.gst_buffer_unmap.restype = None
except (ImportError, ValueError, OSError):
  pass

def _is_valid_ctypes_input(input_data):
  if not isinstance(input_data, tuple):
    return False
  pointer, size = input_data
  if not isinstance(pointer, ctypes.c_void_p):
    return False
  return isinstance(size, int)

@contextlib.contextmanager
def _gst_buffer_map(buffer):
  """Yields gst buffer map."""
  mapping = _GstMapInfo()
  ptr = hash(buffer)
  success = _libgst.gst_buffer_map(ptr, mapping, Gst.MapFlags.READ)
  if not success:
    raise RuntimeError('gst_buffer_map failed')
  try:
    yield ctypes.c_void_p(mapping.data), mapping.size
  finally:
    _libgst.gst_buffer_unmap(ptr, mapping)

def _check_input_size(input_size, expected_input_size):
  if input_size < expected_input_size:
    raise ValueError('input size={}, expected={}.'.format(
        input_size, expected_input_size))

def run_inference(interpreter, input_data):
  """Performs interpreter ``invoke()`` with a raw input tensor.

  Args:
    interpreter: The ``tf.lite.Interpreter`` to invoke.
    input_data: A 1-D array as the input tensor. Input data must be uint8
      format. Data may be ``Gst.Buffer`` or :obj:`numpy.ndarray`.
  """
  input_shape = interpreter.get_input_details()[0]['shape']
  expected_input_size = np.prod(input_shape)

  interpreter_handle = interpreter._native_handle()  # pylint:disable=protected-access
  if isinstance(input_data, bytes):
    _check_input_size(len(input_data), expected_input_size)
    invoke_with_bytes(interpreter_handle, input_data)
  elif _is_valid_ctypes_input(input_data):
    pointer, actual_size = input_data
    _check_input_size(actual_size, expected_input_size)
    invoke_with_membuffer(interpreter_handle, pointer.value,
                          expected_input_size)
  elif _libgst and isinstance(input_data, Gst.Buffer):
    memory = input_data.peek_memory(0)
    map_buffer = not GstAllocators.is_dmabuf_memory(
        memory) or not supports_dmabuf(interpreter_handle)
    if not map_buffer:
      _check_input_size(memory.size, expected_input_size)
      fd = GstAllocators.dmabuf_memory_get_fd(memory)
      try:
        invoke_with_dmabuffer(interpreter_handle, fd, expected_input_size)
      except RuntimeError:
        # dma-buf input didn't work, likely due to old kernel driver. This
        # situation can't be detected until one inference has been tried.
        map_buffer = True
    if map_buffer:
      with _gst_buffer_map(input_data) as (pointer, actual_size):
        assert actual_size >= expected_input_size
        invoke_with_membuffer(interpreter_handle, pointer.value,
                              expected_input_size)
  elif isinstance(input_data, np.ndarray):
    _check_input_size(len(input_data), expected_input_size)
    invoke_with_membuffer(interpreter_handle, input_data.ctypes.data,
                          expected_input_size)
  else:
    raise TypeError('input data type is not supported.')

Then I get an error on the code line "from pycoral.pybind._pywrap_coral import GetRuntimeVersion as get_runtime_version": No module named 'pycoral.pybind'
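
For reference, the failing import boils down to just this (a minimal sketch using the installed pycoral package, run from the same virtual environment):

from pycoral.utils.edgetpu import get_runtime_version

# Importing pycoral.utils.edgetpu pulls in pycoral.pybind._pywrap_coral, so
# this raises "No module named 'pycoral.pybind'" if the compiled extension
# from the pycoral wheel is missing or a different environment is active.
print(get_runtime_version())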

Issue Type

Build/Install

Operating System

Windows 10

Coral Device

USB Accelerator

Other Devices

No response

Programming Language

Python 3.7

Relevant Log Output

No response

hjonnala commented 2 years ago

please check this comment https://github.com/google-coral/libedgetpu/issues/29#issuecomment-909207700.

Thanks

staebchen0 commented 2 years ago

Thank you for the hint :-) Now I have installed the other runtime:

edgetpu_runtime_20210119

import tflite_runtime
print(tflite_runtime.__version__)   # 2.5.0.post1
import edgetpu
print(edgetpu.__version__)          # 2.12.1

Test works now:

anja-@LAPTOP-7FIIFTGI MINGW64 ~/Anja_Programme/AnjaCoral/coral/pycoral (master)
$ python examples/classify_image.py --model test_data/mobilenet_v2_1.0_224_inat_bird_quant_edgetpu.tflite --labels test_data/inat_bird_labels.txt --input test_data/parrot.jpg
----INFERENCE TIME----
Note: The first inference on Edge TPU is slow because it includes loading the model into Edge TPU memory.
13.3ms
4.5ms
4.7ms
4.5ms
4.3ms
-------RESULTS--------
Ara macao (Scarlet Macaw): 0.75781
(envCoralPy8)
anja-@LAPTOP-7FIIFTGI MINGW64 ~/Anja_Programme/AnjaCoral/coral/pycoral (master)
$

Now I wanted to run a test with my own model.

Example:

from edgetpu.classification.engine import ClassificationEngine
from PIL import Image
import cv2
import re
import os

# the TFLite converted to be used with edgetpu
modelPath = './model/model_edgetpu.tflite'

# The path to labels.txt that was downloaded with your model
labelPath = './model/labels.txt'

# This function parses the labels.txt and puts it in a python dictionary
def loadLabels(labelPath):
    p = re.compile(r'\s*(\d+)(.+)')
    with open(labelPath, 'r', encoding='utf-8') as labelFile:
        lines = (p.match(line).groups() for line in labelFile.readlines())
        return {int(num): text.strip() for num, text in lines}

# This function takes in a PIL Image and the ClassificationEngine
def classifyImage(image, engine):
    # Classify and output the inference results
    classifications = engine.ClassifyWithImage(image)
    return classifications

def main():
    # Load your model onto your Coral Edgetpu
    engine = ClassificationEngine(modelPath)
    labels = loadLabels(labelPath)

    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:
            break

        # Format the image into a PIL Image so it's compatible with the Edge TPU
        cv2_im = frame
        pil_im = Image.fromarray(cv2_im)

        # Resize and flip the image so it is square and matches training
        # (PIL returns new images, so assign the results back)
        pil_im = pil_im.resize((224, 224))
        pil_im = pil_im.transpose(Image.FLIP_LEFT_RIGHT)

        # Classify and display image
        results = classifyImage(pil_im, engine)
        cv2.imshow('frame', cv2_im)
        print(results)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()

if __name__ == '__main__':
    main()

Error on the line "from edgetpu.classification.engine import ClassificationEngine": No module named '_edgetpu_cpp_wrapper'

Although the edgetpu_cpp_wrapper.py script is available in AnjaCoral\envCoralPy8\Lib\site-packages\edgetpu\swig.

hjonnala commented 2 years ago

The edgetpu API is deprecated. It has been replaced by the libcoral API for C++ and the pycoral API for Python.

Please refer to examples-camera for OpenCV example scripts.
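
For example, the ClassificationEngine script above could be ported to the pycoral API roughly like this (an untested sketch; it reuses your ./model/model_edgetpu.tflite and ./model/labels.txt paths and the same webcam loop):

from pycoral.utils.edgetpu import make_interpreter
from pycoral.utils.dataset import read_label_file
from pycoral.adapters import common, classify
from PIL import Image
import cv2

modelPath = './model/model_edgetpu.tflite'
labelPath = './model/labels.txt'

def main():
    # make_interpreter() attaches the Edge TPU delegate to a tflite Interpreter.
    interpreter = make_interpreter(modelPath)
    interpreter.allocate_tensors()
    labels = read_label_file(labelPath)
    size = common.input_size(interpreter)  # (width, height) the model expects

    cap = cv2.VideoCapture(0)
    while cap.isOpened():
        ret, frame = cap.read()
        if not ret:
            break

        # OpenCV delivers BGR frames; convert to RGB and resize for the model.
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
        pil_im = Image.fromarray(rgb).resize(size)

        # Copy the image into the input tensor, run inference, read the top results.
        common.set_input(interpreter, pil_im)
        interpreter.invoke()
        for c in classify.get_classes(interpreter, top_k=3, score_threshold=0.0):
            print(labels.get(c.id, c.id), c.score)

        cv2.imshow('frame', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()

if __name__ == '__main__':
    main()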

staebchen0 commented 2 years ago

Okay, I'm trying that. Can you also tell me how to install the compiler on Windows? Only the commands for Linux are shown on the info page.

hjonnala commented 2 years ago

Sorry, the compiler does not work on Windows. Please use the web compiler.

staebchen0 commented 2 years ago

Okay, thank you very much! Compiling worked.

Now, when running detect.py, I get the error "list index out of range" on the code line:

objs = get_objects(interpreter, args.threshold)[:args.top_k]

I have 4 classes in the model. label.txt values:

0 0_none
1 1_cat
2 2_mouse
3 3_other

hjonnala commented 2 years ago

You might not be getting 4 values from get_objects. Please check how many objects you are getting.
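
For example (a quick check using the variables from detect.py, just a sketch):

objs = get_objects(interpreter, args.threshold)
print('objects returned:', len(objs))  # see how many detections actually come back
objs = objs[:args.top_k]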

staebchen0 commented 2 years ago

I think it's because I have a classification model and not a detection model!

Is there also an example for a classification model?

Under examples-camera/opencv/ there is just the detect.py example.

hjonnala commented 2 years ago

Yes, we only have object detection with OpenCV.
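
A rough way to adapt detect.py to a classification model is to swap the detection adapter for the classification one; a sketch of just the changed lines (untested, assuming detect.py's existing args.top_k, args.threshold and labels variables):

from pycoral.adapters import classify  # instead of pycoral.adapters.detect

# ...after the frame has been fed to the interpreter and invoke() has run:
classes = classify.get_classes(interpreter, top_k=args.top_k,
                               score_threshold=args.threshold)
for c in classes:
    print(labels.get(c.id, c.id), c.score)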

staebchen0 commented 2 years ago

Okay, thanks for your help :-)