google-coral / pycoral

Python API for ML inferencing and transfer-learning on Coral devices
https://coral.ai
Apache License 2.0

ImportError: DLL load failed while importing _pywrap_coral: The specified module could not be found. #22

Closed. SergeyLev closed this issue 3 years ago.

SergeyLev commented 3 years ago

Hi! I'm trying to run a test script using the Coral USB Accelerator and I get this error:

from pycoral.pybind._pywrap_coral import GetRuntimeVersion as get_runtime_version
ImportError: DLL load failed while importing _pywrap_coral: The specified module could not be found.

OS: Win 10, TF: 2.4.1, PyCoral: 1.0.1

Script:

from os.path import join
import cv2

from pycoral.adapters.common import input_size
from pycoral.adapters.detect import get_objects
from pycoral.utils.dataset import read_label_file
from pycoral.utils.edgetpu import make_interpreter
from pycoral.utils.edgetpu import run_inference

def main():
    model_dir = '../all_models'
    model = join(model_dir, 'mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite')
    labels = join(model_dir, 'coco_labels.txt')
    threshold = 0.1
    top_k = 3

    test_vid_path = ''

    # Create the interpreter, allocate its tensors, and load the label map.
    interpreter = make_interpreter(model)
    interpreter.allocate_tensors()
    labels = read_label_file(labels)
    inference_size = input_size(interpreter)

    cap = cv2.VideoCapture(test_vid_path)

    while True:
        ret, frame = cap.read()
        if not ret:
            break
        cv2_im = frame

        # Convert BGR -> RGB, resize to the model input size, run detection,
        # and keep the top_k results above the score threshold.
        cv2_im_rgb = cv2.cvtColor(cv2_im, cv2.COLOR_BGR2RGB)
        cv2_im_rgb = cv2.resize(cv2_im_rgb, inference_size)
        run_inference(interpreter, cv2_im_rgb.tobytes())
        objs = get_objects(interpreter, threshold)[:top_k]
        cv2_im = append_objs_to_img(cv2_im, inference_size, objs, labels)

        cv2.imshow('frame', cv2_im)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()

def append_objs_to_img(cv2_im, inference_size, objs, labels):
    height, width, channels = cv2_im.shape
    scale_x, scale_y = width / inference_size[0], height / inference_size[1]
    for obj in objs:
        bbox = obj.bbox.scale(scale_x, scale_y)
        x0, y0 = int(bbox.xmin), int(bbox.ymin)
        x1, y1 = int(bbox.xmax), int(bbox.ymax)

        percent = int(100 * obj.score)
        label = '{}% {}'.format(percent, labels.get(obj.id, obj.id))

        cv2_im = cv2.rectangle(cv2_im, (x0, y0), (x1, y1), (0, 255, 0), 2)
        cv2_im = cv2.putText(cv2_im, label, (x0, y0 + 30),
                             cv2.FONT_HERSHEY_SIMPLEX, 1.0, (255, 0, 0), 2)
    return cv2_im

if __name__ == '__main__':
    main()
Naveen-Dodda commented 3 years ago

@SergeyLev

Were you able to solve this issue?

tcotte commented 2 years ago

I'm facing the same issue. Could somebody please help me?

hjonnala commented 2 years ago

from pycoral.pybind._pywrap_coral import GetRuntimeVersion as get_runtime_version ImportError: DLL load failed while importing _pywrap_coral: The specified module could not be found.

This error occurs if the Edge TPU runtime is not installed. Please install Edge TPU runtime 13 (edgetpu_runtime_20210119.zip) to resolve this issue. Thanks!
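After installing the runtime, you can sanity-check it with a minimal sketch like this (just an illustration, not an official test script):

# Minimal sketch: verify that pycoral can see the Edge TPU runtime.
# If the runtime DLL is still missing, the import below raises the same ImportError.
from pycoral.utils.edgetpu import get_runtime_version, list_edge_tpus

print('Edge TPU runtime version:', get_runtime_version())
print('Attached Edge TPU devices:', list_edge_tpus())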

tcotte commented 2 years ago

Hello @hjonnala, thank you for your quick answer.

I've tried this, but it does not work in my case. I suspect it may be because there is no Coral AI device plugged into my computer (the product is out of stock at the moment). Here is my test: one colleague is running inference with his Coral AI device and my model on his computer, and I am trying without a device to compare the inference speed.

Therefore, I am wondering whether it is possible to use the PyCoral library without a device. Thank you in advance if you know the answer.

hjonnala commented 2 years ago

@tcotte it's possible to use the pycoral library without a Coral device to run CPU models only, but the Edge TPU runtime installation is still required. I think the only thing you have to change is the interpreter line, as follows:

import tflite_runtime.interpreter as tflite
interpreter = tflite.Interpreter(model_path=model_path)
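
For example, here is a minimal sketch of that swap in the context of the script above; the fallback logic and the CPU model filename are just assumptions for illustration, not part of the official examples:

import tflite_runtime.interpreter as tflite
from pycoral.utils.edgetpu import list_edge_tpus, make_interpreter

# Assumed filenames: the Edge TPU-compiled model for the accelerator and the
# plain quantized model for CPU-only inference.
EDGETPU_MODEL = 'mobilenet_ssd_v2_coco_quant_postprocess_edgetpu.tflite'
CPU_MODEL = 'mobilenet_ssd_v2_coco_quant_postprocess.tflite'

def build_interpreter():
    # Use the accelerator when one is attached, otherwise fall back to the CPU.
    if list_edge_tpus():
        return make_interpreter(EDGETPU_MODEL)
    return tflite.Interpreter(model_path=CPU_MODEL)

interpreter = build_interpreter()
interpreter.allocate_tensors()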

Feel free to use this script to test the inference speed with tflite runtime: https://github.com/hjonnala/snippets/blob/main/coral_inference.py

Thanks!

tcotte commented 2 years ago

I installed edgetpu_runtime_20210119.zip, but I still get the same error. I don't understand why, because the file _pywrap_coral.cp38-win_amd64.pyd is present in ..\Miniconda3\envs\pred_batonnets\lib\site-packages\pycoral\pybind.

Here is the traceback of the error:

Traceback (most recent call last):
  File "C:/Users/tristan_cotte/Documents/Oignies/pred_batonnets/detect_edgetpu/detect_edgetpu.py", line 32, in <module>
    from pycoral.utils.edgetpu import make_interpreter
  File "C:\Users\tristan_cotte\Miniconda3\envs\pred_batonnets\lib\site-packages\pycoral\utils\edgetpu.py", line 31, in <module>
    from pycoral.pybind._pywrap_coral import GetRuntimeVersion as get_runtime_version
ImportError: DLL load failed while importing _pywrap_coral: The specified module could not be found.

cdrose commented 2 years ago

I just had this issue; in my case it was caused by not having the Microsoft Visual C++ 2019 redistributable installed.
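
In case it helps anyone else narrow this down, here is a rough Windows-only sketch that tries to load the runtime DLL directly; the edgetpu.dll name and its presence on PATH are assumptions on my part, based on what the Windows runtime installer sets up:

# Windows-only sketch: try loading the Edge TPU runtime DLL directly.
# If this fails even though the runtime is installed, a missing dependency
# such as the MSVC 2019 redistributable is a likely cause.
import ctypes

try:
    ctypes.WinDLL('edgetpu.dll')  # assumed name of the DLL installed by the runtime
    print('edgetpu.dll loaded successfully')
except OSError as exc:
    print('failed to load edgetpu.dll:', exc)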

arun-kumark commented 8 months ago

I am also facing the same issue in a Windows 10 environment.

C:\windows\system32>python
Python 3.8.1rc1 (tags/v3.8.1rc1:b00a2b5, Dec 10 2019, 01:13:53) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.

>>> import tflite_runtime.interpreter as tflite
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "C:\Users\IPC\AppData\Local\Programs\Python\Python38\lib\site-packages\tflite_runtime\interpreter.py", line 41, in <module>
    from tflite_runtime import _pywrap_tensorflow_interpreter_wrapper as _interpreter_wrapper
ImportError: DLL load failed while importing _pywrap_tensorflow_interpreter_wrapper: The specified module could not be found.

Could you give me the solution or some debugging steps?

thanks arun