bytedeco / javacv

Java interface to OpenCV, FFmpeg, and more

Add Inference Engine to build to support model optimizer #1344

Open cansik opened 4 years ago

cansik commented 4 years ago

It would be great if OpenCV were built with the Inference Engine to support Intel OpenVINO and its model zoo.

Have you already thought about it?

Build OpenCV with Inference Engine to enable loading models from Model Optimizer.
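For context, OpenCV 4.x gates the Inference Engine backend behind a CMake switch. A rough sketch of the relevant flags (the OpenVINO install paths below are assumptions for a default Linux/macOS install, and for javacpp-presets these would need to be wired into opencv's cppbuild.sh rather than invoked by hand):

```shell
# Sketch: CMake flags for OpenCV's Inference Engine (OpenVINO) backend.
# WITH_INF_ENGINE is the OpenCV 4.x option; the INF_ENGINE_* directories
# point at an OpenVINO installation and will differ per machine.
cmake .. \
  -DWITH_INF_ENGINE=ON \
  -DINF_ENGINE_LIB_DIRS=/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64 \
  -DINF_ENGINE_INCLUDE_DIRS=/opt/intel/openvino/deployment_tools/inference_engine/include
```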
saudet commented 4 years ago

We can do that for sure. Please send a pull request to that effect if you have some time!

cansik commented 4 years ago

OK, I will work on it. I already tried to build it, but it seems there is more missing than just the flag.

[aac @ 0x7f821e3fd200] Could not update timestamps for skipped samples.
[ WARN:0] global /Users/cansik/Downloads/javacpp-presets-master 3/opencv/cppbuild/macosx-x86_64/opencv-4.1.2/modules/dnn/src/op_inf_engine.cpp (660) initPlugin DNN-IE: Can't load extension plugin (extra layers for some networks). Specify path via OPENCV_DNN_IE_EXTRA_PLUGIN_PATH parameter
Exception in thread "Pipeline Thread" java.lang.RuntimeException: OpenCV(4.1.2) /Users/cansik/Downloads/javacpp-presets-master 3/opencv/cppbuild/macosx-x86_64/opencv-4.1.2/modules/dnn/src/op_inf_engine.cpp:704: error: (-215:Assertion failed) in function 'initPlugin'
> Failed to initialize Inference Engine backend: Failed to create plugin /opt/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib for device CPU
> Please, check your environment
> Cannot load library '/opt/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib': dlopen(/opt/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib, 1): Library not loaded: @rpath/libmkl_tiny_tbb.dylib
>   Referenced from: /opt/intel/openvino_2019.3.376/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib
>   Reason: image not found
> 
    at org.bytedeco.opencv.opencv_dnn.Net.forward(Native Method)
    at ch.zhdk.pose.pipeline.LightOpenPoseIEPipeline.detectPose(LightOpenPoseIEPipeline.kt:41)
    at ch.zhdk.pose.pipeline.Pipeline.processFrame(Pipeline.kt:185)
    at ch.zhdk.pose.pipeline.Pipeline.access$processFrame(Pipeline.kt:27)
    at ch.zhdk.pose.pipeline.Pipeline$start$1.invoke(Pipeline.kt:105)
    at ch.zhdk.pose.pipeline.Pipeline$start$1.invoke(Pipeline.kt:27)
    at kotlin.concurrent.ThreadsKt$thread$thread$1.run(Thread.kt:30)
cansik commented 4 years ago

Here is an example of how to fix this; it seems to be a macOS problem:

https://software.intel.com/en-us/forums/intel-c-compiler/topic/698021

A possible fix:

install_name_tool -change @rpath/libiomp5.dylib /opt/intel/compilers_and_libraries_2017.0.102/mac/compiler/lib/libiomp5.dylib main
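Adapted to the error in the log above, the same approach would rewrite the unresolved `@rpath` reference inside the plugin itself rather than in a `main` binary. A macOS-only sketch; the paths are taken from the stack trace and from where OpenVINO ships `libmkl_tiny_tbb.dylib`, so adjust them for your install:

```shell
# Show which install names libMKLDNNPlugin.dylib references (macOS only):
otool -L /opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib

# Point the unresolved @rpath entry at the MKL tiny library bundled with
# OpenVINO (assumed location; verify it exists on your machine first):
install_name_tool -change @rpath/libmkl_tiny_tbb.dylib \
    /opt/intel/openvino/deployment_tools/inference_engine/external/mkltiny_mac/lib/libmkl_tiny_tbb.dylib \
    /opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libMKLDNNPlugin.dylib
```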

But as a workaround, it's possible to just preload the library:

System.load("/opt/intel/openvino/deployment_tools/inference_engine/external/mkltiny_mac/lib/libmkl_tiny_tbb.dylib")
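Since the failure in the stack trace happens inside `Net.forward`, the preload has to run before any DNN call. A minimal, self-contained sketch of that workaround in Java (the dylib path is the assumed OpenVINO default and the class name `PreloadMkl` is hypothetical; `System.load` requires an absolute path):

```java
// Sketch: preload OpenVINO's MKL dependency so the dynamic linker can
// resolve @rpath/libmkl_tiny_tbb.dylib before libMKLDNNPlugin is loaded.
public class PreloadMkl {
    static boolean preload(String absolutePath) {
        try {
            System.load(absolutePath); // must be an absolute path
            return true;
        } catch (UnsatisfiedLinkError e) {
            // Library not present on this machine; report instead of crashing.
            return false;
        }
    }

    public static void main(String[] args) {
        // Assumed default OpenVINO install location on macOS; adjust as needed.
        String lib = "/opt/intel/openvino/deployment_tools/inference_engine/"
                   + "external/mkltiny_mac/lib/libmkl_tiny_tbb.dylib";
        System.out.println(preload(lib) ? "preloaded" : "not found: " + lib);
    }
}
```

Calling `PreloadMkl.preload(...)` once at startup, before constructing the `Net`, should make the subsequent `dlopen` of the plugin succeed.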