dlstreamer / pipeline-server

Home of Intel(R) Deep Learning Streamer Pipeline Server (formerly Video Analytics Serving)
BSD 3-Clause "New" or "Revised" License

GPU inference fails on 12th Gen Intel® Core™ systems #108

Status: Closed (closed by whbruce 2 years ago)

whbruce commented 2 years ago

If the GPU device is selected, the pipeline will generate an error similar to the following:

 {"levelname": "ERROR", "asctime": "2022-03-23 19:29:10,695", "message": "Error on Pipeline 83c22738aadf11ec8e070242ac110002: gst-library-error-quark: base_inference plugin intitialization failed (3): /root/gst-video-analytics/gst/inference_elements/base/inference_singleton.cpp(136): acquire_inference_instance (): /GstPipeline:pipeline27/GstGvaDetect:detection:\n\nFailed to construct OpenVINOImageInference\n\tFailed to create plugin /opt/intel/openvino/deployment_tools/inference_engine/lib/intel64/libclDNNPlugin.so for device GPU\nPlease, check your environment\n[CLDNN ERROR]. clGetPlatformIDs error -1001\n", "module": "gstreamer_pipeline"}

The OpenVINO 2021.4.2 container does not support the Alder Lake (12th Gen) GPU. As a workaround, use CPU inference.
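
Until a fix is available, CPU inference can be requested per pipeline. The sketch below shows one possible way to do this against the Pipeline Server REST API; the port, pipeline name (`object_detection/person_vehicle_bike`), sample video URI, and the `detection-device` parameter name are assumptions and should be checked against the pipeline definitions in your deployment.

```python
import json
import urllib.request

# Assumed endpoint and pipeline name -- adjust to match your deployment.
PIPELINE_SERVER = "http://localhost:8080"
PIPELINE = "object_detection/person_vehicle_bike"

# Request body selecting CPU inference as a workaround for the GPU failure.
# The "detection-device" parameter name is an assumption; check the
# "parameters" section of your pipeline definition for the exact name.
request_body = {
    "source": {
        "uri": "https://github.com/intel-iot-devkit/sample-videos/raw/master/person-bicycle-car-detection.mp4",
        "type": "uri",
    },
    "destination": {
        "metadata": {
            "type": "file",
            "path": "/tmp/results.jsonl",
            "format": "json-lines",
        }
    },
    "parameters": {"detection-device": "CPU"},
}

req = urllib.request.Request(
    f"{PIPELINE_SERVER}/pipelines/{PIPELINE}",
    data=json.dumps(request_body).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The server responds with the id of the new pipeline instance.
with urllib.request.urlopen(req) as response:
    print("Pipeline instance:", response.read().decode("utf-8"))
```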

whbruce commented 2 years ago

Fixed in v0.7.2