openvinotoolkit / model_server

A scalable inference server for models optimized with OpenVINO™
https://docs.openvino.ai/2024/ovms_what_is_openvino_model_server.html

Error: environment variable LD_LIBRARY_PATH in docker image #304

Closed ZZYhho closed 4 years ago

ZZYhho commented 4 years ago

I use the target device HDDL on Kubernetes, and the image is openvino/ubuntu18_model_server:latest. The docker image's environment, as reported by docker inspect -f '{{.Config.Env}}' openvino/ubuntu18_model_server:latest, is:

[PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
 PYTHON=python3.6
 INTEL_OPENVINO_DIR=/opt/intel/openvino
 PYTHONPATH=:/opt/intel/openvino/python/python3.6
 LD_LIBRARY_PATH=:/opt/intel/openvino/deployment_tools/inference_engine/external/tbb/lib:/opt/intel/openvino/deployment_tools/inference_engine/external/mkltiny_lnx/lib:/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64:/opt/intel/openvino/deployment_tools/ngraph/lib]
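For reference, a quick way to double-check the value the container actually uses, and to override it for a single run, is sketched below. This is only a sketch: it assumes the default entrypoint can be replaced with bash (present in this Ubuntu 18 based image), and /extra/lib/dir is a placeholder directory, not something taken from this image.

```bash
# Print LD_LIBRARY_PATH exactly as the container sees it,
# bypassing the default entrypoint.
docker run --rm --entrypoint /bin/bash openvino/ubuntu18_model_server:latest \
  -c 'echo "$LD_LIBRARY_PATH"'

# Override the variable for one run. Note that -e replaces the whole value
# (it does not append), so the original directories must be repeated.
docker run --rm \
  -e LD_LIBRARY_PATH="/opt/intel/openvino/deployment_tools/inference_engine/external/tbb/lib:/opt/intel/openvino/deployment_tools/inference_engine/lib/intel64:/opt/intel/openvino/deployment_tools/ngraph/lib:/extra/lib/dir" \
  openvino/ubuntu18_model_server:latest
```

Whatever model-serving arguments you normally pass would follow the image name as usual.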

If I want to run with the HDDL device, what do I need?

mzegla commented 4 years ago

Our public docker images do not include HDDL support. We include HDDL support only in docker images built from the binary distribution of the OpenVINO Toolkit, but we do not distribute such images. You would have to build such an image yourself: https://github.com/openvinotoolkit/model_server/blob/master/docs/docker_container.md#starting-docker-container-with-hddl.
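For anyone who lands here, the linked section boils down to a docker run invocation roughly like the sketch below. It assumes an HDDL-enabled image has already been built as described there and that the hddldaemon service from the host-side OpenVINO installation is running; the image name, model name/path, port, and serving arguments are placeholders (the exact entrypoint and arguments depend on how the image was built).

```bash
# Sketch: starting a self-built, HDDL-enabled model server container.
# Prerequisite: hddldaemon is running on the host, so /dev/ion and the
# HDDL service sockets under /var/tmp exist and can be shared with the container.
docker run --rm -d \
  --device=/dev/ion:/dev/ion \
  -v /var/tmp:/var/tmp \
  -v /opt/models:/opt/models \
  -p 9001:9001 \
  my_ovms_hddl_image:latest \
  --model_path /opt/models/my_model --model_name my_model --port 9001 \
  --target_device HDDL
```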

Starting with release 2021.1 (coming very soon), our public images will include NCS, HDDL, and GPU support.