openvinotoolkit / openvino

OpenVINO™ is an open-source toolkit for optimizing and deploying AI inference
https://docs.openvino.ai
Apache License 2.0

[Feature Request]: Issue with Compiling and Running OpenVINO Model on AWS Lambda #26988

Open rickspark4 opened 1 month ago

rickspark4 commented 1 month ago

Request Description

Issue: I am encountering an issue when trying to compile and run an OpenVINO model on AWS Lambda. The model works perfectly in a local environment but fails in AWS Lambda.

Error I received:

[ERROR] Exception: Failed to find location of the openvino_telemetry file.
Traceback (most recent call last):
  File "/var/lang/lib/python3.8/imp.py", line 234, in load_module
    return load_source(name, filename, file)
  File "/var/lang/lib/python3.8/imp.py", line 171, in load_source
    module = _load(spec)
  File "<frozen importlib._bootstrap>", line 702, in _load
  File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 843, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/var/task/main.py", line 5, in <module>
    import openvino as ov
  File "/var/lang/lib/python3.8/site-packages/openvino/__init__.py", line 70, in <module>
    from openvino.tools.ovc import convert_model
  File "/var/lang/lib/python3.8/site-packages/openvino/tools/ovc/__init__.py", line 19, in <module>
    telemetry = init_mo_telemetry()
  File "/var/lang/lib/python3.8/site-packages/openvino/tools/ovc/telemetry_utils.py", line 30, in init_mo_telemetry
    return init_telemetry_class(tid=get_tid(),
  File "/var/lang/lib/python3.8/site-packages/openvino/tools/ovc/telemetry_utils.py", line 46, in init_telemetry_class
    telemetry = tm.Telemetry(tid=tid,
  File "/var/lang/lib/python3.8/site-packages/openvino_telemetry/main.py", line 27, in __call__
    cls.__single_instance = super(SingletonMetaClass, cls).__call__(*args, **kwargs)
  File "/var/lang/lib/python3.8/site-packages/openvino_telemetry/main.py", line 57, in __init__
    self.init(app_name, app_version, tid, backend, enable_opt_in_dialog, disable_in_ci)
  File "/var/lang/lib/python3.8/site-packages/openvino_telemetry/main.py", line 62, in init
    opt_in_check_result = opt_in_checker.check(enable_opt_in_dialog, disable_in_ci)
  File "/var/lang/lib/python3.8/site-packages/openvino_telemetry/utils/opt_in_checker.py", line 303, in check
    if not os.path.exists(self.consent_file()):
  File "/var/lang/lib/python3.8/site-packages/openvino_telemetry/utils/opt_in_checker.py", line 126, in consent_file
    return os.path.join(self.consent_file_base_dir(), self.consent_file_subdirectory(), "openvino_telemetry")
  File "/var/lang/lib/python3.8/site-packages/openvino_telemetry/utils/opt_in_checker.py", line 104, in consent_file_base_dir
    raise Exception('Failed to find location of the openvino_telemetry file.')

Dockerfile I use to build the image:

FROM public.ecr.aws/lambda/python:3.8

COPY requirements.txt ./
RUN  pip3 install -r requirements.txt
COPY main.py ./
COPY utils.py ./
COPY convert_to_openvino.py ./
COPY pytorch_predictor.py ./
COPY int8_quantization.py ./
COPY openvino_predictor.py ./
COPY models ./models

CMD [ "main.lambda_handler" ]

The code in main.py:

import numpy as np
from platform import system
from pathlib import Path
import os
import openvino as ov
import cv2
import base64
import http.client
from codecs import encode
import json
from io import BytesIO

core = ov.Core()
device = "CPU"
ov_model_path = r"/var/task/depth_anything_v2_vits_int8.xml"
compiled_model = core.compile_model(ov_model_path, device)

Feature Use Case

No response

Issue submission checklist

fgraffitti-cyberhawk commented 1 month ago

Following (I got the same problem). I can import openvino if I use v2023.3 instead of 2024.4 (no telemetry-related error), but at that point I can't load my model (even if I re-export it using openvino 2023.3), as I get a different error:

[ERROR] 2024-10-25T13:33:36.977Z aa9c6d85-1361-591d-80b7-3210b30f0256 Error loading model: Exception from src/inference/src/core.cpp:116:
2024-10-25T13:33:36.978Z
Exception from src/frontends/ir/src/ir_deserializer.cpp:356:
2024-10-25T13:33:36.978Z
Attribute and shape size are inconsistent for Const op!
fgraffitti-cyberhawk commented 4 weeks ago

@rickspark4 I have a hacky workaround to make OV 2024.4 work on AWS Lambda while we wait for the official fix.

The problem is that openvino uses openvino_telemetry to log usage, and openvino_telemetry tries to write a file under Path.home(), a directory that is not writable at runtime in Lambda (only the /tmp directory is writable at runtime, I believe).
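A possibly lighter alternative, untested on Lambda and resting on the assumption that the consent path ultimately derives from `Path.home()` (which on Linux resolves via the `HOME` environment variable), would be to point `HOME` at `/tmp` before `openvino` is imported, either via `ENV HOME=/tmp` in the Dockerfile or at the top of main.py:

```python
import os

# Assumption (not confirmed by the OpenVINO docs): openvino_telemetry
# derives its consent-file directory from the home directory, and on
# Linux Path.home() reads $HOME. Pointing HOME at /tmp -- the only
# runtime-writable path on Lambda -- before the openvino import may
# therefore avoid patching the package at all.
os.environ["HOME"] = "/tmp"

from pathlib import Path

print(Path.home())  # on Linux this now resolves to /tmp
```

If this works, it has the advantage of not touching the installed package, so it survives `pip install` upgrades.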

The workaround is to hardcode a different directory under /tmp in openvino_telemetry, so that the problem above doesn't arise.

Basically, you need to download the opt_in_checker.py file from the openvino_telemetry repo: https://github.com/openvinotoolkit/telemetry/blob/1f98695c0a524e06c13f2c68cb5c6880163f8ab7/src/utils/opt_in_checker.py#L4

You then need to modify the consent_file_base_dir function: at line 98, add the following line of code:

dir_to_check = Path("/tmp")

So the whole function is now:

  @staticmethod
  def consent_file_base_dir():
      """
      Returns the base directory of the consent file.
      :return: base directory of the consent file.
      """
      platform = system()

      dir_to_check = None

      if platform == 'Windows':
          dir_to_check = '$LOCALAPPDATA'
      elif platform in ['Linux', 'Darwin']:
          dir_to_check = Path.home()

      dir_to_check = Path("/tmp")  # the hack: force the only Lambda-writable directory

      if dir_to_check is None:
          log.info('Failed to find location of the openvino_telemetry file.')
          return None

      consent_base_dir = os.path.expandvars(dir_to_check)
      if not os.path.isdir(consent_base_dir):
          log.info('Failed to find location of the openvino_telemetry file.')
          return None

      return consent_base_dir
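A slightly more defensive variant of the same idea (a sketch, not the packaged code; `consent_base_dir_candidates` is a made-up helper) would fall back to /tmp only when the usual directory is not writable, so the patched file keeps its normal behaviour outside Lambda:

```python
import os
from pathlib import Path
from platform import system


def consent_base_dir_candidates():
    """Ordered candidate base directories for the consent file."""
    if system() == 'Windows':
        return [os.path.expandvars('$LOCALAPPDATA')]
    # Linux / Darwin: prefer home, fall back to /tmp (the only
    # runtime-writable location on AWS Lambda).
    return [str(Path.home()), '/tmp']


def consent_file_base_dir():
    """Return the first existing, writable candidate, else None."""
    for candidate in consent_base_dir_candidates():
        if os.path.isdir(candidate) and os.access(candidate, os.W_OK):
            return candidate
    return None
```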

Finally, you need to copy this file into your Docker container, replacing the corresponding file in the openvino_telemetry package. In my build this is done in the Dockerfile with the command:

COPY opt_in_checker.py /var/lang/lib/python3.11/site-packages/openvino_telemetry/utils/opt_in_checker.py

With this hack I managed to import openvino in Lambda at runtime :) Hope this helps while you wait for the official fix.

tgalery commented 1 week ago

Any updates on this?