rickspark4 opened 1 month ago
Following (I have the same problem). I can import openvino if I use v2023.3 instead of 2024.4 (no "telemetry"-related error), but at that point I can't import my model (even after re-exporting it with OpenVINO 2023.3), as I get a different error:
[ERROR] 2024-10-25T13:33:36.977Z aa9c6d85-1361-591d-80b7-3210b30f0256 Error loading model: Exception from src/inference/src/core.cpp:116:
2024-10-25T13:33:36.978Z
Exception from src/frontends/ir/src/ir_deserializer.cpp:356:
2024-10-25T13:33:36.978Z
Attribute and shape size are inconsistent for Const op!
@rickspark4 I have a hacky workaround to make OV 2024.4 work on AWS Lambda while we wait for the official fix.
The problem is that openvino uses openvino_telemetry to log usage data, and openvino_telemetry tries to write a file under Path.home(), which is a directory that is not writable at runtime in Lambda (as far as I know, only the /tmp directory is writable at runtime).
The workaround is to hardcode a directory under /tmp in openvino_telemetry, so that the problem above doesn't arise.
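For context, here is a minimal sketch (my own illustration, assuming a POSIX environment) that checks which of these directories is writable at runtime; running it inside a Lambda handler should show the home directory as read-only and /tmp as writable:

```python
import os
from pathlib import Path

# Check which candidate directories are writable; in the AWS Lambda runtime
# only /tmp typically is, while the home directory is read-only.
for candidate in (Path.home(), Path("/tmp")):
    print(candidate, "writable:", os.access(candidate, os.W_OK))
```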
Basically, you need to download the opt_in_checker.py file from the openvino_telemetry repo: https://github.com/openvinotoolkit/telemetry/blob/1f98695c0a524e06c13f2c68cb5c6880163f8ab7/src/utils/opt_in_checker.py#L4
You then need to modify the consent_file_base_dir function: at line 98, add the following line of code:
dir_to_check = Path("/tmp")
So the whole function is now:
@staticmethod
def consent_file_base_dir():
    """
    Returns the base directory of the consent file.
    :return: base directory of the consent file.
    """
    platform = system()

    dir_to_check = None

    if platform == 'Windows':
        dir_to_check = '$LOCALAPPDATA'
    elif platform in ['Linux', 'Darwin']:
        dir_to_check = Path.home()
    dir_to_check = Path("/tmp")

    if dir_to_check is None:
        log.info('Failed to find location of the openvino_telemetry file.')
        return None

    consent_base_dir = os.path.expandvars(dir_to_check)

    if not os.path.isdir(consent_base_dir):
        log.info('Failed to find location of the openvino_telemetry file.')
        return None
    return consent_base_dir
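To sanity-check the patch locally, here is a condensed, standalone re-creation of the patched logic (the function name and structure are mine; no openvino_telemetry import needed). On Linux or macOS it should print /tmp:

```python
import os
from pathlib import Path
from platform import system

def patched_consent_file_base_dir():
    # Condensed mirror of the patched function above, for local testing only.
    dir_to_check = None
    if system() == 'Windows':
        dir_to_check = '$LOCALAPPDATA'
    elif system() in ['Linux', 'Darwin']:
        dir_to_check = Path.home()
    dir_to_check = Path("/tmp")  # the override: always use /tmp
    base = os.path.expandvars(dir_to_check)
    return base if os.path.isdir(base) else None

print(patched_consent_file_base_dir())
```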
Finally, you need to copy this file into your Docker container, replacing the corresponding file in the openvino_telemetry package. In my build this is done in the Dockerfile with the command:
COPY opt_in_checker.py /var/lang/lib/python3.11/site-packages/openvino_telemetry/utils/opt_in_checker.py
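Note that the site-packages path in the COPY destination depends on your base image and Python version; the path above is from my build. A small helper (my own sketch) to print where openvino_telemetry is actually installed, so the destination path can be verified inside the container:

```python
import importlib.util

# Print the installed location of the openvino_telemetry package so the
# Dockerfile COPY destination can be verified; harmless if not installed.
spec = importlib.util.find_spec("openvino_telemetry")
if spec and spec.submodule_search_locations:
    print(list(spec.submodule_search_locations)[0])
else:
    print("openvino_telemetry not installed here")
```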
With this hack I managed to import openvino in Lambda at runtime :) Hope this helps while you wait for an official fix.
Any updates on this?
Request Description
Issue: I am encountering an issue when trying to compile and run an OpenVINO model on AWS Lambda. The model works perfectly in a local environment but fails in AWS Lambda.
Error I received:
Dockerfile I use to build the image:
The code in main.py:
Feature Use Case
No response
Issue submission checklist