System information (Please provide as much relevant information as possible)
Have I written custom code (as opposed to using a stock example script provided in MediaPipe):
Yes, I have written some of my own code, as I am trying to build a Python version of the iris landmark module.
OS Platform and Distribution (e.g., Linux Ubuntu 16.04, Android 11, iOS 14.4):
Jetson Xavier NX platform with JetPack 4.5.1
Programming Language and version ( e.g. C++, Python, Java):
Python
Describe the expected behavior:
I am trying to build my own Python version of the iris landmark module. I wrote the graph and the Python wrapper.
The expected behavior is that the Python interface of the iris landmark module can be invoked properly.
Please refer to the Other info section below for the details of what I encountered and what I tried.
Standalone code you may have used to try to get what you need:
Please use the following link to download my graphs and Python wrapper: https://drive.google.com/drive/folders/1q7AbVlQeRrq3O_oWjtcoFZxeNOXjA8-h?usp=sharing
Other info / Complete Logs:
To build a Python interface for iris landmark, I put a new graph, iris_landmark_front_cpu.pbtxt, in modules/iris_landmark. This graph was adapted from iris_tracking_cpu.pbtxt. Then I wrote a Python wrapper for it, following the pattern of the other solution wrappers.
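For context, the wrapper is essentially the same boilerplate as the stock solution wrappers (e.g. face_mesh.py); a minimal sketch of it follows. The class layout, binarypb path, input stream name, and the device argument are from my code (abbreviated), not from upstream MediaPipe:

```python
# solutions/iris_landmark.py -- a minimal sketch of my wrapper.
from mediapipe.python.solution_base import SolutionBase


class IrisLandmark(SolutionBase):
  """Runs the iris_landmark_front_cpu graph on RGB images."""

  def __init__(self, device='cpu'):
    # Only the CPU graph is wired up in this sketch; 'device' just mirrors
    # the argument my test script passes in.
    super().__init__(
        binary_graph_path='modules/iris_landmark/iris_landmark_front_cpu.binarypb',
        outputs=['face_landmarks_with_iris'])

  def process(self, image):
    # Feeds the RGB image into the graph's input stream ('image' is the
    # stream name in my graph) and returns a NamedTuple containing the
    # requested output streams.
    return super().process(input_data={'image': image})
```

My test script (test_mp_iris_landmarks.py) then simply constructs IrisLandmark(device='cpu') and calls process() on RGB frames.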
Note that if I compile the graph iris_landmark_front_cpu.pbtxt into a C++ desktop application, the executable runs correctly.
The Python interface also builds without error. However, when I run the Python test script, I get the following error:
```
Traceback (most recent call last):
  File "../python/test_mp_iris_landmarks.py", line 17, in <module>
    device='cpu'
  File "//censored/lib/python3.6/site-packages/mediapipe/python/solutions/iris_landmark.py", line 215, in __init__
    outputs=['face_landmarks_with_iris'])
  File "//censored/lib/python3.6/site-packages/mediapipe/python/solution_base.py", line 225, in __init__
    binary_graph_path=os.path.join(root_path, binary_graph_path))
RuntimeError: ValidatedGraphConfig Initialization failed.
TfLiteInferenceCalculator: ; Either model as side packet or model path in options is required. (tflite_inference_calculator.cc:316)
TfLiteInferenceCalculator: ; Either model as side packet or model path in options is required. (tflite_inference_calculator.cc:316)
```
After some digging, I located this line in python/solution_base.py: https://github.com/google/mediapipe/blob/017c1dc7eaf2d45ca95769194d9c83a86a82876d/mediapipe/python/solution_base.py#L349
If I do `print(validated_graph.text_config)`, I get the correct graph with all the options intact. However, if I `print(canonical_graph_config_proto)` after the parsing, the options are gone! That's strange. Is something wrong with the parsing process?
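To reproduce this outside the solution class, here is roughly how I inspected the two configs. The binarypb path is illustrative, and the `binary_config`/`ParseFromString` step is my paraphrase of what solution_base.py does around the linked line, so treat it as an assumption:

```python
from mediapipe.framework import calculator_pb2
from mediapipe.python._framework_bindings import validated_graph_config

# Validate the compiled graph, the same way SolutionBase.__init__ does.
validated_graph = validated_graph_config.ValidatedGraphConfig()
validated_graph.initialize(
    binary_graph_path='mediapipe/modules/iris_landmark/iris_landmark_front_cpu.binarypb')

# The text dump of the validated graph still contains every options block.
print(validated_graph.text_config)

# Re-parse the validated config into a Python CalculatorGraphConfig proto,
# mirroring solution_base.py (my reading of it). In the printout of this
# re-parsed proto, the options blocks have vanished.
canonical_graph_config_proto = calculator_pb2.CalculatorGraphConfig()
canonical_graph_config_proto.ParseFromString(validated_graph.binary_config)
print(canonical_graph_config_proto)
```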
I noticed that iris_landmark_cpu.pbtxt uses "TfLiteInferenceCalculator" instead of "InferenceCalculator". So I modified iris_landmark_cpu.pbtxt, switching every TfLite-prefixed calculator to its non-TfLite counterpart, for example: TfLiteInferenceCalculator -> InferenceCalculator; TfLiteConverterCalculator -> ImageToTensorCalculator; SplitTfLiteTensorVectorCalculator -> SplitTensorVectorCalculator; and so on (see the sketch below).
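For instance, the inference node changes roughly like this (stream names and model path here are illustrative, written in the style of the stock graphs; note that the options extension name changes along with the calculator):

```
# Before: the legacy TfLite calculator with its own options extension.
node {
  calculator: "TfLiteInferenceCalculator"
  input_stream: "TENSORS:input_tensors"
  output_stream: "TENSORS:output_tensors"
  options: {
    [mediapipe.TfLiteInferenceCalculatorOptions.ext] {
      model_path: "mediapipe/modules/iris_landmark/iris_landmark.tflite"
    }
  }
}

# After: the newer calculator; the options extension changes accordingly.
node {
  calculator: "InferenceCalculator"
  input_stream: "TENSORS:input_tensors"
  output_stream: "TENSORS:output_tensors"
  options: {
    [mediapipe.InferenceCalculatorOptions.ext] {
      model_path: "mediapipe/modules/iris_landmark/iris_landmark.tflite"
    }
  }
}
```

The surrounding nodes have to change too, because TfLiteInferenceCalculator consumes std::vector<TfLiteTensor> while InferenceCalculator consumes the newer Tensor type, which is why the converter and split calculators are swapped as well.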
After these changes, everything works and the iris landmark Python interface can be invoked correctly.
From what I can see, something is wrong with the parsing of TfLiteInferenceCalculator options, but I cannot narrow the error down any further.
UPDATE: I found another calculator with the same problem: the FaceGeometryPipelineCalculator in the face_geometry module. This calculator requires a metadata binarypb file, and after parsing, the path to the binarypb file in its options disappeared.
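For reference, the options block in question looks roughly like this in the stock face_geometry graphs (I am reproducing the field name and path from memory, so treat them as illustrative):

```
node {
  calculator: "FaceGeometryPipelineCalculator"
  input_side_packet: "ENVIRONMENT:environment"
  input_stream: "IMAGE_SIZE:image_size"
  input_stream: "MULTI_FACE_LANDMARKS:multi_face_landmarks"
  output_stream: "MULTI_FACE_GEOMETRY:multi_face_geometry"
  options: {
    [mediapipe.FaceGeometryPipelineCalculatorOptions.ext] {
      # After parsing, this path disappears from the options.
      metadata_path: "mediapipe/modules/face_geometry/data/geometry_pipeline_metadata_landmarks.binarypb"
    }
  }
}
```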
BTW, can anyone tell me what the difference is between TfLiteInferenceCalculator and InferenceCalculator, given that InferenceCalculator can run TFLite models as well?