blayer opened this issue 7 months ago
Call stack here.
I am facing the same error. Did you manage to sort out this problem?
Do you see the log message that should get printed by LogMessageFatal()?
The error message is the same as in https://github.com/google/flutter-mediapipe/issues/56. It appears to be an issue across all platforms.
Error initializing LlmInference: failedToInitializeSession(Optional("ValidatedGraphConfig Initialization failed.\nNo registered object with name: TokenizerCalculator; Unable to find Calculator \"TokenizerCalculator\"\nNo registered object with name: DetokenizerCalculator; Unable to find Calculator \"DetokenizerCalculator\"\nNo registered object with name: LlmGpuCalculator; Unable to find Calculator \"LlmGpuCalculator\"\nNo registered object with name: TokenCostCalculator; Unable to find Calculator \"TokenCostCalculator\"\nNo registered object with name: ModelDataCalculator; Unable to find Calculator \"ModelDataCalculator\""))
@alifatmi I was able to identify the root cause: it is related to a missing linker step for libMediaPipeTasksGenAIC_device.a and libMediaPipeTasksGenAIC_simulator.a, because our podspec does not force-load those two static libraries.

You can try adding

s.pod_target_xcconfig = {
  'OTHER_LDFLAGS[sdk=iphoneos*]' => '-force_load "$(PODS_ROOT)/MediaPipeTasksGenAIC/frameworks/genai_libraries/libMediaPipeTasksGenAIC_device.a"',
  'OTHER_LDFLAGS[sdk=iphonesimulator*]' => '-force_load "$(PODS_ROOT)/MediaPipeTasksGenAIC/frameworks/genai_libraries/libMediaPipeTasksGenAIC_simulator.a"'
}

to the .podspec file if you are using Bazel.
Thanks for the update. I am glad you managed to get the CocoaPods working.
The CocoaPods are built using this script: https://github.com/google-ai-edge/mediapipe/blob/master/mediapipe/tasks/ios/build_ios_framework.sh. This might help you if you want to link the libraries directly using Bazel.
I was trying to integrate the model into our iOS app via CocoaPods and Bazel, building in the iOS simulator with Xcode 14.1 and an iPhone 14 Pro simulator. Building and compiling worked without any problem, but the app crashed when initializing LlmInference.

Here is my example code:

Here is the crash log:
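For context, the initialization in question looks roughly like the following sketch, assuming the MediaPipeTasksGenAI Swift API (`LlmInference.Options`, `generateResponse(inputText:)`); the model file name here is a placeholder, not the exact code from this report:

```swift
import MediaPipeTasksGenAI

// Hypothetical minimal setup: the bundled model name is a placeholder.
guard let modelPath = Bundle.main.path(forResource: "model", ofType: "bin") else {
    fatalError("Model file not found in app bundle")
}

let options = LlmInference.Options(modelPath: modelPath)

do {
    // The "No registered object with name: ... Calculator" failure
    // reported above surfaces inside this initializer, before any
    // inference is actually run.
    let llmInference = try LlmInference(options: options)
    let response = try llmInference.generateResponse(inputText: "Hello")
    print(response)
} catch {
    print("Error initializing LlmInference: \(error)")
}
```

The failure at construction time is consistent with the linker explanation above: the calculators are registered via static initializers in the .a files, so if those archives are not force-loaded, the graph config cannot find them at runtime.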