microsoft / onnxruntime-inference-examples

Examples for using ONNX Runtime for machine learning inferencing.
MIT License

Failed running on iPhone (14 Pro Max) #443

Closed: privateLLM2024 closed this issue 3 months ago

privateLLM2024 commented 3 months ago

I compiled and installed the phi3 example locally, but I hit this error when trying to generate tokens:

libc++abi: terminating due to uncaught exception of type std::runtime_error: Load model from /private/var/containers/Bundle/Application/D5694505-3CF9-4FAF-9722-4C1E19C97BB1/LocalLLM.app/phi3-mini-128k-instruct-cpu-int4-rtn-block-32-acc-level-4.onnx failed:Protobuf parsing failed.

Is there any way to fix this error?
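A "Protobuf parsing failed" message at load time generally means the bytes on disk are not a serialized ONNX model at all, e.g. a Git LFS pointer stub left behind by cloning without `git lfs pull`, or a truncated download. A minimal diagnostic sketch (the function name and sample path are illustrative, not part of the example app):

```python
from pathlib import Path

# Hypothetical helper: detect the most common cause of "Protobuf parsing
# failed" -- the .onnx file is a Git LFS pointer stub instead of model data.
def looks_like_lfs_pointer(path: str) -> bool:
    """Return True if the file starts with a Git LFS pointer header."""
    head = Path(path).read_bytes()[:64]
    return head.startswith(b"version https://git-lfs.github.com/spec/")

# Usage sketch: a real int4 phi3 model is gigabytes; a pointer stub is ~130 bytes.
# if looks_like_lfs_pointer("phi3-mini-128k-instruct-cpu-int4-rtn-block-32-acc-level-4.onnx"):
#     print("Re-download the model (e.g. `git lfs pull`) before bundling it.")
```

If the file checks out, also confirm the bundled copy was not mangled by the Xcode build step (the model should be added as a plain resource, not compiled or compressed).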