This issue is for a: (mark with an `x`)
- [ ] bug report -> please search issues before submitting
- [ ] feature request
- [x] documentation issue or request
- [ ] regression (a behavior that used to work and stopped in a new release)
Minimal steps to reproduce
1. Clone the onnxruntime-genai repository.
2. Ensure that ONNX Runtime is built for iOS.
3. Run the build script for onnxruntime-genai with the following command:
Any log messages given by the failure

```
CMake Error at cmake/global_variables.cmake:54 (message):
  Expected the ONNX Runtime library to be found at
  /Users/xxxx/Documents/xx/phi3/onnxruntime/onnxruntime-genai/ort/lib/libonnxruntime.dylib.
  Actual: Not found.
Call Stack (most recent call first):
  CMakeLists.txt:18 (include)
```
Expected/desired behavior
The build script should locate the ONNX Runtime library at the specified path and proceed with the build process without errors.
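As a sanity check before re-running the build, one can verify that the library is where the error message says it is expected. This is only a sketch based on the path printed in the log; the exact `ort/` layout that onnxruntime-genai expects is an assumption and should be confirmed against `cmake/global_variables.cmake`:

```shell
# Check for the ONNX Runtime library at the path the CMake error reports
# (relative to the onnxruntime-genai checkout; layout assumed from the log).
ORT_LIB="ort/lib/libonnxruntime.dylib"
if [ -f "$ORT_LIB" ]; then
    echo "library present: $ORT_LIB"
else
    echo "library missing: $ORT_LIB"
fi
```

If the file is missing, copying the built `libonnxruntime.dylib` (and the matching ONNX Runtime headers) into `ort/lib` and `ort/include` before invoking the build script is a plausible workaround, though the required layout should be verified against the CMake check that raises the error.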
OS and Version?
macOS Sonoma 14.2.1 (23C71)
Versions
CMake: 3.28.0
Python: 3.10 (conda)
ONNX Runtime: Built from the latest source