Unlocking Generative AI with Phi-3-mini: A Guide to Inference and Deployment

Discover how Phi-3-mini, a new series of models from Microsoft, enables deployment of Large Language Models (LLMs) on edge devices and IoT devices. Learn how to use Semantic Kernel, Ollama/LlamaEdge, and ONNX Runtime to access and infer Phi-3-mini models, and explore the possibilities of generative AI in various application scenarios.

Features

Inference the Phi-3-mini model with:

- Semantic Kernel
- Ollama / LlamaEdge
- ONNX Runtime (see the sketch below)
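For the ONNX Runtime path, the snippet below is a minimal sketch of token-by-token generation with the onnxruntime-genai Python package. It assumes you have already downloaded an ONNX build of Phi-3-mini (for example, Phi-3-mini-4k-instruct) to a local folder; the model path, prompt, and generation options are placeholder assumptions, and the exact API surface may differ between onnxruntime-genai versions.

```python
# Minimal sketch: streaming generation with onnxruntime-genai (API may vary by version).
import onnxruntime_genai as og

# Placeholder path: a local folder containing an ONNX build of Phi-3-mini.
model = og.Model("./phi-3-mini-4k-instruct-onnx")
tokenizer = og.Tokenizer(model)
stream = tokenizer.create_stream()

# Phi-3 chat template: a user turn followed by the assistant tag.
prompt = "<|user|>\nWrite a haiku about edge AI.<|end|>\n<|assistant|>\n"
input_tokens = tokenizer.encode(prompt)

params = og.GeneratorParams(model)
params.set_search_options(max_length=256, temperature=0.7)
params.input_ids = input_tokens

generator = og.Generator(model, params)
while not generator.is_done():
    generator.compute_logits()
    generator.generate_next_token()
    # Decode and print each new token as it is produced.
    print(stream.decode(generator.get_next_tokens()[0]), end="", flush=True)
```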

Getting Started

Prerequisites

Guideline

Please read my blog post at https://aka.ms/phi3gettingstarted for step-by-step instructions on running the demo.
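As a quick local smoke test of the Ollama route, the snippet below is a minimal sketch of prompting Phi-3-mini through a local Ollama server from Python. It assumes Ollama is installed and running on its default port and that the model has been pulled (for example with `ollama pull phi3`); the model tag and prompt are assumptions and may differ from the setup described in the blog.

```python
# Minimal sketch: prompting Phi-3-mini through a local Ollama server.
# Assumes Ollama is running on its default port and `ollama pull phi3` has been done.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi3",  # assumed Ollama tag for Phi-3-mini
        "prompt": "Explain what a small language model (SLM) is in one sentence.",
        "stream": False,  # return the full answer as a single JSON object
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```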

Please read my blog post and follow the guidelines to run the iPhone demo.

Resources