The Azure AI proxy service facilitates easy access to Azure AI resources for workshops and hackathons. It offers a Playground-like interface and supports Azure AI SDKs. Access is granted through a time-limited API key and endpoint.
Currently, the LLM support expects models to respond on the Azure OpenAI endpoint structure `/openai/deployments/microsoft/<model-name>/chat/completions` (and similar). This means it can't be used with locally hosted models, such as those served by ollama, and possibly not with BYO models deployed to Azure AI services (phi3, llama, etc.).
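As a concrete illustration of the mismatch, the two chat-completions URL shapes differ as sketched below (hypothetical helper functions, not part of the proxy codebase; the ollama port and base URLs are assumptions):

```python
# Sketch of the two chat-completions URL shapes. These helpers are
# illustrative only and do not exist in the proxy service.

def azure_openai_chat_url(endpoint: str, deployment: str) -> str:
    # Azure OpenAI routes requests per deployment name.
    return f"{endpoint}/openai/deployments/{deployment}/chat/completions"

def openai_compatible_chat_url(base_url: str) -> str:
    # Local/BYO servers such as ollama expose a flat OpenAI-compatible path.
    return f"{base_url}/v1/chat/completions"

print(azure_openai_chat_url("https://example.openai.azure.com", "my-deployment"))
print(openai_compatible_chat_url("http://localhost:11434"))
```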
This will require:
- [ ] New model type in the PG enum and equivalent .NET mapping
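The checklist item above could be sketched as follows, with Python standing in for the PG enum and its .NET mapping; every name here is an assumption, not the service's actual schema:

```python
from enum import Enum

# Hypothetical model-type enum mirroring a PG enum and its .NET mapping.
class ModelType(Enum):
    AZURE_OPENAI = "azure-openai"            # existing: /openai/deployments/... path
    OPENAI_COMPATIBLE = "openai-compatible"  # proposed: flat /v1/chat/completions path

def chat_completions_path(model_type: ModelType, deployment: str) -> str:
    # Route to the URL shape the target server expects.
    if model_type is ModelType.AZURE_OPENAI:
        return f"/openai/deployments/{deployment}/chat/completions"
    return "/v1/chat/completions"
```

A new enum value like this would let the proxy pick the request path per model rather than assuming the Azure OpenAI deployment layout everywhere.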