microsoft / azure-openai-service-proxy

The Azure AI proxy service facilitates easy access to Azure AI resources for workshops and hackathons. It offers a Playground-like interface and supports Azure AI SDKs. Access is granted through a time-limited API key and endpoint.
https://microsoft.github.io/azure-openai-service-proxy/
MIT License

Generic LLM support #315

Open aaronpowell opened 2 months ago

aaronpowell commented 2 months ago

Currently, the LLM support expects models to respond at the Azure OpenAI endpoint structure of `/openai/deployments/microsoft/<model-name>/chat/completions` (and similar). This means we can't use it with locally hosted models, such as with Ollama, and possibly not with BYO models in Azure AI services (Phi-3, Llama, etc.).
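To illustrate the mismatch: Azure OpenAI encodes the deployment name and an `api-version` in the URL path, while OpenAI-compatible servers such as Ollama expose a single `/chat/completions` path and take the model name in the request body. A minimal sketch of the two URL shapes (function names, the example api-version, and endpoints are illustrative, not part of the proxy's code):

```python
def azure_openai_chat_url(endpoint: str, deployment: str,
                          api_version: str = "2024-02-01") -> str:
    """Azure OpenAI style: the deployment is part of the path,
    and the API version is a query parameter."""
    return (f"{endpoint}/openai/deployments/{deployment}"
            f"/chat/completions?api-version={api_version}")


def openai_compatible_chat_url(base_url: str) -> str:
    """OpenAI-compatible style (e.g. Ollama's /v1 API): one fixed path;
    the model is selected via the "model" field in the JSON body."""
    return f"{base_url}/chat/completions"


print(azure_openai_chat_url("https://example.openai.azure.com", "gpt-4o"))
print(openai_compatible_chat_url("http://localhost:11434/v1"))
```

Generic LLM support would mean the proxy can route to the second shape as well, rather than assuming the deployment-in-path convention.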

This will require: