microsoft / prompty

Prompty makes it easy to create, manage, debug, and evaluate LLM prompts for your AI applications. Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers.
https://prompty.ai

enable the base_url parameter in openai type for better OSS support #7

Closed lebaro-msft closed 1 month ago

lebaro-msft commented 1 month ago

One feature that would be super useful in Prompty is the ability to add a base_url parameter to the "openai" type in the model configuration. One of the cool features of the AI Toolkit for VS Code extension is that it hosts a local development inference server on localhost, exposing an OpenAI-compatible API for our Phi models and the other OSS models it supports out of the box. Many other tools, such as Ollama and llama.cpp, also expose OpenAI-compatible endpoints. Supporting base_url would make Prompty a great tool for OpenAI as well as OSS models (including local LLMs/SLMs), since most inference servers offer an OpenAI-compatible API. I think Azure AI MaaS also supports OpenAI-compatible endpoints for our Phi and Phi Vision model deployments.

So the prompty model config settings would look something like this:

{ "name": "Phi-3-mini-128k-directml-int4-awq-block-128-onnx", "type": "openai", "api_key": "dummy", "base_url": "http://localhost:5272/v1" }

Typically, if you are calling the openai Python library and want to override the base URL, you would do something like this:

```python
from openai import OpenAI

openai_api_key = "dummy"
openai_api_base = "http://localhost:5272/v1"

client = OpenAI(
    api_key=openai_api_key,
    base_url=openai_api_base,
)
```
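Put together, a minimal end-to-end sketch against such a local server could look like the following. The port and model name are just the example values from the config above, and local servers generally ignore the API key even though the client requires one:

```python
from openai import OpenAI

# Point the official client at a local OpenAI-compatible server
# (AI Toolkit, Ollama, llama.cpp, ...) instead of api.openai.com.
client = OpenAI(
    api_key="dummy",  # required by the client, typically ignored by local servers
    base_url="http://localhost:5272/v1",
)

response = client.chat.completions.create(
    model="Phi-3-mini-128k-directml-int4-awq-block-128-onnx",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```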

Can you please add base_url to the settings schema for the openai type, and use it to override the URL used when talking to the OpenAI endpoint?

wayliums commented 1 month ago

good suggestion, working on it

wayliums commented 1 month ago

this has been added. a new extension has been published today