microsoft / prompty

Prompty makes it easy to create, manage, debug, and evaluate LLM prompts for your AI applications. Prompty is an asset class and format for LLM prompts designed to enhance observability, understandability, and portability for developers.
https://prompty.ai
MIT License

Is azure_serverless supported as a model configuration type? #17

Closed: ChristopherL-STCU closed this issue 1 month ago

ChristopherL-STCU commented 3 months ago

I see in the Prompty Language Spec that azure_serverless is listed as a configuration type, but when I switch the default prompty file to use it, I receive this error:

[error] Error: azure_serverless type isn't supported

Here's the prompty file:

---
name: ExamplePrompt
description: A prompt that uses context to ground an incoming question
authors:
  - Seth Juarez
model:
  api: chat
  configuration:
    type: azure_serverless
    azure_endpoint: https://…
  parameters:
    max_tokens: 3000
sample:
  firstName: Seth
  context: >
    The Alpine Explorer Tent boasts a detachable divider for privacy, 
    numerous mesh windows and adjustable vents for ventilation, and 
    a waterproof design. It even has a built-in gear loft for storing 
    your outdoor essentials. In short, it's a blend of privacy, comfort, 
    and convenience, making it your second home in the heart of nature!
  question: What can you tell me about your tents?
---

Is azure_serverless a supported type?

sethjuarez commented 3 months ago

You are correct, the spec calls for openai, azure_openai, and azure_serverless support. I don't think we've implemented azure_serverless in the extension yet. Maybe @wayliums knows.
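For context, a rough sketch of how a configuration type is selected in a prompty file's frontmatter. The azure_openai fields below follow commonly published examples and are shown only to contrast with the azure_serverless block in the original report; the `${env:...}` reference is an assumption about how the endpoint would normally be supplied:

```yaml
model:
  api: chat
  configuration:
    # azure_openai is presumably the type used by the default prompty file
    type: azure_openai
    azure_endpoint: ${env:AZURE_OPENAI_ENDPOINT}  # assumed env-style substitution
    azure_deployment: gpt-35-turbo                # illustrative deployment name
  parameters:
    max_tokens: 3000
```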

wayliums commented 3 months ago

@ChristopherL-STCU we are working on that support. Which serverless model are you trying to use?

ChristopherL-STCU commented 3 months ago

I was looking at using the Phi models. OpenAI models would be great too, once access opens up to subscriptions that aren't part of an enterprise.

wayliums commented 3 months ago

@ChristopherL-STCU oh, Phi is already supported. See this issue: https://github.com/microsoft/prompty/issues/7

If you use Ollama or the AI Toolkit for VS Code, you can set the base URL. We are working on more native integration.

For MaaS serverless, I'd need to see the exact API, though. Let me know if this helps.
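As a rough illustration of the base-URL approach mentioned above (not an official snippet): Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1, so a locally served Phi model could be targeted with something like the following. The openai configuration type and the base_url key are assumptions here; the base URL may instead be set in the extension settings, as issue #7 discusses:

```yaml
model:
  api: chat
  configuration:
    type: openai                          # assumption: OpenAI-compatible client
    name: phi3                            # model name as registered in Ollama
    base_url: http://localhost:11434/v1   # Ollama's OpenAI-compatible endpoint; key name is an assumption
  parameters:
    max_tokens: 1000
```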

ChristopherL-STCU commented 3 months ago

Thanks @wayliums. I don't actually need this option; I'm just learning and exploring. It seemed like it wasn't working as expected, so I thought I'd submit an issue.

MitanshuGada commented 3 months ago

I have a Phi-3-mini model deployed on Azure. I am trying to use Prompty with the azure_serverless type and I am running into this error.

Call Stack: Error: azure_serverless type isn't supported

sethjuarez commented 3 months ago

We are working on supporting it - apologies for the delay :(

sethjuarez commented 1 month ago

Added support for azure_serverless, although we opted to name the type just serverless. It's in the latest release. I added an example of a serverless prompty to the tests for you to review.
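For anyone landing here later, a minimal sketch of what the new serverless configuration might look like; the field names and values below are placeholders, so treat the serverless prompty in the repo's tests as the authoritative example:

```yaml
model:
  api: chat
  configuration:
    type: serverless                 # replaces the azure_serverless name from the spec discussion
    endpoint: https://…              # your MaaS deployment endpoint (placeholder)
    model: phi-3-mini-4k-instruct    # placeholder model name
  parameters:
    max_tokens: 3000
```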