Closed ChristopherL-STCU closed 1 month ago
You are correct, the spec calls for openai, azure_openai, and azure_serverless support. I don't think we've implemented azure_serverless yet in the extension. Maybe @wayliums knows.
@ChristopherL-STCU we are working on that support. Which serverless model are you trying to use?
I was looking at using the Phi models. OpenAI would be great too, once access to them opens up to subscriptions not part of an enterprise.
@ChristopherL-STCU oh phi is already supported. See this issue. https://github.com/microsoft/prompty/issues/7
If you use ollama or AI Toolkit for VSCode, you can set the base url. We are working on more native integration.
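For example, a local ollama setup can be reached through its OpenAI-compatible endpoint. This is only a hypothetical sketch: the frontmatter key names (base_url, name, etc.) are assumptions about the prompty configuration shape, not confirmed spec fields.

```yaml
---
name: phi3-local
model:
  api: chat
  configuration:
    type: openai
    # ollama exposes an OpenAI-compatible API at this port by default
    base_url: http://localhost:11434/v1
    name: phi3        # assumed key for the model name
---
Why is the sky blue?
```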
For MaaS serverless, though, I need to see exactly what the API looks like. Let me know if this helps.
Thanks @wayliums. I don't need to use this option; I'm just learning and exploring. It seemed like it wasn't working as expected, so I thought I'd submit an issue. Thanks!
I have a phi-3-mini model deployed on Azure. I am trying to use prompty with the azure_serverless type and I am running into this error:
Call Stack: Error: azure_serverless type isn't supported
We are working on supporting it - apologies for the delay :(
Added support for azure_serverless, although we opted for just serverless. It's in the latest release. I added an example of a serverless prompty to the tests for you to review.
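A serverless configuration might look roughly like the following. This is a hypothetical sketch, not the actual test file: the endpoint URL pattern and key names (endpoint, model) are assumptions.

```yaml
---
name: phi3-serverless
model:
  api: chat
  configuration:
    type: serverless
    # assumed keys; an Azure MaaS deployment exposes an inference endpoint
    endpoint: https://<your-deployment>.inference.ai.azure.com
    model: phi-3-mini
---
Summarize the following text in one sentence.
```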
I see in the Prompty Language Spec that azure_serverless is listed as a configuration type. But when I try to switch the default prompty file to use it, I receive this error:
Here's the prompty file:
Is azure_serverless a supported type?