visagansanthanam-unisys opened 6 months ago
@visagansanthanam-unisys we have a bit of extra documentation here about how to set up with Azure OpenAI Service. Let me know if that doesn't end up being the solution to your problem and I'll take a look right away!
In either case, we can look at how the documentation/setup experience might be improved.
@sestinj I am trying to connect self-hosted models like Llama and StarCoder, deployed in Azure ML Studio, to the Continue plugin. Do we have a provider for open-source models hosted in Azure ML Studio?
@visagansanthanam-unisys I believe that models deployed via Azure ML Studio all have different input formats, which means there is no good way for us to have built-in support, though it is possible to build a CustomLLM using config.ts.
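To make the CustomLLM suggestion concrete, here is a minimal sketch of what such an entry in `config.ts` might look like. The Azure ML payload shape (`input_data.input_string`), the endpoint URL, the key placeholder, and the `streamCompletion` generator shape are all assumptions for illustration — the real request/response schema depends on your model's scoring script, and the CustomLLM interface should be checked against Continue's docs.

```typescript
// Hypothetical CustomLLM sketch for Continue's config.ts.
// All field names below are assumptions; adjust to your deployment.

interface ChatMessage {
  role: string;
  content: string;
}

// Placeholder values — replace with your real endpoint and key.
const AZURE_ML_ENDPOINT = "https://<your-endpoint>.inference.ml.azure.com/score";
const AZURE_ML_KEY = "<your-key>";

// Build a request body in one common Azure ML managed-endpoint shape
// (input_data.input_string); your scoring script may expect something else.
function buildAzureMLPayload(messages: ChatMessage[]): {
  input_data: { input_string: string[] };
} {
  return {
    input_data: {
      input_string: messages.map((m) => `${m.role}: ${m.content}`),
    },
  };
}

// Sketch of a custom LLM entry; the exact CustomLLM interface Continue
// expects (options, streamCompletion signature) should be verified.
const azureMLCustomLLM = {
  options: { title: "azure-ml-llama", model: "llama-2-7b" },
  streamCompletion: async function* (prompt: string) {
    const res = await fetch(AZURE_ML_ENDPOINT, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${AZURE_ML_KEY}`,
      },
      body: JSON.stringify(buildAzureMLPayload([{ role: "user", content: prompt }])),
    });
    const data = await res.json();
    // Response shape also varies per model; unwrap as appropriate.
    yield typeof data === "string" ? data : JSON.stringify(data);
  },
};
```

The key point from the comment above stands: because each Azure ML deployment defines its own scoring schema, the payload builder is the part you would customize per model.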
Are there constraints that led you to choose Azure ML Studio over other options which might more easily be integrated?
Before submitting your bug report
Relevant environment info
Description
I am looking into integrating LLM models hosted in Azure ML Studio. However, I could not find a provider configuration for Azure ML Studio. I tried reusing the Azure OpenAI configuration with the Azure ML endpoint substituted in, but it doesn't work.
To reproduce
1. Open config.json and configure the Azure ML Studio endpoint.
2. Specify the provider as "openai".
3. Try a chat to test.
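For reference, the attempted configuration probably looked roughly like the sketch below. The field names follow Continue's `config.json` model schema as I understand it, and the endpoint/key values are placeholders — this is the setup that does not work, included only to show what was tried.

```json
{
  "models": [
    {
      "title": "Azure ML Llama (attempted, not working)",
      "provider": "openai",
      "model": "llama-2-7b",
      "apiBase": "https://<your-endpoint>.inference.ml.azure.com/score",
      "apiKey": "<AZURE_ML_KEY>"
    }
  ]
}
```

As noted in the comments above, this fails because the "openai" provider sends OpenAI-format requests, which Azure ML Studio scoring endpoints do not generally accept.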
Log output