Closed · dstarr closed this 1 week ago
👋 Thanks for contributing @dstarr! We will review the issue and get back to you soon.
### Try to follow this pattern:

```csharp
var endpoint = new Uri(System.Environment.GetEnvironmentVariable("AZURE_AI_CHAT_ENDPOINT"));
var credential = new AzureKeyCredential(System.Environment.GetEnvironmentVariable("AZURE_AI_CHAT_KEY"));

var client = new ChatCompletionsClient(endpoint, credential, new AzureAIInferenceClientOptions());

var requestOptions = new ChatCompletionsOptions()
{
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("How many feet are in a mile?"),
    },
};

Response<ChatCompletions> response = client.Complete(requestOptions);
```
I am working on the second lab here.
I have created a Hub and a Project in Azure AI Foundry, as the first lab guided. Now I want to talk to the model I deployed.
The following code works:
The following code throws a "Resource not found" exception using the same endpoint, credential, and model.
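For reference, here is roughly the shape of the failing call, reconstructed from the pattern quoted above. This is a sketch under assumptions, not my exact code: the endpoint form and the deployment name are placeholders. My understanding is that for an AI Foundry resource the inference endpoint typically ends in `/models`, and the deployment name is passed via the `Model` property on `ChatCompletionsOptions`; a mismatch in either is one possible cause of a 404 "Resource not found".

```csharp
using Azure;
using Azure.AI.Inference;

// Assumption: AZURE_AI_CHAT_ENDPOINT points at the Foundry inference route,
// e.g. https://<resource>.services.ai.azure.com/models (placeholder, not my value).
var endpoint = new Uri(System.Environment.GetEnvironmentVariable("AZURE_AI_CHAT_ENDPOINT"));
var credential = new AzureKeyCredential(System.Environment.GetEnvironmentVariable("AZURE_AI_CHAT_KEY"));

var client = new ChatCompletionsClient(endpoint, credential, new AzureAIInferenceClientOptions());

var requestOptions = new ChatCompletionsOptions()
{
    // "my-deployment" is a hypothetical placeholder for the model deployment
    // name created in the Project; omitting it is another possible 404 cause.
    Model = "my-deployment",
    Messages =
    {
        new ChatRequestSystemMessage("You are a helpful assistant."),
        new ChatRequestUserMessage("How many feet are in a mile?"),
    },
};

Response<ChatCompletions> response = client.Complete(requestOptions);
System.Console.WriteLine(response.Value.Content);
```

(This requires a live Azure AI Foundry deployment to run, so I cannot show its output here.)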
Does this mean using Azure.AI.Inference will not work with AI Foundry? Your instructions seem to indicate it should work. Specifically, you state about the Inference code:
My full code file is here.
Thank you for having a look.