akulbe opened 9 months ago
I can't configure Terminal Chat to work either; I get a different warning: "Resource not found". I have verified the endpoint works when I make a curl call. One concern I have is: what is the deployment name supposed to be? The docs don't list a name to use, and forming an API request requires a deployment name (unless it does some model lookup and uses the first matching model?). The docs show a TerminalSuggestions model deployment name, but that doesn't work either. I am using Canary 1.22.1661.0.
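For anyone hitting the same wall: the Azure OpenAI chat-completions URL is built from the *resource* name and the *deployment* name (not the underlying model name), which is easy to get wrong. A minimal sketch of how the URL is formed, with placeholder resource/deployment names and an api-version you would substitute with your own:

```shell
# Placeholders — replace with your own values from the Azure portal.
RESOURCE="my-resource"          # the part before .openai.azure.com
DEPLOYMENT="my-gpt-deployment"  # the deployment name, NOT the model name
API_VERSION="2024-02-15-preview"

# URL shape: /openai/deployments/{deployment}/chat/completions?api-version=...
URL="https://${RESOURCE}.openai.azure.com/openai/deployments/${DEPLOYMENT}/chat/completions?api-version=${API_VERSION}"
echo "$URL"

# Uncomment to actually test against your resource (needs a valid key):
# curl -s "$URL" \
#   -H "Content-Type: application/json" \
#   -H "api-key: $AZURE_OPENAI_KEY" \
#   -d '{"messages":[{"role":"user","content":"ping"}]}'
```

If the deployment name in the path doesn't match an existing deployment on that resource, the service returns "Resource not found" even when the base endpoint itself is reachable, which matches the symptom above.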
@akulbe FYI, my Azure OpenAI service endpoint now uses .openai.azure.com and not .cognitiveservices.azure.com, so maybe check your Azure portal for a new endpoint?
Make sure you are using the correct endpoint! The one in your screenshot is not the endpoint for your deployment; you need to go into your deployment's "playground" in Azure AI Studio and then click "View code" to find the correct one.
Agreed that it is quite difficult to find the correct endpoint, unfortunately. Even in my screenshot, the correct endpoint is not the one in the textbox — it is the one in the curl command (the one that is blanked out but pointed to with the red arrow).
Windows Terminal version
1.20.3401.0
Windows build number
22635.2841
Other Software
No response
Steps to reproduce
Expected Behavior
I was expecting similar results as this screenshot shows, where the Terminal is actually connecting to, and interacting with, the LLM.
Actual Behavior
Received the "Not a valid endpoint" error message.