Open azraimahadan opened 1 week ago
Hey @azraimahadan! I'm here to help you with any bugs, questions, or contributions. Let's tackle this issue together!
The `HttpResponseError` with the message `Operation returned an invalid status 'Failed Dependency'` and content `{"detail":"Not Found"}` suggests that there may be an issue with the endpoint URL or the credentials used to access the AzureML service.
Here are a few things you can check and try:
1. **Endpoint URL**: Ensure that the endpoint URL is correct and accessible. Double-check for any typos or errors in the URL.
2. **Credentials**: Make sure that the credentials provided are correct and have the necessary permissions to access the AzureML endpoint. If you're using an API key, ensure it's valid. If using a credential class like `DefaultAzureCredential`, ensure it's properly configured.
3. **Model Availability**: Verify that the model you are trying to access is deployed and available at the specified endpoint.
4. **Network Issues**: Check for any network issues that might be preventing access to the endpoint.
5. **AzureML Configuration**: Ensure that the AzureML endpoint is correctly configured to handle requests from LlamaIndex, including any required headers or authentication methods.
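As a quick local check of the first point, a tiny helper (hypothetical, not part of LlamaIndex) can flag whether the URL being passed looks like a raw AzureML `/score` route rather than an Azure AI inference endpoint, since the two expect different request schemas:

```python
# Hypothetical sanity check, not part of LlamaIndex: flag URLs that point at a
# raw AzureML /score route, which expects a different request schema than the
# chat-completions-style API that AzureAICompletionsModel targets.
from urllib.parse import urlparse

def looks_like_score_endpoint(url: str) -> bool:
    """Return True when the URL path ends in /score (an AzureML scoring route)."""
    return urlparse(url).path.rstrip("/").endswith("/score")

print(looks_like_score_endpoint("https://my-endpoint.eastus2.inference.ml.azure.com/score"))  # True
print(looks_like_score_endpoint("https://my-model.eastus2.models.ai.azure.com"))              # False
```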
If these steps do not resolve the issue, you may need to look into the AzureML service logs for more detailed error information.
To continue talking to Dosu, mention @dosu.
I'm not even entirely sure it's made to handle that? Feel free to check out the source code though, maybe you can spot the issue
I've tried calling the endpoint locally and it works, but it fails when I call it through LlamaIndex's Azure integration. I suspect the integration does not support AzureML endpoints, but maybe you can explain more if I missed anything.
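One plausible explanation for "works locally, fails through LlamaIndex" is a payload-schema mismatch: a custom AzureML `/score` deployment typically accepts its own `input_data` shape, while (an assumption, not confirmed in this issue) `AzureAICompletionsModel` sends a chat-completions-style body. A sketch contrasting the two request bodies:

```python
# Both shapes below are illustrative assumptions; the exact /score schema
# depends on the scoring script of the specific AzureML deployment.
def chat_completions_payload(prompt: str) -> dict:
    # Shape used by chat-completions-style inference APIs.
    return {"messages": [{"role": "user", "content": prompt}]}

def score_payload(prompt: str) -> dict:
    # A common shape for a custom AzureML /score deployment.
    return {"input_data": {"input_string": [prompt]}}

print(chat_completions_payload("The sky is a beautiful blue and"))
print(score_payload("The sky is a beautiful blue and"))
```

If the client sends the first shape to an endpoint expecting the second, an error response like the one in this issue is unsurprising.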
You are probably right. But I don't have access to Azure to test these endpoints, so I pointed you towards the current source code in hopes you could make a PR :)
Bug Description
LlamaIndex fails to wrap an AzureML endpoint. When attempting to use the `AzureAICompletionsModel` from LlamaIndex, an `HttpResponseError` with the message `Operation returned an invalid status 'Failed Dependency'` is raised.

Error traceback:
Version
0.11.19
Steps to Reproduce
1. Set up an AzureML endpoint with the URL `https://<>/score`.
2. Use the LlamaIndex library to create an `AzureAICompletionsModel`.
3. Attempt to call the `.complete()` method with any prompt, such as "The sky is a beautiful blue and".
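The steps above can be sketched as follows (endpoint and key are placeholders, not values from this issue; the imports are deferred inside the function so the sketch loads even without the Azure packages installed):

```python
# Minimal reproduction sketch; endpoint/key are placeholders.
def reproduce(endpoint: str, key: str) -> str:
    # Deferred imports: requires azure-ai-inference and
    # llama-index-llms-azure-inference to actually run.
    from azure.core.credentials import AzureKeyCredential
    from llama_index.llms.azure_inference import AzureAICompletionsModel

    llm = AzureAICompletionsModel(
        endpoint=endpoint,  # e.g. an AzureML endpoint ending in /score
        credential=AzureKeyCredential(key),
    )
    # Against a raw AzureML /score endpoint this raises HttpResponseError:
    # "Operation returned an invalid status 'Failed Dependency'".
    return llm.complete("The sky is a beautiful blue and").text
```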
Relevant Logs/Tracebacks