Azure-Samples / chat-with-your-data-solution-accelerator

A Solution Accelerator for the RAG pattern running in Azure, using Azure AI Search for retrieval and Azure OpenAI large language models to power ChatGPT-style and Q&A experiences. It covers the most common requirements and best practices.
https://azure.microsoft.com/products/search
MIT License
624 stars 294 forks

issue when test chat : The API deployment for this resource does not exist. If you created.. #65

Closed TIDIALLO closed 6 months ago

TIDIALLO commented 7 months ago

Hi,

I was implementing the chat-with-your-data-solution-accelerator solution, but I get an error during the chat test. Description of the error:

/api/conversation/custom:1

Failed to load resource: the server responded with a status of 500 (INTERNAL SERVER ERROR)

{ "error": "415 Unsupported Media Type: Did not attempt to load JSON data because the request Content-Type was not 'application/json'." }
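The 415 body above is Flask's standard response when a handler calls `request.get_json()` but the incoming POST did not carry a `Content-Type: application/json` header. As a hedged illustration (the endpoint path is from this thread; the payload shape is an assumption, not the accelerator's actual schema), a client request to `/api/conversation/custom` would need to be prepared roughly like this:

```python
import json

def build_conversation_request(messages):
    """Prepare body and headers for POST /api/conversation/custom.

    The payload shape here is a guess for illustration; the key point is the
    Content-Type header, without which Flask's request.get_json() raises 415.
    """
    headers = {"Content-Type": "application/json"}
    body = json.dumps({"messages": messages, "conversation_id": None})
    return headers, body

headers, body = build_conversation_request([{"role": "user", "content": "hi"}])
```

Using a client helper that sets JSON automatically (e.g. `requests.post(url, json=payload)`) has the same effect.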

gmndrg commented 7 months ago

Hello @TIDIALLO can you please provide all of your repro steps to understand when you get that error?

TIDIALLO commented 7 months ago

Hi @gmndrg, I redid the implementation on the Azure side, following all the steps, and everything went well up to deployment. But when I start testing the chat, it generates the error; it does not even answer a greeting (for example, "hi").

gmndrg commented 7 months ago

Have you uploaded any files so you can start asking questions about your content? What kind of files have you uploaded to the index via the admin portal?

TIDIALLO commented 7 months ago

Yes, I loaded the ones in the doc folder into a container in my storage account via the admin interface (which I found cool, with Streamlit).

gmndrg commented 7 months ago

Unfortunately, this looks related to some (or one) of the files you uploaded. Unless you share your documents online so we can check what the issue may be (which is not recommended unless they are public), there is not much guidance we can provide.

krohm commented 7 months ago

Hi, I'm also hitting a 500 when calling api/conversation/custom, with {"error":"Invalid connection string"}. The environment variables and key look good.

gmndrg commented 6 months ago

@krohm one of your environment variables for a connection string must be incorrect, since that is a product error, not a repo error. Please double-check. Also check whether you have any firewalls or other setup outside the repo config that is preventing the keys from being checked.

krohm commented 6 months ago

Hi, it was the APPINSIGHTS_CONNECTION_STRING setting, which was missing... :-(
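A missing setting like this surfaces only as a runtime 500, so a fail-fast check at startup can save debugging time. A minimal sketch, assuming a list of required variable names (the names below are examples from this thread, not the accelerator's authoritative list):

```python
import os

# Example names only; the accelerator may require a different set.
REQUIRED_SETTINGS = [
    "APPINSIGHTS_CONNECTION_STRING",
    "AZURE_OPENAI_API_KEY",
    "AZURE_SEARCH_SERVICE",
]

def missing_settings(env=None):
    """Return the names of required settings that are unset or empty."""
    if env is None:
        env = os.environ
    return [name for name in REQUIRED_SETTINGS if not env.get(name)]

# Calling missing_settings() at app startup and raising if the list is
# non-empty turns a vague "Invalid connection string" 500 into a clear error.
```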

TIDIALLO commented 6 months ago

Hi @krohm, I would like to know whether you did the implementation locally or directly in Azure? Thanks in advance.

krohm commented 6 months ago

Hi @TIDIALLO, I did the whole thing in Azure, but I did not use the Bicep templates; instead, I wrote down all the necessary components in Terraform.

TIDIALLO commented 6 months ago

(screenshots: ai1, ai2) This is the error that I get.

gmndrg commented 6 months ago

As mentioned above, without all the repro steps we unfortunately cannot reproduce this, since it looks like an error specific to your deployment. We can only recommend redeploying the whole solution from scratch, uploading a doc, and trying to repro; once you hit the error, you may be able to identify whether it is related to an unsupported doc or something similar.

TIDIALLO commented 6 months ago

Okay, I'll start the solution again and see if it all goes well. 😉 It's weird. Thank you.

TIDIALLO commented 6 months ago

@gmndrg is there a possibility of having an inspection of the code done on the Azure side?

gmndrg commented 6 months ago

Hello @TIDIALLO custom code review is not part of the support, unfortunately. We try to help with documentation and general guidance accordingly.

TIDIALLO commented 6 months ago

(screenshot: ai3) This is the error that I get when deploying locally.

gmndrg commented 6 months ago

Please review this post to see if the suggestions help: https://learn.microsoft.com/en-us/answers/questions/1401622/azure-openai-issue-with-endpoint-connection-errno. If you still face the issue after checking DNS, and you are not using a proxy, then please open a support case with the Azure OpenAI team for assistance: https://learn.microsoft.com/en-us/azure/azure-portal/supportability/how-to-create-azure-support-request Thanks.

daniheck-msft commented 6 months ago

Hi team, thank you for building this.

After deploying I get:

Traceback (most recent call last):
  File "/usr/local/src/myscripts/admin/pages/02_Explore_Data.py", line 38, in <module>
    search_client = vector_store_helper.get_vector_store().client
  File "/usr/local/src/myscripts/admin/../utilities/helpers/AzureSearchHelper.py", line 33, in get_vector_store
    vector_search_dimensions=len(llm_helper.get_embedding_model().embed_query("Text")),
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 516, in embed_query
    return self.embed_documents([text])[0]
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 488, in embed_documents
    return self._get_len_safe_embeddings(texts, engine=self.deployment)
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 374, in _get_len_safe_embeddings
    response = embed_with_retry(
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 107, in embed_with_retry
    return _embed_with_retry(**kwargs)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 439, in result
    return self.__get_result()
  File "/usr/local/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.9/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/langchain/embeddings/openai.py", line 104, in _embed_with_retry
    response = embeddings.client.create(**kwargs)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.9/site-packages/openai/api_requestor.py", line 763, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: The API deployment for this resource does not exist. If you created the deployment within the last 5 minutes, please wait a moment and try again.

I seem to face similar issues when using the playground and the chat with your own data capabilities there.

My OpenAI resource name (not the endpoint, right?) seems to be OK, and so does the key.

Any suggestions, or is there something broken because the API for Azure AI Search has changed? I can see that no indexer has been created.
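For context on the resource-name-versus-endpoint question: the "API deployment for this resource does not exist" error typically means the deployment name in the request URL does not match any deployment created under the Azure OpenAI resource (the deployment name can differ from the model name). A hedged sketch of the URL shape Azure OpenAI expects, with the api-version shown only as an example:

```python
def embeddings_url(resource_name: str, deployment_name: str,
                   api_version: str = "2023-05-15") -> str:
    """Build the Azure OpenAI embeddings request URL.

    resource_name is the Azure OpenAI resource name (it becomes the host);
    deployment_name is the name *you* gave the model deployment in the portal,
    which is what must exist for the request to succeed. The api-version here
    is an example value, not necessarily what the accelerator uses.
    """
    endpoint = f"https://{resource_name}.openai.azure.com"
    return (f"{endpoint}/openai/deployments/{deployment_name}"
            f"/embeddings?api-version={api_version}")
```

If the deployment segment of this URL does not exactly match a deployment listed under the resource in the portal, the API returns exactly the error in the traceback above.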

I used this super simple PDF: https://dagrs.berkeley.edu/sites/default/files/2020-01/sample.pdf

kind regards, Daniel

gmndrg commented 6 months ago

@daniheck-msft If you're facing the same issues with chat with your data as well, it might be related to your Azure OpenAI deployment. Please open a support ticket in the portal to engage the Azure OpenAI team: https://learn.microsoft.com/en-us/azure/azure-portal/supportability/how-to-create-azure-support-request