aws-samples / serverless-pdf-chat

LLM-powered document chat using Amazon Bedrock and AWS Serverless
https://aws.amazon.com/blogs/compute/building-a-serverless-document-chat-with-aws-lambda-and-amazon-bedrock/
MIT No Attribution

Document uploaded but not getting processed #32

Closed IshtiyaqKhanAlld closed 7 months ago

IshtiyaqKhanAlld commented 8 months ago

I built and deployed the application twice but am getting the same issue, described below:

  1. The document is uploaded to S3 but never gets processed; I waited 30 minutes for a 130 KB document. Error log details:

```
[ERROR] ValueError: Error raised by inference endpoint: An error occurred (AccessDeniedException) when calling the InvokeModel operation: You don't have access to the model with the specified model ID.
Traceback (most recent call last):
  File "/opt/python/aws_lambda_powertools/logging/logger.py", line 453, in decorate
    return lambda_handler(event, context, *args, **kwargs)
  File "/var/task/main.py", line 57, in lambda_handler
    index_from_loader = index_creator.from_loaders([loader])
  File "/var/task/langchain/indexes/vectorstore.py", line 82, in from_loaders
    return self.from_documents(docs)
  File "/var/task/langchain/indexes/vectorstore.py", line 87, in from_documents
    vectorstore = self.vectorstore_cls.from_documents(
  File "/var/task/langchain/schema/vectorstore.py", line 510, in from_documents
    return cls.from_texts(texts, embedding, metadatas=metadatas, **kwargs)
  File "/var/task/langchain/vectorstores/faiss.py", line 911, in from_texts
    embeddings = embedding.embed_documents(texts)
  File "/var/task/langchain/embeddings/bedrock.py", line 143, in embed_documents
    response = self._embedding_func(text)
  File "/var/task/langchain/embeddings/bedrock.py", line 130, in _embedding_func
    raise ValueError(f"Error raised by inference endpoint: {e}")
```

<img width="950" alt="image" src="https://github.com/aws-samples/serverless-pdf-chat/assets/112078629/7674ec79-edfc-457d-92ed-38bc87ee42a8">

Issue-AWS-Serverless-Pdf-chat

pbv0 commented 8 months ago

Hi, please check that you have activated the Anthropic Claude model in the correct AWS Region. By default this is us-east-1 (independent of where you deploy the SAM stack): https://docs.aws.amazon.com/bedrock/latest/userguide/model-access.html

Let me know if this solves the problem.

pushpsood commented 8 months ago

+1 to @pbv0. While trying to request access I got INVALID_PAYMENT_INSTRUMENT in Bedrock. For now I am using Titan (one does get approval for that model) instead of Llama 2, and I will switch once the ticket is resolved and update here on how it can be fixed.

pbv0 commented 8 months ago

@pushpsood it looks like some third-party model providers require a credit card as the payment method and do not support some other options (e.g. direct debit): https://repost.aws/questions/QU0UOsutrWSSS4nOqgHcIUJg/invalid-payment-instrument-after-requesting-model-access-in-amazon-bedrock

pbv0 commented 7 months ago

Closing this for now, feel free to reopen if the issue remains.