This sample application allows you to ask natural language questions of any PDF document you upload. It combines the text generation and analysis capabilities of an LLM with a vector search of the document content. The solution uses serverless services such as Amazon Bedrock to access foundation models, AWS Lambda to run LangChain, and Amazon DynamoDB for conversational memory.
See the accompanying blog post on the AWS Serverless Blog for a detailed description and follow the deployment instructions below to get started.
Warning This application is not ready for production use. It was written for demonstration and educational purposes. Review the Security section of this README and consult with your security team before deploying this stack. No warranty is implied in this example.
Note This architecture creates resources that have costs associated with them. Please see the AWS Pricing page for details and make sure to understand the costs before deploying this stack.
Clone this repository:

```bash
git clone https://github.com/aws-samples/serverless-pdf-chat.git
```
This application can be used with a variety of Amazon Bedrock models. See Supported models in Amazon Bedrock for a complete list.
By default, this application uses Titan Embeddings G1 - Text to generate embeddings and Anthropic Claude 3 Sonnet for responses.
Important - Before you can use these models with this application, you must request access in the Amazon Bedrock console. See the Model access section of the Amazon Bedrock User Guide for detailed instructions. By default, this application is configured to use Amazon Bedrock in the `us-east-1` Region, so make sure you request model access in that Region (this does not have to be the same Region that you deploy this stack to).
To select your Bedrock model, specify the `ModelId` parameter during the AWS SAM deployment, such as `anthropic.claude-3-sonnet-20240229-v1:0`. See Amazon Bedrock model IDs for a complete list.

The `ModelId` parameter is used in the `GenerateResponseFunction` Lambda function of your AWS SAM template to instantiate LangChain `BedrockChat` and `ConversationalRetrievalChain` objects, which retrieve the relevant context from the indexed PDF content and pass it to the Bedrock model to generate a response.
```python
from langchain.chains import ConversationalRetrievalChain
from langchain_community.chat_models import BedrockChat


def bedrock_chain(faiss_index, memory, human_input, bedrock_runtime):
    # MODEL_ID is populated from the ModelId parameter of the SAM template
    chat = BedrockChat(
        client=bedrock_runtime,
        model_id=MODEL_ID,
        model_kwargs={"temperature": 0.0},
    )
    # Retrieve relevant chunks from the FAISS index and "stuff" them,
    # together with the conversation memory, into the model prompt
    chain = ConversationalRetrievalChain.from_llm(
        llm=chat,
        chain_type="stuff",
        retriever=faiss_index.as_retriever(),
        memory=memory,
        return_source_documents=True,
    )
    response = chain.invoke({"question": human_input})
    return response
```
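To illustrate the retrieval pattern without the LangChain and AWS dependencies, here is a minimal, self-contained sketch (an illustration, not the application's actual code): it ranks document chunks by keyword overlap with the question standing in for the FAISS vector search, "stuffs" the best chunk plus the conversation history into a single prompt, and appends each turn to memory, which is the role DynamoDB plays in the real stack:

```python
import re


def tokens(text):
    """Lowercase word tokens with punctuation stripped."""
    return set(re.findall(r"[a-z0-9]+", text.lower()))


def retrieve(chunks, question, k=1):
    """Rank chunks by keyword overlap with the question (stand-in for FAISS vector search)."""
    return sorted(chunks, key=lambda c: len(tokens(question) & tokens(c)), reverse=True)[:k]


def build_prompt(context, memory, question):
    """'Stuff' the retrieved context and prior turns into a single prompt."""
    history = "\n".join(f"{role}: {text}" for role, text in memory)
    return f"Context:\n{context}\n\nHistory:\n{history}\n\nQuestion: {question}"


chunks = [
    "Invoices are due within 30 days of receipt.",
    "The warranty covers parts and labor for one year.",
]
memory = []  # stand-in for the DynamoDB-backed conversation history
question = "How long is the warranty?"
context = retrieve(chunks, question)[0]
prompt = build_prompt(context, memory, question)
memory.append(("Human", question))  # persist the turn for follow-up questions
```

The real application replaces the keyword overlap with embedding similarity (Titan Embeddings plus FAISS), and the prompt assembly with LangChain's `stuff` chain type.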
AWS Amplify Hosting enables a fully managed deployment of the application's React frontend in an AWS-managed account using Amazon S3 and Amazon CloudFront. You can optionally run the React frontend locally by skipping to Deploy the application with AWS SAM.
To set up Amplify Hosting:
Fork this repository and note the URL of your fork, for example `https://github.com/user/serverless-pdf-chat/`.

Create a new secret called `serverless-pdf-chat-github-token` in AWS Secrets Manager and input your GitHub fine-grained access token as plaintext. Select the Plaintext tab and confirm your secret looks like this:

```
github_pat_T2wyo------------------------------------------------------------------------rs0Pp
```
Change to the `backend` directory and build the application:

```bash
cd backend
sam build
```
Deploy the application into your AWS account:

```bash
sam deploy --guided
```
- For Stack Name, choose `serverless-pdf-chat`.
- For Frontend, specify the environment ("local" or "amplify") for the frontend of the application.
- If you selected "amplify", specify the URL of the forked Git repository containing the application code.
- Specify the Amazon Bedrock model ID, for example `anthropic.claude-3-sonnet-20240229-v1:0`.
- For the remaining options, keep the defaults by pressing the enter key.
AWS SAM will now provision the AWS resources defined in the `backend/template.yaml` template. Once the deployment has completed successfully, you will see a set of output values similar to the following:

```
CloudFormation outputs from deployed stack
-------------------------------------------------------------------------------
Outputs
-------------------------------------------------------------------------------
Key           CognitoUserPool
Description   -
Value         us-east-1_gxKtRocFs

Key           CognitoUserPoolClient
Description   -
Value         874ghcej99f8iuo0lgdpbrmi76k

Key           ApiGatewayBaseUrl
Description   -
Value         https://abcd1234.execute-api.us-east-1.amazonaws.com/dev/
-------------------------------------------------------------------------------
```
If you selected to deploy the React frontend using Amplify Hosting, navigate to the Amplify console to check the build status. If the build does not start automatically, trigger it through the Amplify console.
If you selected to run the React frontend locally and connect to the deployed resources in AWS, you will use the CloudFormation stack outputs in the following section.
Create a file named `.env.development` in the `frontend` directory. Vite will use this file to set up environment variables when we run the application locally. Copy the following file content and replace the values with the outputs provided by AWS SAM:

```
VITE_REGION=us-east-1
VITE_API_ENDPOINT=https://abcd1234.execute-api.us-east-1.amazonaws.com/dev/
VITE_USER_POOL_ID=us-east-1_gxKtRocFs
VITE_USER_POOL_CLIENT_ID=874ghcej99f8iuo0lgdpbrmi76k
```
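If you prefer to script this step, a small helper along these lines could render the file from the stack outputs. This is a hypothetical sketch, not part of the repository; the output keys match those shown above, while the Region is not a stack output and is passed in explicitly:

```python
def render_env(region: str, outputs: dict) -> str:
    """Render frontend/.env.development content from CloudFormation stack outputs."""
    mapping = {
        "VITE_REGION": region,
        "VITE_API_ENDPOINT": outputs["ApiGatewayBaseUrl"],
        "VITE_USER_POOL_ID": outputs["CognitoUserPool"],
        "VITE_USER_POOL_CLIENT_ID": outputs["CognitoUserPoolClient"],
    }
    return "\n".join(f"{key}={value}" for key, value in mapping.items()) + "\n"


# Example values copied from the sample outputs above
outputs = {
    "ApiGatewayBaseUrl": "https://abcd1234.execute-api.us-east-1.amazonaws.com/dev/",
    "CognitoUserPool": "us-east-1_gxKtRocFs",
    "CognitoUserPoolClient": "874ghcej99f8iuo0lgdpbrmi76k",
}
env_text = render_env("us-east-1", outputs)
```

Write `env_text` to `frontend/.env.development` to complete the step.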
Next, install the frontend's dependencies by running the following command in the `frontend` directory:

```bash
npm ci
```
Finally, to start the application locally, run the following command in the `frontend` directory:

```bash
npm run dev
```

Vite will now serve the application at `http://localhost:5173`.
The application uses Amazon Cognito to authenticate users through a login screen. In this step, you will create a user to access the application.
In the Amazon Cognito console, create a new user in the user pool that was deployed with the stack (you can find its ID in the `CognitoUserPool` output of AWS SAM).
Navigate back to your Amplify website URL or local host address to log in with the new user's credentials.
Run the following command in the `backend` directory of the project to delete all associated resources:

```bash
sam delete
```
If you are experiencing issues when running the `sam build` command, try setting the `--use-container` flag (requires Docker):

```bash
sam build --use-container
```
If you are still experiencing issues despite using `--use-container`, try switching the AWS Lambda functions from `arm64` to `x86_64` in `backend/template.yaml` (as well as switching to the `x86_64` version of Powertools):
```yaml
Globals:
  Function:
    Runtime: python3.11
    Handler: main.lambda_handler
    Architectures:
      - x86_64
    Tracing: Active
    Environment:
      Variables:
        LOG_LEVEL: INFO
    Layers:
      - !Sub arn:aws:lambda:${AWS::Region}:017000801446:layer:AWSLambdaPowertoolsPythonV2:51
```
This application was written for demonstration and educational purposes, not for production use. The Security Pillar of the AWS Well-Architected Framework can support you in adapting this sample for a production deployment, in addition to your own established processes. Take note of the following:
The application uses encryption in transit and at rest with AWS-managed keys where applicable. Optionally, use AWS KMS with DynamoDB, SQS, and S3 for more control over encryption keys.
This application uses Powertools for AWS Lambda (Python) to log inputs and outputs to CloudWatch Logs. By default, this can include sensitive data contained in user input. Adjust the log level and remove log statements to fit your security requirements.
API Gateway access logging and usage plans are not activated in this code sample. Similarly, S3 access logging is currently not enabled.
In order to simplify the setup of the demo, this solution uses AWS managed policies associated with IAM roles that contain resource wildcards. Consider scoping these policies down further to fit your needs. Note that the AWS managed `AWSLambdaSQSQueueExecutionRole` policy contains a resource wildcard; this is known behavior, see this GitHub issue for details.
If your security controls require inspecting network traffic, consider adjusting the AWS SAM template to attach the Lambda functions to a VPC via their `VpcConfig` property.
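A minimal sketch of such a change in the SAM template, using placeholder subnet and security group IDs (these are assumptions; substitute the IDs of your own private subnets and security group):

```yaml
Globals:
  Function:
    VpcConfig:
      SecurityGroupIds:
        - sg-0123456789abcdef0      # placeholder security group
      SubnetIds:
        - subnet-0123456789abcdef0  # placeholder private subnets
        - subnet-0fedcba9876543210
```

Note that Lambda functions attached to a VPC need network routes (for example, VPC endpoints or a NAT gateway) to reach Amazon Bedrock, DynamoDB, and S3.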
See CONTRIBUTING for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.