
Labs for the "Build an agentic LLM assistant on AWS" workshop: a step-by-step guide to developing an agentic LLM assistant on a serverless three-tier architecture.
https://catalog.us-east-1.prod.workshops.aws/workshops/c429f039-04d5-47e7-a039-1ded123c412f

Build an Agentic LLM assistant on AWS

This hands-on workshop, aimed at developers and solution builders, trains you to build a real-life serverless LLM application using foundation models (FMs) through Amazon Bedrock and advanced design patterns such as Reason and Act (ReAct) agents, text-to-SQL, and Retrieval Augmented Generation (RAG). It complements the Amazon Bedrock Workshop by helping you transition from practicing standalone design patterns in notebooks to building an end-to-end serverless LLM application.
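To make the Bedrock integration concrete, here is a minimal sketch of calling a Claude model through the boto3 Converse API; the model ID and region are placeholders, not the workshop's actual configuration:

```python
# Minimal sketch: calling a Claude model on Amazon Bedrock via the boto3 Converse API.
# The model ID and region below are assumptions; use whatever you enabled in your account.
import boto3

bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-west-2")

response = bedrock_runtime.converse(
    modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model ID
    messages=[
        {"role": "user", "content": [{"text": "Explain what a ReAct agent does in one sentence."}]}
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.0},
)

print(response["output"]["message"]["content"][0]["text"])
```

In the labs, calls like this are embedded in Lambda-based agent logic rather than run as a standalone script.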

Within the labs of this workshop, you'll explore some of the most common and advanced LLM application design patterns used by customers to improve business operations with generative AI. Together, these labs guide you step by step through building a complex agentic LLM assistant capable of answering retrieval and analytical questions over your internal knowledge bases.

Throughout these labs, you will be using and extending the AWS CDK stack of the Serverless LLM Assistant, available in the serverless_llm_assistant folder.
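For orientation only, extending a CDK stack generally follows the pattern sketched below; the construct IDs, runtime, and asset path are assumptions, and the actual serverless_llm_assistant stack defines its own resources:

```python
# Hypothetical sketch of adding a Lambda function to a CDK stack.
# Construct IDs, runtime, and asset path are assumptions, not the workshop's code.
from aws_cdk import Duration, Stack
from aws_cdk import aws_lambda as _lambda
from constructs import Construct


class AssistantExtensionStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Example Lambda that could host an additional assistant tool.
        _lambda.Function(
            self,
            "ExampleToolFunction",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda/example_tool"),  # assumed path
            timeout=Duration.seconds(30),
        )
```

After editing the stack, running `cdk deploy` from the CDK app's directory applies the changes.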

Prerequisites

  1. Create an AWS Cloud9 environment to use as an IDE.
  2. Configure model access in the Amazon Bedrock console, specifically for the Amazon Titan and Anthropic Claude models in us-west-2 (Oregon).
  3. Set up an Amazon SageMaker Studio environment, using the Quick setup for single users, to run the data-pipelines notebooks.

Once ready, clone this repository into the new Cloud9 environment and follow the lab instructions.
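Before starting the labs, you can confirm that the required models are reachable from your environment with a quick check like the sketch below (assuming default credentials in Cloud9 and the us-west-2 region used by the workshop):

```python
# Minimal sketch: list the Anthropic foundation models visible to your account
# in us-west-2 to confirm that Bedrock model access was granted.
import boto3

bedrock = boto3.client("bedrock", region_name="us-west-2")

models = bedrock.list_foundation_models(byProvider="Anthropic")
for summary in models["modelSummaries"]:
    print(summary["modelId"])
```

If no Claude models appear, revisit the model access page in the Amazon Bedrock console before continuing.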

Architecture

The following diagram illustrates the target architecture of this workshop:

Agentic Assistant workshop Architecture

Next step

You can build on the knowledge acquired in this workshop by solving a more complex problem that requires studying the limitations of the popular design patterns used in LLM application development and designing a solution to overcome them. For this, we propose that you read the blog post Boosting RAG-based intelligent document assistants using entity extraction, SQL querying, and agents with Amazon Bedrock and explore its associated GitHub repository aws-agentic-document-assistant.

Security

See CONTRIBUTING for more information.

License

This library is licensed under the MIT-0 License. See the LICENSE file.