A modular and comprehensive solution to deploy a Multi-LLM and Multi-RAG powered chatbot (Amazon Bedrock, Anthropic, HuggingFace, OpenAI, Meta, AI21, Cohere, Mistral) using AWS CDK on AWS
Description of changes:
Since the Bedrock Converse API allows a team to "write code once and use it with different models," this change aims to use the same code for all Bedrock models.
The only differences are that some models do not support system prompts or streaming.
The main goal of this change is to add usage token tracking to all Bedrock models.
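The "same code for all models" idea can be sketched as a single request builder that only branches on a per-model capability flag. This is a minimal illustration, not the actual adapter code in `base.py`; `build_converse_request` and `supports_system` are hypothetical names.

```python
# Sketch: one request builder for every Bedrock model, assuming the only
# per-model difference is whether the `system` field is supported.
def build_converse_request(model_id, messages, system_prompt=None,
                           supports_system=True):
    """Build kwargs for bedrock-runtime's converse()/converse_stream().

    If the model does not accept the `system` field, the system prompt is
    folded into the first user message instead.
    """
    messages = [dict(m) for m in messages]  # shallow copy, keep caller's list intact
    request = {"modelId": model_id, "messages": messages}
    if system_prompt:
        if supports_system:
            request["system"] = [{"text": system_prompt}]
        else:
            first = messages[0]
            first["content"] = [{"text": system_prompt}] + list(first["content"])
    return request
```

The same kwargs can then be passed to either `converse` or `converse_stream`, so the streaming/non-streaming split stays outside the request-building code.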
Changes
Remove model-specific code for Bedrock
Add unit tests for the Bedrock adapters
Add a user-friendly message if a Bedrock model is not enabled
Notes
The main changes are in lib/model-interfaces/langchain/functions/request-handler/adapters/bedrock/base.py; GitHub hides this file by default.
Since all the models use the same prompts, the Llama custom prompts are no longer used and the adapter relies on the Converse API.
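The usage tracking this change adds can be sketched as reading the `usage` block that the Converse API returns with every response; the `inputTokens`/`outputTokens`/`totalTokens` keys follow the documented Converse output shape, while `extract_usage` itself is an illustrative helper, not the actual metric-recording code.

```python
# Sketch: pull token counts out of a converse() response. For streaming calls
# (converse_stream), the same usage fields arrive in the final metadata event.
def extract_usage(response):
    """Return (input, output, total) token counts from a Converse response."""
    usage = response.get("usage", {})
    return (
        usage.get("inputTokens", 0),
        usage.get("outputTokens", 0),
        usage.get("totalTokens", 0),
    )
```

Because every Bedrock model now goes through the same Converse call, this one extraction point covers all of them.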
Testing
Called the models one by one and checked the usage token counts.
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.
Issue #, if available: Follow up of https://github.com/aws-samples/aws-genai-llm-chatbot/pull/564