langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

Bedrock doesn't work with Claude 3 #18845

Closed minorun365 closed 3 months ago

minorun365 commented 4 months ago


Example Code

import streamlit as st
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_core.output_parsers import StrOutputParser
from langchain_community.llms.bedrock import Bedrock
from langchain_community.retrievers.bedrock import AmazonKnowledgeBasesRetriever

retriever = AmazonKnowledgeBasesRetriever(
  knowledge_base_id="XXXXXXXXXX", # Input KB ID here
  retrieval_config={
    "vectorSearchConfiguration": {
      "numberOfResults": 10,
      "overrideSearchType": "HYBRID"
    }})

prompt = ChatPromptTemplate.from_template("Answer questions based on the context below: {context} / Question: {question}")

model = Bedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0", model_kwargs={"max_tokens_to_sample": 1000})
chain = ({"context": retriever, "question": RunnablePassthrough()} | prompt | model | StrOutputParser())

st.title("Ask Bedrock")
question = st.text_input("Input your question")
button = st.button("Ask!")

if button:
  st.write(chain.invoke(question))

Error Message and Stack Trace (if applicable)

ValidationError: 1 validation error for Bedrock
__root__
  Claude v3 models are not supported by this LLM. Please use `from langchain_community.chat_models import BedrockChat` instead. (type=value_error)

Traceback:
  File "/home/ec2-user/.local/lib/python3.9/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 535, in _run_script
    exec(code, module.__dict__)
  File "/home/ec2-user/environment/rag.py", line 22, in <module>
    model = Bedrock(model_id="anthropic.claude-3-sonnet-20240229-v1:0", model_kwargs={"max_tokens_to_sample": 1000})
  File "/home/ec2-user/.local/lib/python3.9/site-packages/langchain_core/load/serializable.py", line 120, in __init__
    super().__init__(**kwargs)
  File "/home/ec2-user/.local/lib/python3.9/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error

Description

To use Claude 3 (Sonnet) on Amazon Bedrock, it seems langchain_community needs to be updated.
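Background on why the old `Bedrock` class rejects Claude 3: older Claude models on Bedrock use a text-completion request body (with `max_tokens_to_sample`), while Claude 3 requires the Messages API body (with `max_tokens` and a `messages` list). The sketch below illustrates the two shapes; the field names follow the public Bedrock documentation at the time, so treat it as an illustration rather than a spec.

```python
import json

# Text-completion body used by Claude 2.x (the old `Bedrock` LLM class)
claude2_body = {
    "prompt": "\n\nHuman: tell me a joke\n\nAssistant:",
    "max_tokens_to_sample": 1000,  # completion-API parameter name
}

# Messages-API body required by Claude 3 (what `BedrockChat` sends)
claude3_body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 1000,  # note: renamed from max_tokens_to_sample
    "messages": [{"role": "user", "content": "tell me a joke"}],
}

print(json.dumps(claude2_body, indent=2))
print(json.dumps(claude3_body, indent=2))
```

This is why passing `model_kwargs={"max_tokens_to_sample": 1000}` to a Claude 3 model cannot work even if the validation error were bypassed.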

System Info

langchain==0.1.11
langchain-community==0.0.27
langchain-core==0.1.30
langchain-text-splitters==0.0.1
macOS 14.2.1 (23C71)
Python 3.9.16

clouddev-code commented 4 months ago

Claude 3 cannot be used to generate responses after retrieving information from knowledge bases: https://docs.aws.amazon.com/bedrock/latest/userguide/knowledge-base-supported.html

Barneyjm commented 4 months ago

Changes to support it have been merged; awaiting a new release.

https://github.com/langchain-ai/langchain/pull/18630

stevensu1977 commented 3 months ago

I think you should use "BedrockChat"; it works.

langchain==0.1.11
langchain-anthropic==0.1.4
langchain-community==0.0.27
langchain-core==0.1.30
langchain-openai==0.0.5
langchain-text-splitters==0.0.1

import boto3
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_community.chat_models import BedrockChat

bedrock_runtime = boto3.client(
    service_name="bedrock-runtime",
    region_name="us-west-2",
)

model_id = "anthropic.claude-3-sonnet-20240229-v1:0"

model_kwargs =  { 
    "max_tokens": 2048,
    "temperature": 0.0,
    "top_k": 250,
    "top_p": 1,
    "stop_sequences": ["\n\nHuman"],
}

model = BedrockChat(
    client=bedrock_runtime,
    model_id=model_id,
    model_kwargs=model_kwargs,
)

# Invoke Example
messages = [
    ("system", "You are a helpful assistant."),
    ("human", "{question}"),
]

prompt = ChatPromptTemplate.from_messages(messages)

chain = prompt | model | StrOutputParser()

# Chain Invoke
response = chain.invoke({"question": "tell me a joke"})
print(response)
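For readers unfamiliar with the `{"context": retriever, "question": RunnablePassthrough()} | prompt | model | parser` pattern used in these examples: a dict of runnables fans the same input out to every branch, and `|` pipes each step's output into the next. The following is a toy, pure-Python model of that behavior (hypothetical names, not the actual langchain_core internals):

```python
class Runnable:
    """Minimal stand-in for a LangChain runnable: wraps a function."""

    def __init__(self, fn):
        self.fn = fn

    def invoke(self, value):
        return self.fn(value)

    def __or__(self, other):
        # `a | b` pipes a's output into b, like LCEL composition
        return Runnable(lambda value: other.invoke(self.invoke(value)))


def parallel(mapping):
    # Mimics {"context": retriever, "question": RunnablePassthrough()}:
    # every branch receives the same input value.
    return Runnable(lambda value: {k: r.invoke(value) for k, r in mapping.items()})


passthrough = Runnable(lambda value: value)               # like RunnablePassthrough
fake_retriever = Runnable(lambda q: ["doc about " + q])   # stands in for the KB retriever
fake_prompt = Runnable(lambda d: f"Context: {d['context']} / Question: {d['question']}")
fake_model = Runnable(lambda p: "ANSWER(" + p + ")")      # stands in for BedrockChat

chain = (
    parallel({"context": fake_retriever, "question": passthrough})
    | fake_prompt
    | fake_model
)
print(chain.invoke("what is Bedrock?"))
```

The real chain works the same way: the user's question flows both to the retriever (producing `context`) and straight through (as `question`), and the resulting dict fills the prompt template's placeholders.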

minorun365 commented 3 months ago

I succeeded in running Claude 3 on Bedrock 🎉

import streamlit as st
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import RunnablePassthrough
from langchain_core.output_parsers import StrOutputParser
from langchain_community.chat_models import BedrockChat
from langchain_community.retrievers.bedrock import AmazonKnowledgeBasesRetriever

retriever = AmazonKnowledgeBasesRetriever(
  knowledge_base_id="XXXXXXXXXX", # Input KB ID here
  retrieval_config={
    "vectorSearchConfiguration": {
      "numberOfResults": 10,
      "overrideSearchType": "HYBRID"
    }})

prompt = ChatPromptTemplate.from_template("Answer questions based on the context below: {context} / Question: {question}")

model = BedrockChat(model_id="anthropic.claude-3-sonnet-20240229-v1:0", model_kwargs={"max_tokens": 1000})

chain = ({"context": retriever, "question": RunnablePassthrough()} | prompt | model | StrOutputParser())

st.title("Ask Bedrock")
question = st.text_input("Input your question")
button = st.button("Ask!")

if button:
  st.write(chain.invoke(question))