Closed jayanthma closed 1 week ago
Hey, @jayanthma! I'm here to help you with any bugs, questions, or contributions while you wait for a human maintainer. Let's tackle this together!
The error you're encountering is due to the `prompt` parameter being expected as a string, but the code is attempting to access it as if it were a list of dictionaries. This mismatch causes the `TypeError`.
In the `get_request_body` function within `llama_index.llms.bedrock.utils`, the `prompt` parameter should be a string. For the `anthropic.claude` model, the prompt should be formatted as follows:

```
"\n\nHuman: {user_message}\n\nAssistant: "
```
Here's how you can modify your code to ensure the prompt is correctly formatted:
```python
from llama_index.llms.bedrock import Bedrock
from llama_index.core.base.llms.types import ChatMessage

llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0", region_name="ap-south-1")

messages = [
    ChatMessage(role="system", content="You are a pirate with a colorful personality"),
    ChatMessage(role="user", content="Tell me a story"),
]

response = llm.chat(messages=messages)
```
Ensure that the `ChatMessage` objects are correctly passed to the `llm.chat` method. The `prompt` will be internally formatted by the `get_request_body` function to match the expected structure for the `anthropic.claude` model [1].
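As a rough illustration of that internal formatting step (a standalone sketch, not the actual `llama_index` source), chat messages can be flattened into the `Human:`/`Assistant:` prompt string like this, assuming messages are simple role/content pairs:

```python
# Hedged sketch of how chat messages might be flattened into the
# "\n\nHuman: ...\n\nAssistant: " string that anthropic.claude's
# text-completion API expects. Illustrative only.
def messages_to_claude_prompt(messages):
    system = ""
    turns = []
    for msg in messages:
        role, content = msg["role"], msg["content"]
        if role == "system":
            # System text is prepended before the conversation turns.
            system = content + "\n"
        elif role == "user":
            turns.append(f"\n\nHuman: {content}")
        elif role == "assistant":
            turns.append(f"\n\nAssistant: {content}")
    # The prompt must end with an open Assistant turn for the model to complete.
    return system + "".join(turns) + "\n\nAssistant: "

prompt = messages_to_claude_prompt([
    {"role": "system", "content": "You are a pirate with a colorful personality"},
    {"role": "user", "content": "Tell me a story"},
])
print(prompt)
```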
To continue talking to Dosu, mention @dosu.
I didn't use the kwargs here
@jayanthma what happens if you do it without the kwarg, i.e. `llm.chat(messages)`?
Seems like there's a bug in the LLM class, tbh. The `get_request_body` function expects a string, but the Anthropic version of it is, for some reason, written to expect a list of chat-message dicts.
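The mismatch is easy to reproduce in isolation. Below is a minimal standalone sketch mirroring the failing `prompt[0]["role"]` check from the traceback (not the real implementation): when a plain string is passed, `prompt[0]` is a single character, and indexing a `str` with `"role"` raises a `TypeError`.

```python
# Standalone sketch of the failing check from bedrock/utils.py.
def get_request_body(prompt):
    # Assumes a list of chat-message dicts, but callers pass a string:
    if len(prompt) > 0 and prompt[0]["role"] == "system":
        return {"system": prompt[0]["content"]}
    return {"prompt": prompt}

try:
    get_request_body("\n\nHuman: Tell me a story\n\nAssistant: ")
except TypeError as err:
    print(f"TypeError: {err}")
```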
As a workaround, I would suggest using the `bedrock-converse` LLM class instead, since it also supports function calling.
Not sure when I can get to this, but if you are up for it, I recommend a PR. I think we need some way to decide whether `get_request_body` should be called on a string or on a list of chat messages? Or something?
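One possible shape for such a fix is to dispatch on the prompt's type. This is only a sketch under that assumption; the helper names `format_string_prompt` and `format_chat_prompt` are hypothetical, not the actual PR:

```python
# Hedged sketch: dispatch on the prompt type so both the legacy string
# path and the chat-message-dict path work. Helper names are hypothetical.
def get_request_body(prompt):
    if isinstance(prompt, str):
        return format_string_prompt(prompt)
    # Otherwise assume a list of chat-message dicts.
    return format_chat_prompt(prompt)

def format_string_prompt(prompt):
    # Legacy completion-style request body.
    return {"prompt": prompt}

def format_chat_prompt(messages):
    # Chat-style body: system messages are pulled out into a separate field.
    body = {"messages": [m for m in messages if m["role"] != "system"]}
    system = [m["content"] for m in messages if m["role"] == "system"]
    if system:
        body["system"] = "\n".join(system)
    return body
```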
Can confirm this is an issue and I am running into it as well. I looked into switching to the `bedrock-converse` LLM class, but it is missing the `_client` attribute. Downgrading to `llama-index==0.10.30` resolved the issue.
Here is the fixed code: https://github.com/run-llama/llama_index/pull/15729/files
Bug Description
```python
llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0")
messages = [
    ChatMessage(role="system", content="You are a pirate with a colorful personality"),
    ChatMessage(role="user", content="Tell me a story"),
]
llm.chat(messages=[
    ChatMessage(role="system", content="You are a pirate with a colorful personality"),
    ChatMessage(role="user", content="Tell me a story"),
])
```
The above code is throwing the below error:

```
File "C:\Users\Jay\OneDrive\coding_projects\RAG_AWS_Neptune\venv\Lib\site-packages\llama_index\llms\bedrock\utils.py", line 157, in get_request_body
    if len(prompt) > 0 and prompt[0]["role"] == "system":
```