I'm trying to replicate the Streamlit GPT-like app using Bedrock with LangChain, but the stream operation doesn't work. Judging by the error, the API is returning generation_token_count with a different type than what the local side expects.
Could this be a library error?
Traceback (most recent call last):
  File "/Users/<REDACTED>/test/genai/.venv/lib/python3.12/site-packages/streamlit/runtime/scriptrunner/script_runner.py", line 600, in _run_script
    exec(code, module.__dict__)
  File "/Users/<REDACTED>/test/genai/chat.py", line 40, in <module>
    for chunk in chat.stream(messages):
  File "/Users/<REDACTED>/test/genai/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 265, in stream
    raise e
  File "/Users/<REDACTED>/test/genai/.venv/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 256, in stream
    generation += chunk
  File "/Users/<REDACTED>/test/genai/.venv/lib/python3.12/site-packages/langchain_core/outputs/chat_generation.py", line 79, in __add__
    message=self.message + other.message,
            ~~~~~~~~~~~~~^~~~~~~~~~~~~~~
  File "/Users/<REDACTED>/test/genai/.venv/lib/python3.12/site-packages/langchain_core/messages/ai.py", line 224, in __add__
    response_metadata = merge_dicts(
                        ^^^^^^^^^^^^
  File "/Users/<REDACTED>/test/genai/.venv/lib/python3.12/site-packages/langchain_core/utils/_merge.py", line 40, in merge_dicts
    raise TypeError(
TypeError: Additional kwargs key generation_token_count already exists in left dict and value has unsupported type <class 'int'>.
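For context, the failure happens while langchain_core accumulates streamed chunks: each Llama chunk's metadata repeats generation_token_count as an int, and merge_dicts has no rule for merging two int values under the same key. A minimal sketch of that merge logic (reconstructed from the traceback, not the library's exact source) reproduces the error:

```python
# Simplified sketch of langchain_core.utils._merge.merge_dicts, reconstructed
# from the traceback above (assumption: the real function handles more cases).
def merge_dicts(left: dict, right: dict) -> dict:
    merged = dict(left)
    for key, value in right.items():
        if key not in merged or merged[key] is None:
            merged[key] = value
        elif isinstance(merged[key], str):
            merged[key] += value  # strings are concatenated
        elif isinstance(merged[key], dict):
            merged[key] = merge_dicts(merged[key], value)  # dicts recurse
        else:
            # ints (like generation_token_count) have no merge rule
            raise TypeError(
                f"Additional kwargs key {key} already exists in left dict "
                f"and value has unsupported type {type(merged[key])}."
            )
    return merged


# Every streamed Llama chunk repeats the int-valued token count, so the
# second chunk triggers the same TypeError seen in the traceback.
try:
    merge_dicts({"generation_token_count": 5}, {"generation_token_count": 9})
except TypeError as exc:
    print(exc)
```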
Here is how I'm calling the model (I removed all Streamlit interaction):
from langchain_aws import ChatBedrock
from langchain_core.messages import HumanMessage

prompt = "Tell me a joke"

chat = ChatBedrock(
    region_name="us-east-1",
    model_id="meta.llama3-8b-instruct-v1:0",
    model_kwargs={"temperature": 0.8},
    streaming=True,
)

messages = [HumanMessage(content=prompt)]
for chunk in chat.stream(messages):
    print(chunk.content, end="", flush=True)
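In the meantime, one way to keep streaming working is to call Bedrock directly through boto3, which bypasses LangChain's chunk accumulation entirely. This is only a sketch under my assumptions about the Bedrock Llama request/response schema (prompt, temperature, and max_gen_len in the request; generation in each streamed chunk); adjust to your setup:

```python
import json


def build_llama_body(prompt: str, temperature: float = 0.8,
                     max_gen_len: int = 512) -> str:
    # Request fields for Meta Llama models on Bedrock (assumed schema).
    return json.dumps({
        "prompt": prompt,
        "temperature": temperature,
        "max_gen_len": max_gen_len,
    })


def stream_llama(prompt: str, region: str = "us-east-1",
                 model_id: str = "meta.llama3-8b-instruct-v1:0"):
    import boto3  # imported lazily so the sketch parses without boto3 installed
    client = boto3.client("bedrock-runtime", region_name=region)
    response = client.invoke_model_with_response_stream(
        modelId=model_id,
        body=build_llama_body(prompt),
    )
    for event in response["body"]:
        payload = json.loads(event["chunk"]["bytes"])
        # Each chunk carries the generated text in "generation", alongside
        # generation_token_count -- which this sketch simply ignores.
        yield payload.get("generation", "")


# Usage (requires AWS credentials with Bedrock model access):
# for piece in stream_llama("Tell me a joke"):
#     print(piece, end="", flush=True)
```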
Libraries
I'm using the following libraries: