Closed: tkanhe closed this issue 1 week ago
code:
from langchain_core.prompts import PromptTemplate
from langchain_databricks import ChatDatabricks

llm = ChatDatabricks(
    endpoint="bedrock-anthropic-endpoint",
    streaming=True,
    stream_usage=True,
)
prompt = PromptTemplate(input_variables=["adjective"], template="Tell me a {adjective} joke")
chain = prompt | llm

async for chunk in chain.astream({"adjective": "messi"}):
    print("chunk: ", chunk)
output:
chunk: content='Here' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content="'s a M" id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content='essi joke for you' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content=':\n\nWhy doesn' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content="'t Lion" id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content='el Messi' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
...
chunk: content=' short' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content=' updates' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content='.)' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk: content='' response_metadata={'finish_reason': 'stop'} id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
@tkanhe langchain-databricks 0.1.1 has been released, and token usage is now supported for streaming as well.
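Once usage is reported during streaming, each chunk can carry a token-usage dict alongside its content, and the per-chunk counts have to be summed to get the totals for the run. Below is a minimal, library-free sketch of that aggregation; the dict keys (`input_tokens`, `output_tokens`, `total_tokens`) mirror the shape LangChain uses for `usage_metadata`, but the helper and its data are illustrative assumptions, not the library's own API.

```python
def accumulate_usage(chunk_usages):
    """Sum per-chunk token-usage dicts into run totals.

    Chunks that report no usage (None) are skipped; missing keys count as 0.
    The key names mirror LangChain's usage_metadata shape (an assumption).
    """
    totals = {"input_tokens": 0, "output_tokens": 0, "total_tokens": 0}
    for usage in chunk_usages:
        if not usage:
            continue
        for key in totals:
            totals[key] += usage.get(key, 0)
    return totals


# Hypothetical stream: usage arrives on some chunks, not on others.
streamed = [
    {"input_tokens": 12, "output_tokens": 0, "total_tokens": 12},
    None,  # a content-only chunk with no usage attached
    {"input_tokens": 0, "output_tokens": 48, "total_tokens": 48},
]
print(accumulate_usage(streamed))
# {'input_tokens': 12, 'output_tokens': 48, 'total_tokens': 60}
```

In practice you would pull each chunk's `usage_metadata` attribute inside the `async for` loop and feed those dicts to a helper like this.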