langchain-ai / langchain-databricks

MIT License

Not getting token usage metadata with ChatDatabricks streaming #9

Closed — tkanhe closed this issue 1 week ago

tkanhe commented 1 month ago

code:

from langchain_core.prompts import PromptTemplate
from langchain_databricks import ChatDatabricks

llm = ChatDatabricks(
    endpoint="bedrock-anthropic-endpoint",
    streaming=True,
    stream_usage=True,
)

prompt = PromptTemplate(input_variables=["adjective"], template="Tell me a {adjective} joke")

chain = prompt | llm

async for chunk in chain.astream({"adjective": "messi"}):
    print("chunk: ", chunk)

output:

chunk:  content='Here' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content="'s a M" id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content='essi joke for you' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content=':\n\nWhy doesn' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content="'t Lion" id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content='el Messi' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
...
chunk:  content=' short' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content=' updates' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content='.)' id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
chunk:  content='' response_metadata={'finish_reason': 'stop'} id='run-e8b2bd2b-5575-409c-8083-8c10e98a4e1d'
B-Step62 commented 1 week ago

@tkanhe langchain-databricks 0.1.1 has been released, and token usage is now supported for streaming as well.