Closed jsumners-nr closed 7 months ago
LangChain supports streaming responses as described in https://js.langchain.com/docs/modules/model_io/llms/streaming_llm. In short, there are two ways to stream responses:
- the `.stream` method on model entities
- `streaming: true` when constructing a model entity and supplying a `handleLLMNewToken` callback handler
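The two call patterns look roughly like this. This is a minimal sketch: `FakeStreamingModel` is a stand-in for a real LangChain model entity (e.g. `ChatOpenAI`), invented here so the example runs without provider credentials; the `.stream()` async-iterator shape and the `handleLLMNewToken` callback shape follow the linked docs.

```javascript
// Stand-in for a LangChain model entity; a real model would hit the
// provider API instead of yielding canned tokens.
class FakeStreamingModel {
  constructor({ streaming = false, callbacks = [] } = {}) {
    this.streaming = streaming
    this.callbacks = callbacks
  }

  // Pattern 1: the .stream() method returns an async iterable of chunks.
  async *stream(_prompt) {
    for (const token of ['Hello', ' ', 'world']) {
      yield token
    }
  }

  // Pattern 2: streaming: true plus a handleLLMNewToken callback handler
  // supplied at construction time; tokens arrive via the callback.
  async invoke(prompt) {
    let out = ''
    for await (const token of this.stream(prompt)) {
      if (this.streaming) {
        for (const cb of this.callbacks) {
          cb.handleLLMNewToken?.(token)
        }
      }
      out += token
    }
    return out
  }
}

async function main() {
  // Pattern 1: consume the async iterator directly.
  let viaStream = ''
  for await (const chunk of new FakeStreamingModel().stream('hi')) {
    viaStream += chunk
  }

  // Pattern 2: tokens delivered through the callback handler.
  const seen = []
  const model = new FakeStreamingModel({
    streaming: true,
    callbacks: [{ handleLLMNewToken(token) { seen.push(token) } }]
  })
  const viaCallback = await model.invoke('hi')

  console.log(viaStream)    // Hello world
  console.log(viaCallback)  // Hello world
  console.log(seen.length)  // 3
}

main()
```

Both patterns ultimately pull tokens from the same underlying stream, which is why instrumenting the base methods should cover both.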
We will need to instrument both methods. We should be able to instrument base methods as discussed in the non-streaming ticket.
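One way to instrument the `.stream` path is to wrap the base method on the model prototype, so every subclass inherits the wrapped version. The sketch below is hedged: `startSpan` and the in-memory `spans` array are simplified stand-ins for the agent's real segment/span machinery, not its actual API. The key detail is that the span must stay open until the consumer finishes iterating, not just until `.stream()` returns the iterator.

```javascript
const spans = [] // stand-in for the agent's span store

// Simplified stand-in for the agent's span API.
function startSpan(name) {
  const span = { name, ended: false }
  spans.push(span)
  return { end() { span.ended = true } }
}

// Wrap a base streaming method so the span covers the whole async
// iteration; the finally block runs when iteration completes or aborts.
function wrapStream(proto) {
  const original = proto.stream
  proto.stream = async function* wrappedStream(...args) {
    const span = startSpan('Llm/agent/Langchain/invoke')
    try {
      yield* original.apply(this, args)
    } finally {
      span.end()
    }
  }
}

// Minimal base "model"; subclasses pick up the wrapped method via the
// prototype chain, mirroring the base-method approach from the
// non-streaming ticket.
class BaseModel {
  async *stream() {
    yield 'a'
    yield 'b'
  }
}
class ChildModel extends BaseModel {}

wrapStream(BaseModel.prototype)

async function demo() {
  const chunks = []
  for await (const c of new ChildModel().stream()) {
    chunks.push(c)
  }
  console.log(chunks.join(''), spans[0].name, spans[0].ended)
  // ab Llm/agent/Langchain/invoke true
}
demo()
```

The `streaming: true` / `handleLLMNewToken` path would be handled similarly by wrapping the base invocation method and observing the registered callback handlers.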
Spans should be named `Llm/agent/Langchain/invoke`.
https://new-relic.atlassian.net/browse/NR-218483