Add a new agent, "langserve-invoke", that can call a LangServe service.
It supports both the "invoke" and "stream" endpoints.
For "stream" endpoints you can configure the "stream-to-topic" property, just like in the ai-chat-completion agent.
When streaming answers, chunking is performed the same way as in the ai-chat-completion agent, and the same properties are added to the messages. This way the sample JS applications still work out of the box.
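As a rough sketch of the two LangServe endpoints the agent targets: a LangServe runnable exposes `POST .../invoke` for a single response and `POST .../stream` for incremental chunks, both taking a JSON body with an `input` field. The base URL and helper below are illustrative assumptions, not part of this PR:

```python
import json

# Hypothetical LangServe base URL (assumption, not from the PR).
BASE_URL = "http://localhost:8000/chain"

def build_request(endpoint: str, user_input: str) -> tuple[str, str]:
    """Build the URL and JSON body for a LangServe call.

    LangServe runnables expose POST /invoke (one complete answer)
    and POST /stream (server-sent events with incremental chunks).
    """
    if endpoint not in ("invoke", "stream"):
        raise ValueError("endpoint must be 'invoke' or 'stream'")
    url = f"{BASE_URL}/{endpoint}"
    body = json.dumps({"input": user_input})
    return url, body

url, body = build_request("stream", "What is LangServe?")
```

The agent chooses which endpoint to hit based on its configuration; when the "stream" endpoint is used, each chunk is forwarded to the configured "stream-to-topic".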
A sample application is included with this PR.
Sample pipeline:
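A hypothetical pipeline using the new agent might look like the following; the topic names, `url`, and `output-field` values are illustrative assumptions, not taken from the PR:

```yaml
topics:
  - name: "input-topic"
    creation-mode: create-if-not-exists
  - name: "streaming-answers-topic"
    creation-mode: create-if-not-exists
pipeline:
  - name: "Invoke LangServe"
    type: "langserve-invoke"
    input: "input-topic"
    configuration:
      # Hypothetical LangServe endpoint; point at .../invoke for a
      # single response or .../stream for incremental chunks.
      url: "http://localhost:8000/chain/stream"
      # Field where the assembled answer is written (assumed name).
      output-field: "value.answer"
      # As in ai-chat-completion: forward chunks to this topic as
      # they arrive, so the sample JS applications work unchanged.
      stream-to-topic: "streaming-answers-topic"
```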