langchain-ai / langserve

LangServe 🦜️🏓

Weird behavior with Langserve and streaming #632

Closed zinebbearing closed 4 months ago

zinebbearing commented 4 months ago

This is the endpoint I create with LangServe:

from fastapi import FastAPI
from langserve import add_routes

from main import app as llm_app  # the agent chain from the tutorial linked below

# Input and Output are my pydantic schemas (definitions omitted here)
app = FastAPI()

add_routes(
    app,
    llm_app.with_types(input_type=Input, output_type=Output),
    path="/api",
)

llm_app is the exact same as https://python.langchain.com/docs/modules/agents/how_to/custom_agent/

When I stream events using llm_app directly, everything works fine (using the same example input as the tutorial). However, when I make the exact same call over /stream_events, I get this error: Error code: 400 - {'error': {'message': "'messages' must contain the word 'json' in some form, to use 'response_format' of type 'json_object'.", 'type': 'invalid_request_error', 'param': 'messages', 'code': None}}
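For context, that 400 is the OpenAI API's JSON-mode constraint: when a request sets response_format to {"type": "json_object"}, the word "json" must appear somewhere in the messages, or the API rejects the request. A minimal sketch of the check behind this error (validate_json_mode is a hypothetical helper for illustration, not part of any library):

```python
def validate_json_mode(payload: dict) -> bool:
    """Mimic the server-side check that produces the 400 above:
    JSON mode requires the word 'json' somewhere in the messages."""
    if payload.get("response_format", {}).get("type") != "json_object":
        return True  # constraint only applies in JSON mode
    return any(
        "json" in message.get("content", "").lower()
        for message in payload.get("messages", [])
    )

# Rejected: JSON mode requested, but no message mentions "json".
bad = {
    "response_format": {"type": "json_object"},
    "messages": [{"role": "user", "content": "List three fruits."}],
}

# Accepted: the prompt mentions JSON explicitly.
good = {
    "response_format": {"type": "json_object"},
    "messages": [{"role": "user", "content": "List three fruits as JSON."}],
}
```

In practice this usually means a prompt that mentions JSON when run directly was swapped or truncated somewhere in the request path to the server.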

Why?

On a side note, streaming the final output token by token is the most important thing for me. Why is it only possible with event streaming, which is very inconsistent and doesn't work 9 times out of 10?

eyurtsev commented 4 months ago

Hi @zinebbearing, there's not enough context here to provide help. A 4xx indicates an error on the client side in formulating the request.

Could you add more information to show what the client code is?
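Something along these lines would be enough (the URL, port, and input payload here are assumptions; the input value is the example from the custom-agent tutorial):

```python
import json

# Hypothetical base URL for the server defined above; adjust host/port.
base_url = "http://localhost:8000/api"
endpoint = f"{base_url}/stream_events"

# Example request body; the nested "input" shape depends on the Input schema.
body = json.dumps({"input": {"input": "how many letters in the word educa?"}})
```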

zinebbearing commented 4 months ago

Hi @eyurtsev, sorry, I actually made a silly mistake in the code and forgot to close the issue.