miguelgrinberg / APIFairy

A minimalistic API framework built on top of Flask, Marshmallow and friends.
MIT License

How to set response schema on route with stream_with_context? #89

Closed cris-m closed 2 months ago

cris-m commented 2 months ago

I am trying to use a simple LangChain chatbot in a Flask API. I am unable to get the response when @response(chatbot_message_schema, 200) is set: the route returns immediately, without waiting for the chatbot's response. I only get the response when I comment out @response(chatbot_message_schema, 200), and then the response is just plain text.

from flask import stream_with_context
from apifairy import authenticate, body, response

# chatbots, token_auth, chatbot_message_schema and chat_assistance are defined elsewhere in the app.
@chatbots.route("/chat", methods=["POST"])
@authenticate(token_auth)
@body(chatbot_message_schema)
@response(chatbot_message_schema, 200)
def chat(data):
    """Handle chatbot interactions."""
    user = token_auth.current_user()
    config = {
        "user_info": {
            "first_name": user.first_name,
            "last_name": user.last_name,
            "email": user.email
        },
        "thread_id": user.id
    }

    message = data.get('message')

    @stream_with_context
    def chatbot_response():
        for response in chat_assistance.ask_question(config, message):
            if isinstance(response, str):
                yield response

    return chatbot_response()
miguelgrinberg commented 2 months ago

APIFairy helps generate documentation for JSON-based APIs. Anything that is not JSON needs to be added to the OpenAPI schema manually, since APIFairy does not understand it.
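
For example, a route that streams plain text could skip the @response decorator and return a Flask Response with an explicit mimetype; the plain-text body would then have to be described by hand in the OpenAPI document. A minimal sketch based on the snippet above (chatbots, token_auth, chatbot_message_schema and chat_assistance are assumed to be defined elsewhere in the application):

from flask import Response, stream_with_context
from apifairy import authenticate, body

@chatbots.route("/chat", methods=["POST"])
@authenticate(token_auth)
@body(chatbot_message_schema)
def chat(data):
    """Handle chatbot interactions with a streamed plain-text reply."""
    message = data.get("message")
    config = {"thread_id": token_auth.current_user().id}

    @stream_with_context
    def chatbot_response():
        # Yield chatbot output chunks as they arrive.
        for chunk in chat_assistance.ask_question(config, message):
            if isinstance(chunk, str):
                yield chunk

    # Not JSON, so APIFairy will not document this response body automatically;
    # it needs to be added to the OpenAPI schema manually.
    return Response(chatbot_response(), mimetype="text/plain")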