pistacheio / pistache

A high-performance REST toolkit written in C++
https://pistacheio.github.io/pistache/
Apache License 2.0

Can't stream MJPEG with multipart/x-mixed-replace #1207

Open InfiniteLife opened 2 months ago

InfiniteLife commented 2 months ago

I created a server that is supposed to provide an MJPEG stream to the browser.

Here is the implementation:

void VideoStreamer::streamVideoFrames(const Pistache::Rest::Request& request, Pistache::Http::ResponseWriter response) 
{
    std::shared_ptr<Pistache::Http::ResponseWriter> shared_response = std::make_shared<Pistache::Http::ResponseWriter>(std::move(response));
    auto cameraId = request.param(":cameraId").as<std::string>();

    auto frameServe = [shared_response, cameraId, this](ssize_t bytes) mutable {
        auto frameServeImpl = [shared_response, cameraId, this](auto& frameServe_ref) mutable -> void
        {
            std::vector<unsigned char> frameBuffer;
            {
                std::lock_guard<std::mutex> lock(frameMutex);
                auto it = frameMap.find(cameraId);
                if (it != frameMap.end()) {
                    cv::imencode(".jpg", it->second.first, frameBuffer);
                }
                else {
                    // ...
                }
            }

            if (frameBuffer.size() > 0) {
                std::string frameBufferStr(frameBuffer.begin(), frameBuffer.end());
                shared_response->headers().add<Pistache::Http::Header::Connection>(Pistache::Http::ConnectionControl::KeepAlive);
                std::string responseStr = "--frame\r\nContent-Type: image/jpeg\r\n\r\n" + frameBufferStr + "\r\n";
                shared_response->send(Pistache::Http::Code::Ok, responseStr).then([shared_response, cameraId, this, frameServe_ref](ssize_t bytes) mutable 
                {
                    LOG("Pistache: callback for camera {}", cameraId);
                    frameServe_ref(frameServe_ref);
                },
                Pistache::Async::Throw);
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(500));
        };
        LOG("Pistache: Starting frameServeImpl");
        frameServeImpl(frameServeImpl);
    };

    shared_response->headers()
        .add<Pistache::Http::Header::Server>("pistache/0.1")
        .add<Pistache::Http::Header::Connection>(Pistache::Http::ConnectionControl::KeepAlive);
//        .add<Pistache::Http::Header::ContentType>("multipart/x-mixed-replace; boundary=frame");
    try 
    {
        auto sendResult = shared_response->send(Pistache::Http::Code::Ok, "", Pistache::Http::Mime::MediaType::fromString("multipart/x-mixed-replace; boundary=frame"));
        sendResult.then(
            frameServe,
            Pistache::Async::Throw
        );
    } catch (const std::exception& e) {
        // Handle exceptions if needed
        LOG("Pistache exception: {}", e.what());
    }
}

It's largely based on https://stackoverflow.com/questions/54869598/c-pistache-and-mjpeg-server and https://github.com/pistacheio/pistache/issues/466.

The problem is that it doesn't work: LOG("Pistache: Starting frameServeImpl"); is printed, but LOG("Pistache: callback for camera {}", cameraId); is not, even though execution reaches the "send" inside the lambda. curl for this call prints this:

HTTP/1.1 200 OK
Content-Type: multipart/x-mixed-replace; boundary=frame
Server: pistache/0.1
Connection: Close
Content-Length: 0

InfiniteLife commented 2 months ago

I also created a simple Flask streaming script, which works as expected (showing the MJPEG stream in the browser).

import io
import random
from flask import Flask, Response
from flask import make_response
from PIL import Image
import time

app = Flask(__name__)

@app.route('/stream')
def stream():
    def generate_noise_image():
        while True:
            # Generate a random noise image
            img = Image.frombytes('RGB', (640, 480), bytes(random.getrandbits(8) for _ in range(640 * 480 * 3)))

            # Convert the image to JPEG format
            img_io = io.BytesIO()
            img.save(img_io, 'JPEG')
            img_io.seek(0)
            # Yield the image data
            yield (b'--frame\r\n'
                   b'Content-Type: image/jpeg\r\n\r\n' + img_io.getvalue() + b'\r\n')

            #pause script execution for half a second
            time.sleep(0.5)

    return Response(generate_noise_image(), mimetype='multipart/x-mixed-replace; boundary=frame')

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=8013, debug=True)

I tried to analyze the difference in responses between Flask and Pistache.

Here is Flask with curl:

 curl -v 127.0.0.1:8013/stream
*   Trying 127.0.0.1:8013...
* Connected to 127.0.0.1 (127.0.0.1) port 8013 (#0)
> GET /stream HTTP/1.1
> Host: 127.0.0.1:8013
> User-Agent: curl/7.81.0
> Accept: */*
> 
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Server: Werkzeug/2.2.2 Python/3.10.12
< Date: Sat, 27 Apr 2024 04:44:57 GMT
< Content-Type: multipart/x-mixed-replace; boundary=frame
< Transfer-Encoding: chunked
< Connection: close
< 
Warning: Binary output can mess up your terminal. Use "--output -" to tell 
Warning: curl to output it to your terminal anyway, or consider "--output 
Warning: <FILE>" to save to a file.
* Failure writing output to destination
* Failed reading the chunked-encoded stream
* Closing connection 0

I wanted to see the body of the response, so I replaced the image with a shorter string:

curl --output - -v 127.0.0.1:8013/stream
*   Trying 127.0.0.1:8013...
* Connected to 127.0.0.1 (127.0.0.1) port 8013 (#0)
> GET /stream HTTP/1.1
> Host: 127.0.0.1:8013
> User-Agent: curl/7.81.0
> Accept: */*
> 
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Server: Werkzeug/2.2.2 Python/3.10.12
< Date: Fri, 26 Apr 2024 14:38:44 GMT
< Content-Type: multipart/x-mixed-replace; boundary=frame
< Transfer-Encoding: chunked
< Connection: close
< 
--frame
Content-Type: image/jpeg

happy string
--frame
Content-Type: image/jpeg

happy string
--frame
Content-Type: image/jpeg

happy string
--frame
Content-Type: image/jpeg

I noticed that Flask sends the response with "chunked" transfer encoding. So, after going through the Pistache docs, I wrote a new function (shown after the short sketch of chunked framing below):
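With chunked encoding the server never declares a total body length: each write goes out as its own hex-length-prefixed chunk, and a zero-length chunk terminates the body. A minimal, framework-independent sketch of that framing (the asChunk helper is just for illustration):

#include <cstdio>
#include <string>

// Wraps one payload (for example, one multipart part) in HTTP/1.1 chunked
// framing: "<hex size>\r\n<payload>\r\n". The body as a whole ends with "0\r\n\r\n".
std::string asChunk(const std::string& payload)
{
    char sizeLine[32];
    std::snprintf(sizeLine, sizeof(sizeLine), "%zx\r\n", payload.size());
    return std::string(sizeLine) + payload + "\r\n";
}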

void VideoStreamer::streamVideoFramesChunked(const Pistache::Rest::Request& request, Pistache::Http::ResponseWriter response)
{
    response.headers()
        .add<Pistache::Http::Header::Server>("pistache/0.1")
        .add<Pistache::Http::Header::Connection>(Pistache::Http::ConnectionControl::KeepAlive)
        .add<Pistache::Http::Header::ContentType>("multipart/x-mixed-replace; boundary=frame");

    auto stream = response.stream(Pistache::Http::Code::Ok);

    auto cameraId = request.param(":cameraId").as<std::string>();
    const int camSearchTimeout = 20000; // ms
    int time_passed = 0; // ms
    const int frameRetrievalInterval = 500; //ms
    uint64_t lastFrameTimestamp = UINT64_MAX;

    try {
        while (true) 
        {
            std::vector<unsigned char> frameBuffer;
            {
                std::lock_guard<std::mutex> lock(frameMutex);
                auto it = frameMap.find(cameraId);
                if (it != frameMap.end() && it->second.second != lastFrameTimestamp) {
                    cv::imencode(".jpg", it->second.first, frameBuffer);
                    frameBuffer.size() > 0 ? time_passed = 0 : time_passed += frameRetrievalInterval;
                    lastFrameTimestamp = it->second.second;
                }
                else {
                    time_passed += frameRetrievalInterval;
                    if ((time_passed % 5000) == 0) LOG("Pistache: Waiting for cam {}/{} ...", cameraId, time_passed / 1000);
                }
            }

            if (frameBuffer.size() > 0) {
                std::string frameBufferStr(frameBuffer.begin(), frameBuffer.end());
                LOG("Pistache: Sending frame for camera {}, buffer length: {}", cameraId, frameBufferStr.size());
                stream << "--frame\r\nContent-Type: image/jpeg\r\n\r\n";
                stream << frameBufferStr.c_str();
                stream << "\r\n";
                stream << Pistache::Http::flush;
            }

            if (time_passed >= camSearchTimeout) {
                LOG("Pistache: Camera {} not found, exiting by timeout", cameraId);
                break;
            }

            std::this_thread::sleep_for(std::chrono::milliseconds(frameRetrievalInterval));
        }
    }
    catch (const std::exception& e) {
        // Handle exceptions if needed
        LOG("Pistache exception: {}", e.what());
    }
}

Here is curl:

curl --output - -v 127.0.0.1:8013/stream
*   Trying 127.0.0.1:8013...
* Connected to 127.0.0.1 (127.0.0.1) port 8013 (#0)
> GET /stream/arlington_SCR2326W_media-inference-arlingtons-k8s_dev HTTP/1.1
> Host: 127.0.0.1:8013
> User-Agent: curl/7.81.0
> Accept: */*
> 
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Content-Type: multipart/x-mixed-replace; boundary=frame
< Server: pistache/0.1
< Connection: Close
< Transfer-Encoding: chunked
< 
--frame
Content-Type: image/jpeg

����
--frame
Content-Type: image/jpeg

����
--frame
Content-Type: image/jpeg

����
--frame
Content-Type: image/jpeg

Everything looks structurally the same as the Flask output. Except that with the --output - flag, curl against Flask with a real image prints out huge portions of unreadable binary data, which is to be expected. But with the same flag against Pistache, all the binary data curl prints is "����", although the string I'm sending is quite big: buffer length is 523798 on average. And it still doesn't work. I feel like I've broken both my legs here trying to make this work over the last three days... Any help is appreciated.
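A possible explanation for getting exactly four bytes: a JFIF-encoded JPEG usually starts with FF D8 FF E0 00 10 ..., so the fifth byte is already a null byte, and anything that measures a const char* with strlen stops right there. Here is a minimal standalone check of that effect (plain C++, no Pistache; whether the const char* stream operator in this Pistache version really takes a strlen-style path is an assumption to verify):

#include <cstring>
#include <iostream>
#include <string>

int main()
{
    // First bytes of a typical JFIF JPEG: FF D8 FF E0 00 10 ...
    const std::string jpegStart{'\xFF', '\xD8', '\xFF', '\xE0', '\x00', '\x10'};

    std::cout << "string size:     " << jpegStart.size() << '\n';                // prints 6
    std::cout << "strlen(c_str()): " << std::strlen(jpegStart.c_str()) << '\n';  // prints 4
}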

InfiniteLife commented 2 months ago

Well, I couldn't make it work. Like the previous person who had a similar issue, I decided to switch to Asio, which worked as expected. To me it feels like there is some issue in the library with multipart, or at least it would be nice to have a code example of multipart streaming. I liked the modern Pistache interface, but it didn't work out for me as expected.
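
For anyone landing here, a heavily simplified, blocking sketch of that kind of Asio approach (single client at a time; the port and boundary follow the examples above, and getNextJpegFrame is a stand-in for the real frameMap + cv::imencode pipeline):

#include <boost/asio.hpp>
#include <chrono>
#include <string>
#include <thread>
#include <vector>

using boost::asio::ip::tcp;

// Stand-in for the real frame source (frameMap lookup + cv::imencode).
std::vector<unsigned char> getNextJpegFrame()
{
    return {0xFF, 0xD8, 0xFF, 0xD9}; // placeholder: JPEG SOI/EOI markers only
}

int main()
{
    boost::asio::io_context io;
    tcp::acceptor acceptor(io, tcp::endpoint(tcp::v4(), 8013));

    for (;;) {
        tcp::socket socket(io);
        acceptor.accept(socket);

        // Send the top-level headers once; the body stays open for the stream.
        const std::string header =
            "HTTP/1.1 200 OK\r\n"
            "Content-Type: multipart/x-mixed-replace; boundary=frame\r\n"
            "Connection: close\r\n"
            "\r\n";
        boost::asio::write(socket, boost::asio::buffer(header));

        try {
            for (;;) {
                const auto jpeg = getNextJpegFrame();
                const std::string partHeader =
                    "--frame\r\n"
                    "Content-Type: image/jpeg\r\n"
                    "Content-Length: " + std::to_string(jpeg.size()) + "\r\n\r\n";
                boost::asio::write(socket, boost::asio::buffer(partHeader));
                // Length-aware write, so embedded null bytes in the JPEG survive.
                boost::asio::write(socket, boost::asio::buffer(jpeg));
                boost::asio::write(socket, boost::asio::buffer("\r\n", 2));
                std::this_thread::sleep_for(std::chrono::milliseconds(500));
            }
        } catch (const std::exception&) {
            // Client disconnected; fall through and accept the next connection.
        }
    }
}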