dapicard / opencv-parallelize-example

An example project showing how to pipeline and parallelize
GNU General Public License v3.0

Information about the Queue used to broadcast signals to underlying threads #3

Closed mohand150 closed 5 years ago

mohand150 commented 5 years ago

Hello sir, I saw the work you did on parallelizing the processing of a video stream. I am currently working on a small project on stereo vision, and I have some questions if you have a little time for me. I want to perform several processing steps on each frame of the 2 video streams captured by two cameras, as follows:

[processing flow diagram]

This processing flow causes me a lot of latency for real-time processing, and based on your work, I wonder if you can give me some details. 1. You left this comment about the loop used for multiple streams:

    // for (every source) {
    //     Queue used to broadcast signals to underlying threads
    // }

How can I use your loop for 2 streams? Can you give me an example? 2. Since you used one thread per processing step, can I follow your approach and add other processing steps (other threads) while staying real-time in my use case? Thanks for your help.

dapicard commented 5 years ago

I actually do exactly what you described (computing 2 video streams in the same process).

You just have to create as many "Stream" instances as you have sources. As an example:

    threads.push_back(std::thread([&](std::string video_url, std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals) {
        std::cout << "Input : " << video_url << std::endl << std::flush;
        Stream stream = Stream(video_url, signals);
        stream.current_frame = std::make_shared<Frame>();
        stream.start();
    }, video_url1, signals));

    threads.push_back(std::thread([&](std::string video_url, std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals) {
        std::cout << "Input : " << video_url << std::endl << std::flush;
        Stream stream = Stream(video_url, signals);
        stream.current_frame = std::make_shared<Frame>();
        stream.start();
    }, video_url2, signals));

And you can, of course, create as many threads as you want to split up your processing steps.

mohand150 commented 5 years ago

thank you Mr @dapicard

mohand150 commented 5 years ago

@dapicard, sorry for my English. The problem is that I want to reduce latency (which means separating the processing onto multiple threads), so I handle the 2 video streams separately:

    std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals = std::make_shared<moodycamel::ReaderWriterQueue<int>>(1);
    signal_queues.push_back(signals);
    threads.push_back(std::thread([&](std::string video_url_left, std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals) {

        std::cout << "Input : " << video_url_left << std::endl << std::flush;
        Stream stream = Stream(video_url_left, signals);
        stream.current_frame = std::make_shared<Frame>();
        stream.start();
    }, video_url_left, signals));

    std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals_2 = std::make_shared<moodycamel::ReaderWriterQueue<int>>(2);
    signal_queues.push_back(signals_2);
    threads.push_back(std::thread([&](std::string video_url_right, std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals_2) {

        std::cout << "Input : " << video_url_right << std::endl << std::flush;
        Stream stream = Stream(video_url_right, signals_2);
        stream.current_frame = std::make_shared<Frame>();
        stream.start();
    }, video_url_right, signals_2));

Is what I am doing here correct? I do not understand how the "signals" you create work.

mohand150 commented 5 years ago

@dapicard When I run the program with the code I added in https://github.com/dapicard/opencv-parallelize-example/issues/3#issuecomment-476192585, the console shows that it has opened the two streams, but the screen only shows the video stream of a single camera (not two streams). Then, after 2 seconds, the display window crashes, but it continues to print the number of frames/s. I think I made a mistake in what I just added?

I changed it as you showed here:

I actually do exactly what you described (computing 2 video streams in the same process).

You just have to create as many "Stream" instances as you have sources. As an example:

    threads.push_back(std::thread([&](std::string video_url, std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals) {
        std::cout << "Input : " << video_url << std::endl << std::flush;
        Stream stream = Stream(video_url, signals);
        stream.current_frame = std::make_shared<Frame>();
        stream.start();
    }, video_url1, signals));

    threads.push_back(std::thread([&](std::string video_url, std::shared_ptr<moodycamel::ReaderWriterQueue<int>> signals) {
        std::cout << "Input : " << video_url << std::endl << std::flush;
        Stream stream = Stream(video_url, signals);
        stream.current_frame = std::make_shared<Frame>();
        stream.start();
    }, video_url2, signals));

And you can, of course, create as many threads as you want to split up your processing steps.

It gives me the same result (I already uncommented the `imshow("Mask", frame->cpu_mask);`). How can I display the same processing results in two different windows (2 masks in 2 windows, with 2 separate threads)?

dapicard commented 5 years ago

You have to use two differently named OpenCV windows (the window name is the first parameter of the `imshow` function).

And the signals I use are operating-system signals (interrupts): https://www.tutorialspoint.com/cplusplus/cpp_signal_handling.htm