justadudewhohacks / opencv4nodejs

Nodejs bindings to OpenCV 3 and OpenCV 4

opencv to http mjpeg stream? #482

Open mark-hahn opened 5 years ago

mark-hahn commented 5 years ago

I have working opencv4nodejs code to detect camera focus blur using the Laplacian, median, etc. I also have mjpg-streamer working. Both use the video0 UVC device on a Raspberry Pi, but not at the same time.

I want to run both at the same time so I can view the video stream at a decent rate while opencv picks out a frame now and then to calculate focus blur. The blur values can go to stdout or any other output.

I have tried using a loopback device to generate video1 from video0 with gstreamer, with no success.

My current idea is to somehow send video frames from opencv to mjpg-streamer or any other http server. How can I do this?

I know about VideoWriter, but the only examples I can find write to a file or pipe to ffmpeg. I looked at ffmpeg/ffserver, but it looks slow and complicated. Is that the only solution? I guess I am just looking for options.
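For reference, the blur measure itself is just the classic variance of the Laplacian; a stripped-down sketch (not my exact code; method names are per the opencv4nodejs docs as I read them):

```js
const cv = require('opencv4nodejs');

// Variance-of-Laplacian focus measure: a sharp image has strong edges,
// so the Laplacian response has high variance; a blurry one does not.
const focusMeasure = (frame) => {
  const lap = frame.bgrToGray().laplacian(cv.CV_64F);
  const { stddev } = lap.meanStdDev();
  return stddev.at(0, 0) ** 2; // the "blurry" threshold is scene-dependent
};
```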

Thanks in advance.

goulash1971 commented 5 years ago

@mark-hahn I had a similar challenge. I ended up using the VideoWriter to generate an H264-encoded RTP/UDP stream (via gstreamer) and then used that for rendering. In my case I pass it through an RTSP muxer (there are multiple consumers in my use case) and then on to a packet forwarder that strips out the H264 frames and forwards them to the browser across a WebSocket, where I use the Broadway decoder to decode the frames and render them in a canvas.

This is the GStreamer pipeline that I used for the H264 RTP/UDP stream:

appsrc ! queue ! videoconvert ! video/x-raw ! x264enc ! h264parse ! rtph264pay ! udpsink host=127.0.0.1 port=5000 sync=false
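Roughly, the opencv4nodejs side looks like this (a sketch, not my exact code; it assumes OpenCV was built with GStreamer support, where a fourcc of 0 tells the backend to parse the string as a pipeline):

```js
const cv = require('opencv4nodejs');

const pipeline =
  'appsrc ! queue ! videoconvert ! video/x-raw ! x264enc ! h264parse ' +
  '! rtph264pay ! udpsink host=127.0.0.1 port=5000 sync=false';

const cap = new cv.VideoCapture(0); // /dev/video0
const fps = 15;
const first = cap.read();

// fourcc 0 plus a pipeline string selects the GStreamer backend (if compiled in)
const writer = new cv.VideoWriter(
  pipeline, 0, fps, new cv.Size(first.cols, first.rows)
);

setInterval(() => {
  const frame = cap.read();
  if (!frame.empty) writer.write(frame);
}, 1000 / fps);
```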

If you want MJPEG frames instead, you will need the jpegenc and multipartmux elements, which you can then send across TCP (using the tcpserversink element) to your HTTP server, which in turn handles the HTTP requests and supplies data to the browser clients. You will still need to decode the frames in the browser (Firefox used to be the only browser that could play MJPEG natively, but that might be different now).
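Untested, but the MJPEG variant of the pipeline would look something like this (the boundary value is just an example):

appsrc ! queue ! videoconvert ! jpegenc ! multipartmux boundary=frame ! tcpserversink host=127.0.0.1 port=5000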

mark-hahn commented 5 years ago

Thanks for the response. I don't want any MPEG encoding like H264. I just want the raw frames from opencv to be sent to the browser as MJPEG. My current setup uses very little CPU when running either opencv or mjpg-streamer.

goulash1971 commented 5 years ago

@mark-hahn in that case I would play around with the GStreamer pipeline, maybe just pumping the raw frames into a udpsink or tcpserversink. For example:

appsrc ! queue ! udpsink host=127.0.0.1 port=5000 sync=false

Not sure what the raw frames will "look" like at the far end ... I would assume :) that they will be the raw Mat blocks. Bear in mind that uncompressed frames are large (a 640x480 BGR frame is 640 × 480 × 3 ≈ 900 KB), so raw UDP may struggle at any real frame rate. I'd be interested to hear how that works out for you.
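Alternatively, if you want MJPEG over HTTP without GStreamer at all, you could JPEG-encode each Mat in Node and serve it with a multipart/x-mixed-replace response (the same wire format mjpg-streamer emits). A rough, untested sketch; the port, boundary and frame rate are arbitrary:

```js
const cv = require('opencv4nodejs');
const http = require('http');

const cap = new cv.VideoCapture(0); // /dev/video0
const clients = new Set();

// Each connected browser gets a never-ending multipart response;
// an <img src="http://host:8080/"> tag is enough to render it.
http.createServer((req, res) => {
  res.writeHead(200, {
    'Content-Type': 'multipart/x-mixed-replace; boundary=frame',
    'Cache-Control': 'no-cache',
  });
  clients.add(res);
  req.on('close', () => clients.delete(res));
}).listen(8080);

setInterval(() => {
  const frame = cap.read();
  if (frame.empty) return;
  const jpg = cv.imencode('.jpg', frame); // Buffer of JPEG bytes
  // the same Mat could be handed to the blur check here, every Nth frame
  for (const res of clients) {
    res.write(`--frame\r\nContent-Type: image/jpeg\r\nContent-Length: ${jpg.length}\r\n\r\n`);
    res.write(jpg);
    res.write('\r\n');
  }
}, 1000 / 15); // ~15 fps
```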

Seikon commented 5 years ago

Hello @mark-hahn @goulash1971, maybe you can find a solution for your projects in this library:

https://github.com/agsh/rtsp-ffmpeg

With this you can pipe an RTSP stream and get the raw frames on the server for serving (as base64 or a blob) to the client.
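Something like this, going by the library's README (untested; option names may have changed):

```js
const rtsp = require('rtsp-ffmpeg');

// ffmpeg pulls the RTSP stream and emits one JPEG Buffer per frame
const stream = new rtsp.FFMpeg({
  input: 'rtsp://127.0.0.1:8554/live.sdp', // example stream URI
  rate: 10,                                // frames per second
});

stream.on('data', (frame) => {
  // frame is a Buffer with a JPEG image: push it to the browser as
  // base64 / a blob, or decode it with cv.imdecode for analysis
  console.log('frame:', frame.length, 'bytes');
});
```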

Hope it helps you.