CESNET / UltraGrid

UltraGrid low-latency audio and video network transmission system
http://www.ultragrid.cz

Is it possible to directly stream from python? #218

Closed cansik closed 2 years ago

cansik commented 2 years ago

We are using Python to capture and pre-process a stream of images, which then has to be sent over the network. At the moment we use the following pipeline:

python -> syphon / spout -> ultragrid sender -> network -> ultragrid receiver -> syphon / spout -> application

It works, but it would be great to skip the Syphon/Spout step, at least on the sending side. How can we create a stream in Python that UltraGrid can consume directly?

We currently convert OpenCV mats into textures and render them in an OpenGL context so they can be shared via Syphon/Spout. It would of course make more sense to offer a stream to UltraGrid directly. I assume ffmpeg or GStreamer is needed, and I would prefer ffmpeg.

Maybe I missed it in the documentation, but I only found examples where you specify the source to be streamed, rather than having a third-party app (other than Syphon/Spout/NDI) provide the stream.

mpiatka commented 2 years ago

Hi,

what operating system are you using? On Linux this should already be possible using the file video capture and pipes.

You can use either a named pipe or standard input. It should then be possible to pass uncompressed data through the pipe in a suitable container, for example NUT. In these examples I'm using ffmpeg as the video source, but you can replace it with your application if it outputs a stream that ffmpeg can decode.

To use named pipe:

mkfifo <pipe name>
uv -t file:<pipe name>

#example input
ffmpeg -i Video.mp4 -c rawvideo -f nut - > <pipe name>

Or stdin could be used directly:

ffmpeg -i Video.mp4 -c rawvideo -f nut - | uv -t file:/dev/stdin
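
For reference, a minimal Python-side sketch of the same idea (not from this thread; the pipe name, resolution, and frame rate are made-up values): the script feeds raw OpenCV frames to an ffmpeg subprocess, which wraps them as rawvideo in a NUT container and writes them into the fifo that uv captures from.

import subprocess
import cv2

WIDTH, HEIGHT, FPS = 1280, 720, 30

# ffmpeg reads raw BGR frames on stdin and muxes them as rawvideo/NUT
# into the named pipe (created beforehand with: mkfifo uv_pipe).
ffmpeg = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "rawvideo",
        "-pix_fmt", "bgr24",
        "-s", f"{WIDTH}x{HEIGHT}",
        "-r", str(FPS),
        "-i", "-",              # raw frames arrive from this script
        "-c:v", "rawvideo",
        "-f", "nut",
        "-y", "uv_pipe",        # hypothetical fifo name
    ],
    stdin=subprocess.PIPE,
)

cap = cv2.VideoCapture(0)       # any frame source; a webcam as an example
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (WIDTH, HEIGHT))
        ffmpeg.stdin.write(frame.tobytes())
finally:
    ffmpeg.stdin.close()
    ffmpeg.wait()

uv would then be started as something like uv -t file:uv_pipe <receiver>; writes to the fifo block until uv opens it for reading.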
cansik commented 2 years ago

Thanks for your answer. I am currently working on macOS; the actual system will run on Windows. I have implemented a basic example which sends the frames into a pipe, and it seems to work (after replacing -c with -t in your example).

It is currently quite slow (only 17-20 FPS and about 500 ms of delay), but I will investigate further.

Update: Using .bmp as the encoding for cv2.imencode speeds it up to 25 FPS with about 50-100 ms of delay.
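
For context, a rough sketch of what that update could look like, assuming ffmpeg's image2pipe demuxer is used to read the concatenated BMP images and remux them to NUT for uv; the fifo name and frame rate are illustrative, not taken from the thread.

import subprocess
import cv2

# ffmpeg demuxes the BMP images arriving on stdin and remuxes them as
# rawvideo/NUT into the fifo that uv captures with -t file:uv_pipe.
ffmpeg = subprocess.Popen(
    [
        "ffmpeg",
        "-f", "image2pipe",
        "-vcodec", "bmp",
        "-framerate", "30",
        "-i", "-",              # BMP-encoded frames arrive on stdin
        "-c:v", "rawvideo",
        "-f", "nut",
        "-y", "uv_pipe",        # hypothetical fifo name
    ],
    stdin=subprocess.PIPE,
)

cap = cv2.VideoCapture(0)
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        ok, buf = cv2.imencode(".bmp", frame)  # BMP avoids JPEG/PNG compression cost
        if ok:
            ffmpeg.stdin.write(buf.tobytes())
finally:
    ffmpeg.stdin.close()
    ffmpeg.wait()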