RKelln opened this issue 1 year ago
A 360 projection system!! Wow, that would be a première for vimix! :)
> The resolution is 7680x742, so I tried a quick test today (using the flatpak) by making a new project with that custom resolution. I can use the Geometry interface to position videos, great.
good
> The Display interface doesn't show the full window size, but hopefully that's not a problem.
The Display view shows the screens actually connected to the computer: if you plug in a monitor, it appears here. At the place where you will do the projection, can you connect a dummy plug to your computer to create a virtual screen of the target resolution? Or can you run vimix on the machine connected to the displays?
> 0013 Video Broadcast uses hardware-accelerated encoder (vaapih264enc)
> 0014 Video Broadcast started SRT on port 7070
> 0015 Frame capture : Unnexpected EOF signal (no space left on drive? File deleted?)
Replicated
> I do not get the same error with projects with more typical sizes, so I'm wondering if it has something to do with the resolution? What can you recommend to test? Thanks!
Exactly! It is a problem with hardware-accelerated capture of frames: the resolution is too big for the GPU frame capture. It works if you disable 'Hardware en/decoding' in Settings (but indeed, then all decoding will be on the CPU).
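As a side note, a quick way to see which of these encoder paths a given machine offers is a small probe like the following (a sketch: the element names are the usual GStreamer ones, and the script only reports what gst-inspect-1.0 finds — it prints "not found" for everything if GStreamer's tools are not installed):

```shell
#!/bin/sh
# Sketch: report which GStreamer H.264 encoders this machine offers.
# vaapih264enc = VA-API (Intel/AMD iGPU), nvh264enc = NVIDIA,
# x264enc = the software fallback used when hardware en/decoding is off.
REPORT=""
for enc in vaapih264enc nvh264enc x264enc; do
    if gst-inspect-1.0 --exists "$enc" 2>/dev/null; then
        REPORT="${REPORT}${enc}: available
"
    else
        REPORT="${REPORT}${enc}: not found
"
    fi
done
printf '%s' "$REPORT"
```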
NB: surprisingly, the documentation does not specify this resolution limit for vaapih264enc
cf. GStreamer documentation: sink resolution MAX is 2147483647: https://gstreamer.freedesktop.org/documentation/vaapi/vaapih264enc.html?gi-language=c
That 7680x742 is not accepted on my machine with NVIDIA is expected, as the nvh264enc MAX sink resolution is 4096: https://gstreamer.freedesktop.org/documentation/nvcodec/nvh264enc.html?gi-language=c#nvh264enc-page
Therefore I conclude that the vaapih264enc GStreamer module is generic and cannot advertise the hardware limit without knowing the hardware behind it; you have to test with your own hardware.
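To make that limit concrete, here is a rough pre-flight check (a sketch: 4096 is nvh264enc's documented maximum from the link above; substitute whatever your own encoder's SINK pad caps report):

```shell
#!/bin/sh
# Sketch: compare the project resolution against an encoder's advertised
# maximum. 4096 is nvh264enc's documented limit; check your own encoder's
# SINK pad caps with:  gst-inspect-1.0 vaapih264enc
WIDTH=7680
HEIGHT=742
MAX=4096
if [ "$WIDTH" -le "$MAX" ] && [ "$HEIGHT" -le "$MAX" ]; then
    VERDICT="${WIDTH}x${HEIGHT} fits within ${MAX}"
else
    VERDICT="${WIDTH}x${HEIGHT} exceeds ${MAX}: hardware frame capture will likely fail"
fi
echo "$VERDICT"
```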
Thanks! Yeah, there seem to be a bunch of issues with getting this running at a reasonable framerate. I'm going in to test tonight and got a dummy HDMI plug that helps test virtual monitors. I may do a 4K broadcast and then double that on the machine running the projectors, as I was getting 4 fps trying to run virtual screens at the full 7680 px.
EDIT: No longer valid:
Sorry to derail the issue, but I saw you just merged the beta branch, so I pulled and rebuilt just to see what that looked like, but strangely that version had no SRT broadcast option listed under Stream? I was hoping to have a local build running in case I wanted to do some hacking to get everything working for the projection.
Would also appreciate a new flatpak release if you could, thanks!
Got SRT working in compiled version.
OK, got in to test, but unfortunately was not able to make things work. :disappointed:
Using the flatpak on Ubuntu 22.04 with a 12th-gen Intel CPU and integrated graphics. Tried with both hardware and software decoding, and with h264 and JPEG variations.
Receiving the stream (as SRT caller) was a Windows 11 machine using the latest ffmpeg. We tried a number of params, generally of the format:
ffplay "srt://<vimix machine IP>:<port>?mode=caller&trans_type=live"
Errors (on receiver) looked like:
non-existing PPS 0 referenced
decode_slice_header error
no frame!
(Just tested to see if I could receive SRT using ffmpeg on the same machine as vimix and got the same errors. Hrm.)
Vimix showed no errors. They do connect: we can see a spike of traffic initially and then nothing, and we get different errors when the connection doesn't happen at all.
ffmpeg was able to use SRT to send a video though, so this worked from the same machine as vimix to that receiver:
ffmpeg -i ~/Videos/test.mp4 -f h264 "srt://:<port>?mode=listener"
Played with many different options and couldn't get anything to work. We also tried gstreamer window capture to a virtual webcam and then ffmpeg webcam to SRT, but couldn't quite get the format working (not being gstreamer or ffmpeg experts). That looked a bit like:
gst-launch-1.0 ximagesrc xid=0x2c00070 use-damage=0 show-pointer=0 ! video/x-raw,framerate=30/1 ! videoconvert ! queue ! v4l2sink device=/dev/video2
ffmpeg -f v4l2 -i /dev/video2 -pix_fmt yuv420p -c:v hevc -f hevc "srt://:<port>?mode=listener"
Note: the virtual webcam does show up fine in vimix.
Any ideas or suggestions?
Err, my bad about the compiled vimix missing SRT broadcast: for some reason I had missed installing the GStreamer 'bad' plugins on this machine. Now showing up!
Testing with gstreamer SRT receiver on same machine as vimix (compiled latest):
$ gst-launch-1.0 -v playbin uri=srt://127.0.0.1:4200
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: ring-buffer-max-size = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-size = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: buffer-duration = -1
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: force-sw-decoders = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: use-buffering = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: download = false
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: uri = srt://127.0.0.1:4200
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: connection-speed = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0: source = "\(GstSRTSrc\)\ source"
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstTSDemux:tsdemux0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0.GstMultiQueuePad:src_0: caps = video/x-h264, stream-format=(string)byte-stream
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-buffers = 5
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-time = 0
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0: max-size-bytes = 8388608
/GstPlayBin:playbin0/GstURIDecodeBin:uridecodebin0/GstDecodeBin:decodebin0/GstMultiQueue:multiqueue0.GstMultiQueuePad:sink_0: caps = video/x-h264, stream-format=(string)byte-stream
No output video window though. Same output using srtsrc and srtclientsrc.
I tried on my machine with the following receiver and it works:
gst-launch-1.0 srtsrc uri=srt://127.0.0.1:7070 latency=200 ! tsdemux ! decodebin ! videoconvert ! autovideosink
I will add more info in the wiki
Thanks, more details on the wiki would help, especially for testing and connecting with gstreamer and ffplay.
Using that receiver I can get a video test:
$ gst-launch-1.0 -v videotestsrc ! video/x-raw, height=360, width=640 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! mpegtsmux ! srtsink uri=srt://:7070
I can get ffmpeg SRT with a video working with:
$ ffmpeg -i ~/Videos/HD_test.mp4 -f h264 "srt://:7070?mode=listener&latency=200"
$ gst-launch-1.0 -v srtsrc "uri=srt://<ip>:7070?mode=caller" ! decodebin ! videoconvert ! autovideosink
Note: 127.0.0.1 does not work for the listener; only specifying the port works. Including tsdemux on the caller doesn't work either. With and without tsdemux, receiving the vimix SRT broadcast I get:
$ gst-launch-1.0 -v srtsrc "uri=srt://<ip>:7070?mode=caller" ! tsdemux ! decodebin ! videoconvert ! autovideosink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink.GstProxyPad:proxypad0: caps = video/x-h264, stream-format=(string)byte-stream
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream
/GstPipeline:pipeline0/GstDecodeBin:decodebin0/GstTypeFindElement:typefind.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream
/GstPipeline:pipeline0/GstDecodeBin:decodebin0.GstGhostPad:sink: caps = video/x-h264, stream-format=(string)byte-stream
But no video display window.
Note we are trying to do vimix to Windows with ffmpeg. When I test vimix to ffplay on Linux I get the original error:
$ ffplay -i "srt://<ip>:7070?mode=caller&latency=200"
non-existing PPS 0 referenced
decode_slice_header error
no frame!
But can receive the videotestsrc from gstreamer using ffplay.
Receiver I'm testing is Ubuntu 22.04:
$ gst-inspect-1.0 --version
gst-inspect-1.0 version 1.20.3
GStreamer 1.20.3
https://launchpad.net/distros/ubuntu/+source/gstreamer1.0
So then I dug into the code and recreated the gstreamer init:
$ gst-launch-1.0 -v videotestsrc ! video/x-raw, height=1080, width=1920 ! videoconvert ! x264enc tune=zerolatency ! video/x-h264, profile=high ! queue ! h264parse config-interval=-1 ! mpegtsmux ! srtsink name=sink "uri=srt://:7070?mode=listener&latency=200"
This causes the error in ffplay and no video window in gstreamer on the receiver. On a hunch I removed h264parse config-interval=-1 and got videotestsrc to work in ffplay and gstreamer!
So then I removed that from the code and recompiled... turned off hardware decode, and no luck in ffplay or gstreamer. Argh. But when I turned hardware decode on, I did get video briefly in ffplay before it died; gstreamer still has no output window.
I'm not sure what would be different from my linux tests to yours. :|
Thanks a lot! You indeed had quite a deep debugging session :smile:
The minimal pipeline working with vimix is:
gst-launch-1.0 srtsrc uri=srt://<ip>:7070 ! decodebin ! videoconvert ! autovideosink
Note that I do not use quotes (" ") around the parameters of srtsrc. I don't think it works like the decodebin gstreamer plugin (which parses the string as an argument).
I tried with ffmpeg and it works with:
ffplay -i "srt://127.0.0.1:7070?mode=caller&latency=200"
Note: it worked, but there were errors on h264 decoding, packet size mismatches, and visual artefacts. Also, it is normal to have to wait at least a second until the SRT connection is established.
Following your different points:
- mpegtsmux was badly configured and I added a parameter that is recommended for network streaming: after some tests with ffplay, gst-launch and Laryx, it seems to have improved the transfer quality.
- config-interval=-1 of h264parse is required because otherwise the receiver (here the SRT caller) can never get a picture description with the frames (like for key-frames) and, in my experience, the caller never starts.
I updated a bit the code of VideoBroadcaster in vimix (commit 088cf97ebf94e317d757c050a979e1fe8394081b in beta), hoping that it would make things better also on your side (here I am under Ubuntu 22.10, amd64).
Thanks! Can confirm those fixes made linux-linux SRT work with both ffplay and gstreamer.
ffplay was about 4 sec delayed and gstreamer about 1 sec, so still not ideal for live shows; are there any receiver-side options to reduce the buffer, maybe?
Got ffplay down to a half-second delay or so with this:
ffplay -fflags nobuffer -flags low_delay "srt://<ip>:7070?mode=caller&latency=200"
Here are more tips in the SRT wiki
In particular, it took me a while to understand why ffmpeg and gstreamer behave differently. With the hack of running ffmpeg twice (with a for loop in bash), vimix is happy to receive ffmpeg SRT streams!
Thanks, yeah using that I'm seeing everything working. I was able to reduce the delay when receiving to gstreamer by doing autovideosink sync=false.
> Thanks, yeah using that I'm seeing everything working.
Good!
> ffplay was about 4 sec delayed and gstreamer about 1 sec, so still not ideal for live shows; are there any receiver-side options to reduce the buffer, maybe?
The SRT protocol gives priority to not losing data, and thus includes several checks and corrections, which rely on the ability to store several frames. In my experience, it always introduces some delay indeed...
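As a rough illustration of where that delay accumulates, here is a back-of-the-envelope sum. Only the 200 ms comes from this thread (the latency=200 URI parameter); every other number is a loose assumption, not a measurement:

```shell
#!/bin/sh
# Back-of-the-envelope end-to-end delay for the SRT path.
# Only SRT_LATENCY_MS comes from this thread (latency=200);
# the other figures are assumptions for illustration.
ENCODE_MS=50        # assumed: encoding + muxing on the sender
SRT_LATENCY_MS=200  # the SRT receive buffer set with latency=200
DECODE_MS=50        # assumed: demuxing + decoding on the receiver
RENDER_MS=100       # assumed: player-side queueing and display
TOTAL_MS=$((ENCODE_MS + SRT_LATENCY_MS + DECODE_MS + RENDER_MS))
echo "rough floor: ${TOTAL_MS} ms"
```

With those assumed figures the floor lands in the half-second range observed above once the receiver-side buffering tweaks (nobuffer, sync=false) are applied.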
This is why I use direct UDP streaming for the peer-to-peer streaming between vimix instances; but then I had to implement my own handshake protocol between the two programs (here over UDP). It is fast, but not portable to other programs, as it is a custom protocol. Alternatively, the user has to know all the information and enter the settings manually; I did not go for that option.
Other protocols I saw require a server and a full implementation of a client connection mechanism: I didn't have the motivation (nor the knowledge) to implement this. Maybe you know more and can help?
Aha... I just found out that webrtcbin is soon available in the gstreamer package... This could be an option: I'll keep an eye on it.
Back to this issue with a new option: I added a mechanism to obtain a direct SRT stream on UDP from vimix, using the same mechanism as the peer-to-peer streaming. vimix can now receive stream requests from any program that follows this procedure:
1. Send the OSC message /vimix/peertopeer/request i 9000 to vimix.
2. Receive the stream with gst-launch-1.0.
3. Send /vimix/peertopeer/disconnect i 9000 to stop.
This is available on Beta branch. You can use and/or adjust this simple shell script : https://github.com/brunoherbelin/vimix/blob/beta/rsc/osc/vimix_connect.sh
So, for your example application case, the target machine only needs standard programs (liblo-tools and gstreamer on the command line) to be able to request and receive a video stream from vimix. You could also implement this in any other way you want, e.g. in Python.
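A minimal sketch of that request/receive/disconnect cycle using liblo-tools' oscsend. The values are assumptions to adjust: VIMIX_IP is your vimix machine, 7000 is assumed to be its OSC input port, and 9000 the UDP port to stream to; the actual receiving pipeline is the one in vimix_connect.sh. DRY_RUN=1 only prints the commands instead of executing them:

```shell
#!/bin/sh
# Hypothetical sketch of the peer-to-peer request cycle described above.
# VIMIX_IP, OSC_PORT (assumed vimix OSC input port) and UDP_PORT are
# placeholders; DRY_RUN=1 prints each command instead of running it.
VIMIX_IP=${VIMIX_IP:-192.168.1.10}
OSC_PORT=${OSC_PORT:-7000}
UDP_PORT=${UDP_PORT:-9000}
DRY_RUN=${DRY_RUN:-1}

run() { if [ "$DRY_RUN" = "1" ]; then echo "$*"; else "$@"; fi; }

# 1. ask vimix to start a peer-to-peer stream toward us
run oscsend "$VIMIX_IP" "$OSC_PORT" /vimix/peertopeer/request i "$UDP_PORT"
# 2. receive it here; the exact gst-launch-1.0 pipeline is in vimix_connect.sh
# 3. tell vimix to stop when done
run oscsend "$VIMIX_IP" "$OSC_PORT" /vimix/peertopeer/disconnect i "$UDP_PORT"
```

The same cycle could be driven from any OSC-capable program instead of the shell.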
I'm hoping to do a gig at the end of the month using vimix where they have a 360 projection system! (Sweet!) The resolution is 7680x742, so I tried a quick test today (using the flatpak) by making a new project with that custom resolution. I can use the Geometry interface to position videos, great. The Display interface doesn't show the full window size, but hopefully that's not a problem.
However, I'm hoping to use SRT broadcast to connect to the projection system, and when I try that it gives an error:
0013 Video Broadcast uses hardware-accelerated encoder (vaapih264enc)
0014 Video Broadcast started SRT on port 7070
0015 Frame capture : Unnexpected EOF signal (no space left on drive? File deleted?)
I do not get the same error with projects with more typical sizes, so I'm wondering if it has something to do with the resolution? What can you recommend to test? Thanks!