Closed — whymarrh closed this issue 8 years ago
You should never do something in hardware that you can do in software
Well I guess that's one take on our discussion about software vs. hardware.
In all seriousness, that does reinforce the fact that we should lean on hardware encoding and decoding where possible (both on the Raspberry Pi itself and on the topside).
Agreed! Though it's worth noting that the Pi has a RISC (ARM) processor and the topside has a CISC processor, so there are significant differences in what it means when we say 'hardware' encoding/decoding.
Also, the results from Tuesday: 200-300 ms across ten measurements using mpv, and similar with MPlayer.
Commands:
```
raspivid -t 0 -w 1296 -h 730 -fps 49 -o -
```

(Raspberry Pi, 1296x730 at 49 fps)

```
mpv --no-osc --no-cache --vo=opengl-hq --demuxer-readahead-secs=0 --demuxer-lavf-probescore=1 --demuxer-lavf-buffersize=1 --fps=$(( 49 * 8 )) -
```

(Topside)
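For context on the `--fps=$(( 49 * 8 ))` trick: telling the player to expect a much higher frame rate than the camera actually produces makes it drain its buffer as fast as frames arrive instead of pacing them out. A quick back-of-the-envelope comparison of the frame intervals involved (plain arithmetic on the numbers above, nothing else assumed):

```python
# Frame interval at the camera's real rate vs. the inflated rate the
# player is told to expect (49 fps vs. 49 * 8 = 392 fps).
camera_fps = 49
player_fps = camera_fps * 8

camera_interval_ms = 1000 / camera_fps
player_interval_ms = 1000 / player_fps

print(f"camera frame interval: {camera_interval_ms:.1f} ms")  # ~20.4 ms
print(f"player frame interval: {player_interval_ms:.2f} ms")  # ~2.55 ms
```

Any frame the player is "late" on by its own inflated clock gets displayed (or dropped) immediately, which is exactly the behaviour we want for a live feed.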
Also relevant (as we did see decreased latency with a full FOV): Picamera Camera Modes
Not sure if we've discussed this before (if we have, sorry), but: https://www.raspberrypi.org/forums/viewtopic.php?f=43&t=45793
Paying particular attention to MrBunsy » Sun Jun 02, 2013 8:26 pm:
I've been mucking about with the camera trying to get the delay down as low as possible. I can get it down to what I think is just TCP's buffering by using mplayer without a cache, and trying to play back at higher framerate than received from the camera, though I'm not entirely sure if that's necessary.
```
nc\nc.exe 192.168.1.76 5001 | mplayer\mplayer.exe -fps 30 -demuxer h264es -
```
(unlike the instructions on the blog). I also tried streaming over UDP, but the picture was broken most of the time.
Ideally I think you'd want to muck around with something to stream over udp and an intelligent buffer just large enough to get most of the frames, and ditch if the frame takes too long to collect.
We're doing the same two things he's doing with cache and framerate, but his mention of UDP is interesting. Reading through the rest of that thread, there's talk of dropping netcat in favour of UDP, though I don't really understand the networking aspect of that. The rest of the thread is worth a read; it also references a couple of other threads with good info.
And one more, with a claim of no latency using GStreamer: http://raspberrypi.stackexchange.com/questions/22288/how-can-i-get-raspivid-to-skip-h264-encoding-getting-rid-of-5-second-latency-s
For a school project, I tried some streaming options (on an RPi too!): VLC, MJPEG, GStreamer (and some other less-known ones). Using VLC and MJPEG, I had latency between 3 and 5 seconds. Using GStreamer, NO LATENCY, and at a better resolution (and lots more options)!
And if you'll use it, here is my pipeline :
```
raspivid -t 0 -w 640 -h 480 -fps 25 -b 1200000 -p 0,0,640,480 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=YOUR_IP port=YOUR_PORT
```
I won't pretend to know what all those settings mean, but does anything strike you as being something we have yet to try?
This guy uses Direct3D instead of OpenGL and a couple of other different settings, and gets some really nice latency: https://www.youtube.com/watch?v=xmE99sHBgy0
```
raspivid -t 999999 --hflip -o - -w 512 -h 512 -fps 15 | nc [IP address of your computer here] [port number of your computer here]
```

```
[netcat executable file path] -L -p [port number to communicate on] | [mplayer executable path] -vo direct3d -fps 24 -cache 512 -
```
Cache is recommended when using network streaming: https://www.mplayerhq.hu/DOCS/HTML/en/streaming.html
This seems like it's worth a try: http://raspberrypi.stackexchange.com/questions/13382/what-streaming-solution-for-the-picam-has-the-smallest-lag
With Ubuntu 14.10 and Gstreamer I reach 100 to 116 ms latency with 1280 x 720 @ 60Hz.
Thanks to @Antonvh who put me on the right track. I reproduce the solution here for later reference.
To stream from the Pi :
```
raspivid -t 0 -b 2000000 -fps 60 -w 1280 -h 720 -o - \
  | gst-launch-1.0 -e -vvv fdsrc ! h264parse ! rtph264pay pt=96 config-interval=5 \
  ! udpsink host=10.42.0.1 port=5001
```
To receive it on your computer with gst-0.10 and send it to a virtual v4l2 device (you need v4l2loopback for this):
```
gst-launch -v udpsrc port=5001 ! application/x-rtp, payload=96 ! rtph264depay \
  ! ffdec_h264 ! ffmpegcolorspace ! v4l2sink device=/dev/video1
```
Then you can open the device /dev/video1 in any software supporting v4l2 capture. For a gst-1.0 solution (v4l2loopback doesn't work with gst-1.0), see the Antonvh blog post.
[U]sing mplayer without a cache, and trying to play back at higher framerate than received from the camera
That's something that @cal-pratt had talked about. I did indeed run raspivid at 30 fps (before we changed resolutions) and MPlayer at 60 fps. In fact, one of the things I noticed was that increasing the player fps influenced the latency the most. Interesting, though, that the command shared there does not explicitly lower or turn off MPlayer's cache. I ran it with varying -cache values (e.g. 512k and 1024k) and noticed differences in latency there too.
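To put rough numbers on what -cache can cost: a full cache of that size is buffered, not-yet-displayed video. A back-of-the-envelope worst case, assuming a 2 Mbit/s stream (the bitrate used in the GStreamer pipeline quoted above) and MPlayer's -cache unit of kilobytes:

```python
# Worst-case latency contributed by MPlayer's -cache buffer alone,
# assuming a 2 Mbit/s H.264 stream. -cache is specified in kilobytes.
bitrate_bps = 2_000_000

for cache_kb in (512, 1024):
    cache_bits = cache_kb * 1024 * 8
    worst_case_s = cache_bits / bitrate_bps
    print(f"-cache {cache_kb}: up to {worst_case_s:.2f} s of buffered video")
```

In practice MPlayer only prefills a fraction of the cache before starting playback, so the real added delay is smaller than the worst case, but the scale (whole seconds per half-megabyte at this bitrate) is the point.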
As for network protocols (TCP vs. UDP): it's a hot topic. There are a lot of ways to configure TCP for performance before even considering UDP (which brings a number of new challenges in this context). Just off the top of my head: we could tinker with TCP_NODELAY, the MSS, delayed ACKs, or even slow start. Whether all (or any) of those options are applicable or useful is something we'd need to figure out, but it illustrates just how deep we can go.
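As a concrete example of the first of those knobs, TCP_NODELAY is just a socket option. A minimal sketch of how a sender would disable Nagle's algorithm so small writes (e.g. individual frames) go out immediately instead of being coalesced:

```python
import socket

# Disable Nagle's algorithm on a TCP socket so small writes are sent
# immediately rather than coalesced into larger segments -- one of the
# TCP tuning knobs mentioned above.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY, 1)

# Verify the option took effect (non-zero means set).
nodelay = sock.getsockopt(socket.IPPROTO_TCP, socket.TCP_NODELAY)
print(f"TCP_NODELAY: {nodelay}")
sock.close()
```

The other knobs (MSS, delayed ACKs, slow start) are mostly kernel-level tunables rather than per-socket options, so they'd mean sysctl changes on both ends.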
If we did go the route of UDP, we'd likely need to be a lot more clever with respect to how we receive and display frames.
Ideally I think you'd want to muck around with something to stream over udp and an intelligent buffer just large enough to get most of the frames, and ditch if the frame takes too long to collect.
My interpretation of that is that we'd need to parse the packets and decode the video ourselves. Something that, while possible, is a good bit of work.
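A minimal sketch of what that "intelligent buffer" might look like (hypothetical port and toy one-byte sequence numbers; real H.264-over-RTP packetization is considerably more involved): receive datagrams until a deadline passes, keep what arrived, and ditch the rest. It loops back to localhost so the sketch is self-contained:

```python
import socket

# Receiver: collect UDP "frame" packets, but give up after a deadline
# instead of waiting forever (the ditch-late-frames idea quoted above).
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))          # hypothetical port; 0 = pick a free one
port = recv.getsockname()[1]
recv.settimeout(0.05)                # ditch the frame if packets take >50 ms

# Toy sender standing in for the Pi.
send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(3):
    # First byte is a toy sequence number; the rest stands in for frame data.
    send.sendto(bytes([seq]) + b"frame-data", ("127.0.0.1", port))

frames = {}
try:
    while True:
        packet, _ = recv.recvfrom(2048)
        frames[packet[0]] = packet[1:]   # reassemble, keyed by sequence number
except socket.timeout:
    pass  # deadline hit: display whatever arrived, discard the rest

send.close(); recv.close()
print(sorted(frames))  # e.g. [0, 1, 2] when every toy frame arrives in time
```

The real work is everything this sketch waves away: RTP depacketization, reordering, and feeding partial data to an H.264 decoder.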
We have yet to try GStreamer. It's on the list, and from the links it does seem quite promising (though I'm always skeptical). A few other notes: the GStreamer command above uses udpsink. Since I've not tested it, I can't say anything factual, but that does seem to imply UDP (which could be a perf boost). As discussed yesterday, this issue (and ideally #2) should be resolved before we make another camera.
(Comment moved from #82)
Topside changes tested yesterday:

- Installed `mesa-vdpau-drivers` to get the Mesa VDPAU video acceleration drivers
- Added `--hwdec=vdpau` and `--vo=opengl` to enable hardware video decoding

Tests today indicated that since implementing the hardware decoding noted above, our latency has dropped to a consistent 160-200 ms.
Further options, including `raspividyuv`, are being investigated as a way to further decrease latency.
160-200 ms
The current latency is not prohibiting flight.
I'm closing this for now, we'll reopen this if something crazy happens.
refs: #2
Now that the topside computer is assembled, we can start performing video latency tests using various video players, e.g. MPlayer, mpv, GStreamer, Humble Video (#13), etc. The desired latency is 150 ms.