willowgarage / interactive-manipulation-sandbox

Playground for Interactive Manipulation code that isn't quite baked enough for a full-on repo.

ROS-UDP image streaming #159

Open jkammerl opened 11 years ago

jkammerl commented 11 years ago

Testing UDP based image streams via unreliable communication channels.

If we observe improved streaming behavior, a checkbox should be added to rviz that lets the user select the transport scheme in the image display.
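For reference, here is a minimal roscpp sketch of what such a selection would boil down to (node and topic names are placeholders): requesting the "compressed" image transport while asking roscpp to negotiate UDPROS instead of the default TCPROS.

```cpp
// Sketch: subscribe to an image topic using the "compressed" transport over
// UDPROS. "camera/image" and the node name are placeholders.
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <sensor_msgs/Image.h>

void imageCallback(const sensor_msgs::ImageConstPtr& msg)
{
  ROS_INFO("received %ux%u image", msg->width, msg->height);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "udp_image_listener");
  ros::NodeHandle nh;
  image_transport::ImageTransport it(nh);

  // Ask for the "compressed" image transport plugin and request that roscpp
  // negotiate UDPROS instead of the default TCPROS for the connection.
  image_transport::TransportHints hints("compressed",
                                        ros::TransportHints().udp());
  image_transport::Subscriber sub =
      it.subscribe("camera/image", 1, imageCallback, ros::VoidPtr(), hints);

  ros::spin();
  return 0;
}
```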

jkammerl commented 11 years ago

I tested the streaming of compressed image data via the ROS-UDP transport stack in more detail, both between the PR2 and RBH (my desktop at Willow) and between the PR2 and my apartment in San Francisco via VPN.

Unfortunately, it does not work as expected. ROS-UDP does not let you control the datagram size; this parameter is negotiated during an initial handshake (http://www.ros.org/wiki/ROS/UDPROS). Whenever a JPEG- or PNG-compressed image is transmitted via ROS-UDP, the image is split into several small UDP datagrams. As soon as a single UDP packet is lost in transit, the received compressed image data is missing a few bytes, OpenCV throws an exception during decoding, and the image is dropped. This means it works when streaming locally via UDP (since no data is lost), it works okay-ish when streaming between a PR2 and a desktop via LAN, but it does not work at all when streaming compressed image data via WiFi, since even a small amount of packet loss corrupts almost all image data.

I got similar results with the libtheora codec. In addition, that codec expects an initial header; if this header packet is lost during transmission, decoding never starts.
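For illustration, here is a minimal sketch of the failure mode described above (the topic name is a placeholder): decoding a sensor_msgs/CompressedImage by hand and dropping the frame whenever the byte stream arrived truncated.

```cpp
// Sketch: decode a sensor_msgs/CompressedImage by hand and drop the frame
// when the JPEG/PNG data arrived incomplete. "camera/image/compressed" is a
// placeholder topic.
#include <ros/ros.h>
#include <sensor_msgs/CompressedImage.h>
#include <opencv2/highgui/highgui.hpp>

void compressedCallback(const sensor_msgs::CompressedImageConstPtr& msg)
{
  cv::Mat decoded;
  try
  {
    // With UDPROS, a single lost datagram truncates msg->data; imdecode then
    // either throws or returns an empty image, so the frame must be discarded.
    decoded = cv::imdecode(cv::Mat(msg->data), CV_LOAD_IMAGE_COLOR);
  }
  catch (const cv::Exception& e)
  {
    ROS_WARN("dropping corrupted frame: %s", e.what());
    return;
  }
  if (decoded.empty())
  {
    ROS_WARN("dropping frame: decoder returned no image");
    return;
  }
  // ... hand the decoded cv::Mat on to the display ...
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "compressed_drop_on_error");
  ros::NodeHandle nh;
  ros::Subscriber sub =
      nh.subscribe("camera/image/compressed", 1, compressedCallback);
  ros::spin();
  return 0;
}
```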

So why does real-time streaming work using RTP?

Can we use ROS-UDP for real-time streaming? Maybe :) ROS-UDP is not designed for low-latency image streaming, but it nevertheless provides functionality similar to RTP, so it could work. The reason the current image_transport plugins “compressed” and “theora” don’t work is that they don’t use a container format that is robust to errors and allows frame/macroblock-based packetization: JPEGs and PNGs don’t have any container at all, and the current theora encoding uses the “Ogg” container, which does not support RTP streaming.
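As a rough illustration of what frame-based packetization means (this is not RTP itself and not any existing plugin; the header layout is made up), each datagram could be made self-describing, so the receiver can detect a lost fragment and discard only the affected frame instead of the whole stream.

```cpp
// Sketch only: an RTP-style fragment header (layout made up for illustration).
#include <algorithm>
#include <cstdint>
#include <cstring>
#include <vector>

struct FragmentHeader
{
  uint32_t frame_seq;       // which encoded frame this fragment belongs to
  uint16_t fragment_index;  // position of this fragment within the frame
  uint16_t fragment_count;  // total fragments, so a missing one is detectable
};

// Split one encoded frame (e.g. a JPEG) into datagram-sized payloads, each
// prefixed with a FragmentHeader.
std::vector<std::vector<uint8_t> > packetize(const std::vector<uint8_t>& frame,
                                             uint32_t frame_seq,
                                             size_t max_payload)
{
  const uint16_t count =
      static_cast<uint16_t>((frame.size() + max_payload - 1) / max_payload);
  std::vector<std::vector<uint8_t> > packets;
  for (uint16_t i = 0; i < count; ++i)
  {
    const FragmentHeader hdr = {frame_seq, i, count};
    const size_t offset = static_cast<size_t>(i) * max_payload;
    const size_t len = std::min(max_payload, frame.size() - offset);

    std::vector<uint8_t> pkt(sizeof(hdr) + len);
    std::memcpy(&pkt[0], &hdr, sizeof(hdr));
    std::memcpy(&pkt[sizeof(hdr)], &frame[offset], len);
    packets.push_back(pkt);
  }
  return packets;
}
```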

Another problem with ROS-UDP is that it does not provide any rate control. We would need to add a feedback channel in order to find out whether the encoding data rate matches the channel bandwidth; RTP streaming uses the companion RTCP protocol for that.
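A rough sketch of what such a feedback channel could look like on the sender side (the "image_feedback" topic and the loss-fraction message are assumptions, not an existing interface): the receiver would publish its observed packet loss, and the sender adjusts its target bitrate, similar in spirit to RTCP receiver reports.

```cpp
// Sketch: a naive sender-side rate controller driven by receiver feedback.
// The "image_feedback" topic (std_msgs/Float32 carrying the observed loss
// fraction) is an assumption, not an existing interface.
#include <ros/ros.h>
#include <std_msgs/Float32.h>

double target_bitrate = 1000.0;  // kbit/s the encoder currently aims for

void feedbackCallback(const std_msgs::Float32ConstPtr& loss_fraction)
{
  // Very naive control law: back off sharply on loss, recover slowly otherwise.
  if (loss_fraction->data > 0.01)
    target_bitrate *= 0.8;
  else
    target_bitrate *= 1.05;
  ROS_INFO("observed loss %.2f%% -> target bitrate %.0f kbit/s",
           loss_fraction->data * 100.0, target_bitrate);
  // ... push target_bitrate into the encoder (jpeg quality, theora rate, ...) ...
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "streaming_rate_control");
  ros::NodeHandle nh;
  ros::Subscriber sub = nh.subscribe("image_feedback", 1, feedbackCallback);
  ros::spin();
  return 0;
}
```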

I still think it is possible to do RTP-style streaming via ROS-UDP. However, we would need a different image transport plugin that uses a container format which supports RTP-style streaming, or a codec that can cope with partially lost data, such as VP8 or Theora (without Ogg).
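To complement the packetization sketch above (same made-up FragmentHeader), a receiver could reassemble fragments per frame and give up on a frame as soon as a newer one starts, so a lost datagram costs one frame rather than corrupting the stream; partial payloads could alternatively be handed to an error-resilient codec such as VP8.

```cpp
// Sketch only, using the same made-up FragmentHeader as above: reassemble
// fragments per frame and drop frames that never complete.
#include <cstdint>
#include <cstring>
#include <map>
#include <vector>

struct FragmentHeader
{
  uint32_t frame_seq;
  uint16_t fragment_index;
  uint16_t fragment_count;
};

class Depacketizer
{
public:
  Depacketizer() : current_seq_(0) {}

  // Returns true and fills 'frame' once all fragments of a frame have arrived.
  bool push(const std::vector<uint8_t>& pkt, std::vector<uint8_t>& frame)
  {
    if (pkt.size() < sizeof(FragmentHeader))
      return false;  // malformed packet
    FragmentHeader hdr;
    std::memcpy(&hdr, &pkt[0], sizeof(hdr));

    // A newer frame started: whatever we collected so far is incomplete.
    if (hdr.frame_seq != current_seq_)
    {
      fragments_.clear();
      current_seq_ = hdr.frame_seq;
    }
    fragments_[hdr.fragment_index] =
        std::vector<uint8_t>(pkt.begin() + sizeof(hdr), pkt.end());

    if (fragments_.size() < hdr.fragment_count)
      return false;  // still waiting (or a fragment was lost)

    frame.clear();
    for (std::map<uint16_t, std::vector<uint8_t> >::const_iterator it =
             fragments_.begin(); it != fragments_.end(); ++it)
      frame.insert(frame.end(), it->second.begin(), it->second.end());
    fragments_.clear();
    return true;
  }

private:
  uint32_t current_seq_;
  std::map<uint16_t, std::vector<uint8_t> > fragments_;
};
```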

By the way, here is an interesting overview of RTP that explains the differences between plain UDP and RTP-based streaming: http://www.cs.columbia.edu/~hgs/rtp/faq.html

P.S. I also found a bug in ROS-UDP: if you connect to a node from a machine that does not have a fully qualified domain name (for instance, when connecting to the WG network via VPN), the ROS node crashes. I’ll look into this in more detail and file a bug report on Monday.