fa-se opened this issue 8 months ago
image_transport is an extremely welcome addition.
I didn't use image_transport in the original plugins because I thought it would be more efficient to let GStreamer handle encoding and decoding. That approach turned out to be wildly impractical to integrate with image_transport.
I'd like to see image_transport take over the majority of uses, but composable_nodes (on the develop branch) makes raw transport desirable, so the repo should maintain bare publishers somewhere.
The image_transport::CameraPublisher class would also address the long-standing camera_info issue, but would probably get in the way in most cases.
It might be reasonable to use the original image_sink as a base class, with two derived classes, one for image_transport::Publisher and one for image_transport::CameraPublisher. Derived elements should only need to override the open(), close(), and render() methods.
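Something like this minimal sketch, purely conceptual since the actual elements are GObject-based C rather than C++ classes; all names here are illustrative, not the repo's real API:

```cpp
// Hypothetical sketch of the proposed hierarchy; class and method
// names are illustrative, not ros-gst-bridge's actual API.
#include <rclcpp/rclcpp.hpp>
#include <image_transport/image_transport.hpp>
#include <sensor_msgs/msg/image.hpp>
#include <sensor_msgs/msg/camera_info.hpp>

class RosImageSinkBase {
protected:
  rclcpp::Node::SharedPtr node_;
public:
  virtual ~RosImageSinkBase() = default;
  virtual void open() = 0;   // create the publisher
  virtual void close() = 0;  // tear it down
  virtual void render(const sensor_msgs::msg::Image & img) = 0;  // publish one frame
};

class ImageTransportSink : public RosImageSinkBase {
  image_transport::Publisher pub_;
public:
  void open() override {
    pub_ = image_transport::create_publisher(node_.get(), "image_raw");
  }
  void close() override { pub_.shutdown(); }
  void render(const sensor_msgs::msg::Image & img) override { pub_.publish(img); }
};

class CameraTransportSink : public RosImageSinkBase {
  image_transport::CameraPublisher pub_;
public:
  void open() override {
    pub_ = image_transport::create_camera_publisher(node_.get(), "image_raw");
  }
  void close() override { pub_.shutdown(); }
  void render(const sensor_msgs::msg::Image & img) override {
    sensor_msgs::msg::CameraInfo info;  // would come from camera_info_manager
    info.header = img.header;
    pub_.publish(img, info);
  }
};
```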
I can't find where the ffmpeg encoding string is used; how does that get picked up by the compression mechanism?

image_transport and ffmpeg aren't the smallest dependencies, so splitting the image_transport elements into an additional package (still welcome in this repo) would be a polite way of managing dependencies for embedded-systems users.
Thanks for your feedback!
I originally hacked together the above implementation for my specific use case, where I want subscribers to be able to choose at runtime the format in which they receive images published by a GStreamer/DeepStream pipeline.
Your suggestion of keeping the original image_sink as a base class and deriving from that makes perfect sense to me.
I haven't looked into image_transport::CameraPublisher yet, but I assume this is related to #49 and involves exposing camera_info_url and using camera_info_manager.
As for the ffmpeg encoding string:
Because we were recently looking into h264 support in Foxglove, I was experimenting with ffmpeg-image-transport, which exposes some ROS parameters like encoder or bitrate that can be set when the node is launched.
Since ros-gst-bridge plugins wrap ROS nodes, I saw no "good" way to pass these parameters to ffmpeg-image-transport, which is why I exposed the encoder parameter as a plugin property, allowing me to use h264_nvenc.
This is not a very pretty solution: it adds logic for configuring a specific type of transport that many users won't care about, when ideally the plugin would be agnostic to the actual transport used.
I am very interested in alternative approaches to facilitate configuration of ffmpeg-image-transport (or transports generally); the one I used was just the most obvious to me.
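For reference, a minimal sketch of the mechanism I mean, assuming the wrapped node is a plain rclcpp node; the parameter name "ffmpeg_image_transport.encoding" is illustrative and depends on the ffmpeg-image-transport version:

```cpp
// Hypothetical sketch: forward a GStreamer plugin property to a ROS
// parameter on the node wrapped by the plugin. The parameter name is
// illustrative, not necessarily what ffmpeg-image-transport reads.
#include <string>
#include <rclcpp/rclcpp.hpp>

void apply_encoder_property(const rclcpp::Node::SharedPtr & node,
                            const std::string & encoder_from_gst_property)
{
  // image_transport plugins read their settings as ROS parameters on the
  // publishing node, so setting the parameter before the publisher is
  // created lets the plugin pick it up.
  node->declare_parameter<std::string>("ffmpeg_image_transport.encoding",
                                       encoder_from_gst_property);
}
```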
Regarding the dependencies: in my understanding, ffmpeg wouldn't be a direct dependency. It should only be required if users want to use ffmpeg-image-transport, which they would have to install separately. Without ffmpeg and ffmpeg-image-transport present, the node would advertise topics only for the transports that are actually available, such as CompressedImage. Nevertheless, adding a package to contain the image_transport dependency (and potential future ones) seems reasonable.
In the coming days/weeks, I will probably implement your suggestions and think about different ways to configure ffmpeg-image-transport.
I suppose camera_info_manager would be the correct helper class to use; I haven't had to look into it before. Yes, it would need a camera_info_url property, and maybe even individual lens properties. Let's discuss that in https://github.com/BrettRD/ros-gst-bridge/issues/49
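A minimal sketch of how that could look, assuming a camera_info_url property on the element (names here are illustrative, pending the discussion in #49):

```cpp
// Rough sketch of camera_info_manager usage; the camera name and the
// camera_info_url property are assumptions, not the repo's current API.
#include <string>
#include <rclcpp/rclcpp.hpp>
#include <camera_info_manager/camera_info_manager.hpp>
#include <sensor_msgs/msg/camera_info.hpp>

void load_camera_info(rclcpp::Node * node, const std::string & camera_info_url)
{
  // Loads calibration from the URL (e.g. a yaml file) if one is given.
  camera_info_manager::CameraInfoManager manager(node, "camera", camera_info_url);
  sensor_msgs::msg::CameraInfo info = manager.getCameraInfo();
  (void)info;  // would be published alongside each frame via CameraPublisher
}
```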
> ... looking into h264 support in Foxglove
Amazing! That deserves proper support.
GStreamer has many platform-specific h264 encoders which are more efficient than anything ffmpeg can offer on embedded systems. The rpicamsrc element in particular allows the Raspberry Pi Zero to transmit beautiful compressed h264 video, but you must compress before the feed leaves the GPU: the CPU can't compress it, and there isn't enough memory bandwidth to pass raw frames back to the GPU.
The CompressedVideo message type should have a sink node that packages buffers from h264parse. That would allow you to use any upstream video compression element that suits the hardware, and even to re-payload and forward compressed binary data received over other transports.
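Something like this pipeline; note that rosh264sink doesn't exist yet, so the element name and its ros-topic property are hypothetical:

```cpp
/* Hypothetical pipeline: hardware-compressed h264 leaves the GPU and is
 * parsed into clean NAL-aligned buffers before a (not-yet-written)
 * CompressedVideo sink packages them into ROS messages. */
#include <gst/gst.h>

int main(int argc, char ** argv)
{
  gst_init(&argc, &argv);
  GError * err = NULL;
  /* "rosh264sink" and "ros-topic" are assumptions for illustration. */
  GstElement * pipeline = gst_parse_launch(
      "rpicamsrc bitrate=1000000 ! video/x-h264,width=1280,height=720 "
      "! h264parse ! rosh264sink ros-topic=image/h264",
      &err);
  if (!pipeline) {
    g_printerr("parse error: %s\n", err->message);
    return 1;
  }
  gst_element_set_state(pipeline, GST_STATE_PLAYING);
  GMainLoop * loop = g_main_loop_new(NULL, FALSE);
  g_main_loop_run(loop);
  return 0;
}
```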
The Parameter plugin already allows the Raspberry Pi Zero's h264 compression bitrate to be changed remotely on the fly as a ROS parameter. If there's a new ROS parameter convention or API that image_transport expects, it might deserve a dedicated pipeline plugin to make that API transparent to GStreamer.
I would like to work with Foxglove to make WebRTC easier to use in fleet robots, and to use the data_channels as a telemetry bridge.
The image_transport sink is still worthwhile, and seems like low-hanging fruit.
@BrettRD I would be interested in creating an h264 image sink. Would it suffice to copy the image sink and modify it to extract the binary stream and package it in the Foxglove CompressedVideo message?
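Roughly, I imagine the per-buffer step would look like this, assuming foxglove_msgs/msg/CompressedVideo (fields: timestamp, frame_id, data, format) and a GstBuffer holding one h264 access unit from h264parse:

```cpp
// Rough sketch of mapping one encoded GstBuffer to a Foxglove
// CompressedVideo message; the helper name is illustrative.
#include <string>
#include <rclcpp/rclcpp.hpp>
#include <foxglove_msgs/msg/compressed_video.hpp>
#include <gst/gst.h>

foxglove_msgs::msg::CompressedVideo to_msg(GstBuffer * buf,
                                           const rclcpp::Time & stamp,
                                           const std::string & frame_id)
{
  foxglove_msgs::msg::CompressedVideo msg;
  msg.timestamp = stamp;
  msg.frame_id = frame_id;
  msg.format = "h264";

  GstMapInfo map;
  if (gst_buffer_map(buf, &map, GST_MAP_READ)) {
    msg.data.assign(map.data, map.data + map.size);  // copy the encoded bytes
    gst_buffer_unmap(buf, &map);
  }
  return msg;
}
```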
I believe support for image_transport could be a great addition to rosimagesink (or a new plugin). image_transport allows publishing images in different formats/encodings in a way that's transparent to the publisher: the publisher publishes standard sensor_msgs/Image messages, which are encoded on demand by the available image_transport plugins. This makes it easy to offer images encoded as Image, CompressedImage, theora, h264, or h265 (via ffmpeg-image-transport, which also supports hardware encoding via nvenc) from a single rosimagesink. Images are only encoded to a given format if at least one subscriber requests it, so there's no unnecessary overhead in offering multiple formats.
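For illustration, a minimal publisher using image_transport; this is not the plugin code itself, just the API it builds on:

```cpp
// Minimal image_transport publisher: the node publishes plain
// sensor_msgs/Image, and installed transport plugins (compressed,
// theora, ffmpeg, ...) each advertise and encode their own subtopic
// only when someone subscribes to it.
#include <rclcpp/rclcpp.hpp>
#include <image_transport/image_transport.hpp>
#include <sensor_msgs/msg/image.hpp>

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  auto node = rclcpp::Node::make_shared("image_source");
  image_transport::ImageTransport it(node);
  image_transport::Publisher pub = it.advertise("image_raw", 1);

  sensor_msgs::msg::Image img;  // would be filled from the GStreamer buffer
  pub.publish(img);             // plugins encode on demand, per subscriber

  rclcpp::spin(node);
  rclcpp::shutdown();
  return 0;
}
```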
I went ahead and implemented a basic version by creating a new plugin based on rosimagesink and using image_transport for the publisher: https://github.com/fa-se/ros-gst-bridge/tree/imagetransport_sink
Is support for image_transport something you've been considering? I could polish my implementation a bit and open a pull request; I just wanted to get some feedback on the idea first.