daniilidis-group / ffmpeg_image_transport

image transport that uses libavcodec for compression
Apache License 2.0
42 stars 13 forks

Build Issues #3

Closed — siddharthayedida closed this 2 years ago

siddharthayedida commented 3 years ago

Hi, I am trying to build the workspace, but it is throwing a lot of declaration errors for the variables and functions used. Here is the screenshot: image. I am using Ubuntu 18.04, ROS Melodic.

berndpfrommer commented 3 years ago

Not sure why it doesn't compile. For me, AVHWDeviceType is defined in the build directory of my ffmpeg library under include/libavutil/hwcontext.h. Are you using your own private ffmpeg library? That's what I always do, because the standard Ubuntu one is not compiled with support for NVIDIA hardware acceleration (which was essential for my use case). If you are not already doing so, try following the instructions for building your own version of ffmpeg.

siddharthayedida commented 3 years ago

Thanks! It worked.

One more question: I am publishing a sensor_msgs/Image (raw) message and want to encode it to h264, transmit it, and decode it on a remote PC. Would I have to write a node using the functions in encode.cpp and decode.cpp, or is there another way?

berndpfrommer commented 3 years ago

You can use it like a standard ROS image transport: [ROS image publisher] -> image transport encoding -> network -> image transport decoding -> [ROS image subscriber]. The encoding/decoding is handled transparently by the image transport. You need to compile the ffmpeg image transport on both the publisher and the subscriber host, and you must "source" the environment variables again to make sure the ffmpeg image transport plugin is actually available (check with rosrun image_transport list_transports).

There was a way to tell the subscriber which image transport to use (raw, compressed, or ffmpeg) but I forgot how to do that. I think if you use something like "rqt_image_view" to decode and view the stream it should just work.

siddharthayedida commented 2 years ago

Thanks. I am trying a simple publisher with image transport and didn't see an ffmpeg compressed topic when I ran my simple pub node. I ran rosrun image_transport list_transports and found the ffmpeg plugins are not built, even though I built everything (following the instructions in the README), as shown below: image. It also suggests using catkin_make --pkg ffmpeg_image_transport. When I try that, it throws the declaration errors for the variables and functions shown below: image

berndpfrommer commented 2 years ago

Interesting. It looks like the transport is registered, but it cannot find the plugin. I don't recall seeing that. Not sure what catkin_make --pkg ffmpeg_image_transport does, because I always use catkin build. But obviously during the compile it doesn't find your own compiled ffmpeg library, which certainly is a problem.

Questions:
1) What's the output of catkin config? (You must run this in the workspace that has your ffmpeg_image_transport package.)
2) Do you have the right workspace sourced, i.e. "setup.bash" in the "devel" directory of the built workspace that has the ffmpeg_image_transport sources?

Most likely though the problem you are seeing is because the (custom compiled) ffmpeg library is not in your LD_LIBRARY_PATH. Add it like this (change the path below to where your custom ffmpeg "build" directory actually is) and see what list_transports gives you?

export LD_LIBRARY_PATH=$HOME/catkin_ws/ffmpeg/build/lib:$LD_LIBRARY_PATH

siddharthayedida commented 2 years ago

Thanks again! Exporting LD_LIBRARY_PATH solved the issue.

Now I can see the ffmpeg transport listed. I am also able to see the ffmpeg compression topic when I run my simple publisher node, but I am not able to echo the ffmpeg compressed image topic, as shown below: image

berndpfrommer commented 2 years ago

Just tested this on my machine and it works; you should get messages. First, are you testing on the same machine or is this going across the network? Strange things sometimes happen in the latter scenario, in particular when ROS_IP is not set right but ROS_MASTER_URI is set correctly. Try running locally first (unless you are already doing so). Check the window where your roscore is running to see if it prints anything interesting. Also, test whether you can rostopic hz any of the other image topics (raw, compressed, etc.) to make sure the publisher is actually publishing.

siddharthayedida commented 2 years ago

The issue was with the nvidia driver requirement. Fixed it.

Now I am able to publish/encode the image, but I am having build issues when trying to write a sample subscriber node that subscribes to the encoded message (image/ffmpeg topic). Is it appropriate to subscribe to the ffmpeg topic directly?

berndpfrommer commented 2 years ago

I have trouble decoding your question: Are you saying you still have build issues? If yes, please provide error logs etc. When using an image transport, the subscribing node typically subscribes to the image topic without the /ffmpeg suffix. The image transport then (under the hood) subscribes to the image topic with the ffmpeg suffix, decodes the image, and presents it in a callback to the node.

siddharthayedida commented 2 years ago

In my case I want to transmit only the ffmpeg-encoded message to a remote PC via UDP. On the remote PC I will receive the UDP buffer and have a publisher that republishes the same ffmpeg message, plus a subscriber to decode it. In this case the remote PC won't have a raw image topic, so how would I subscribe to the image topic?
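(Editor's aside, not from the thread: a relay like the one described above has to put each serialized ffmpeg packet into a UDP datagram and recover it intact on the other side. Below is a minimal stdlib-only Python sketch of one possible framing. The names frame/unframe are hypothetical helpers, and the actual ROS message (de)serialization is assumed and not shown; "payload" stands for the serialized ffmpeg-packet bytes.)

```python
# Length-prefixed framing sketch for a hypothetical UDP relay of
# ffmpeg-packet messages. The ROS serialization step is assumed;
# only the byte-level framing is shown.
import struct

def frame(payload: bytes) -> bytes:
    """Prefix the payload with its 4-byte big-endian length."""
    return struct.pack(">I", len(payload)) + payload

def unframe(datagram: bytes) -> bytes:
    """Recover the payload from a framed datagram, checking the length."""
    (length,) = struct.unpack(">I", datagram[:4])
    payload = datagram[4:]
    if len(payload) != length:
        raise ValueError("truncated datagram: expected %d bytes, got %d"
                         % (length, len(payload)))
    return payload

# Round-trip check on stand-in packet bytes:
packet = b"\x00\x01\x02ffmpeg-packet-bytes"
assert unframe(frame(packet)) == packet
```

Note that encoded keyframes can easily exceed a single datagram (UDP tops out near 64 KB, and the usual MTU is far smaller), so a real relay would also need fragmentation and reassembly — complexity that the republish-over-TCP approach suggested below avoids.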

berndpfrommer commented 2 years ago

So you want to use UDP. Interesting. Not sure how far you'll get with that, or how much help I can be. Here are some cautionary notes on using ROS over UDP: http://wiki.ros.org/Topics

Please first see if you can subscribe via TCP from a remote node, using something like rqt_image_view, and see if that works. That's the first step. That republishing node you are talking about: is it supposed to republish the raw ffmpeg packets, or the decoded image? If the latter, that's easy, because then you just subscribe with an image transport from the remote node and the callback you get holds the decoded image.

I believe you can even make this work without the publisher running an image transport. Try this: record the image/ffmpeg topic into a rosbag, then run a republish node (see http://wiki.ros.org/image_transport) and explicitly specify ffmpeg as the input transport. See if that works. This is a good test because rosbag play does not instantiate any image transport, so it's really just playing the bare ffmpeg packets.

siddharthayedida commented 2 years ago

So finally my decoder is working, but the decoded image looks distorted as shown below. My image resolution is 3088x2064 (rgb). I have also tried reducing the resolution to 1900x1200, but it's still the same. Any clue what might be done to get a good decoded image?

The left one is the raw image from the camera; the right one is the decoded image: image

berndpfrommer commented 2 years ago

No idea what's going wrong. I'd suggest hooking up the 1900x1200 image directly on the same node; it should work. What encoder are you using, NVIDIA nvenc? What is the resolution advertised in the ffmpeg messages? They have image_width and image_height fields. Please try this without a republish node: hook up directly via image transport using rqt_image_view on the same node that is publishing the messages.

siddharthayedida commented 2 years ago

I am using the "hevc_nvenc" encoder, and ffmpeg advertised the correct resolution (1900x1200 in my case). I hooked it up on the same node and the result is still the same, as shown below.

The left-most screenshot is the advertised ffmpeg message, the middle one is the rqt_image_view of the ffmpeg transport, and the right-most is the image from image transport. Not sure what else could be done?

image

berndpfrommer commented 2 years ago

Upload a very short rosbag (just a few frames) to e.g. Google Drive, with the original raw images and the ffmpeg-encoded messages. I'll have a look at it.

siddharthayedida commented 2 years ago

Hi, I uploaded a rosbag with the raw images and the ffmpeg-encoded messages. Here is the link:

https://drive.google.com/file/d/13tHCyO7vA4yF9iiDrlut8DuykV-n07Y-/view?usp=sharing

berndpfrommer commented 2 years ago

I can reproduce the issue but I don't know the root cause. Unfortunately I'm very busy at the moment so it may take a few days before I can analyze this further.

berndpfrommer commented 2 years ago

Looked at it today several times and verified that it works with my images, but they are 1920x1200, while yours are 1900x1200. Note the difference in line width. Can you resize to 1920x1200 and try? I suspect that the nvidia cards require the line size to be a multiple of 32, because similar problems were reported in issue #2, where the line size was 720.
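(Editor's aside: the alignment fix suggested above is simple integer arithmetic. A small sketch — the multiple-of-32 requirement is the hypothesis from this thread, not documented nvenc behavior:)

```python
def round_up(width: int, multiple: int = 32) -> int:
    """Round width up to the next multiple (e.g. for suspected nvenc line alignment)."""
    return ((width + multiple - 1) // multiple) * multiple

# The 1900-pixel-wide image from this thread would need padding to 1920:
print(round_up(1900))  # -> 1920
print(round_up(1920))  # -> 1920 (already aligned)
```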

siddharthayedida commented 2 years ago

Thanks! You are right, the width should be a multiple of 32. Now I am trying to subscribe to the ffmpeg topic on the host side and transmit it to the remote side as UDP packets. On the remote side I receive the ffmpeg UDP packets and try to publish the ffmpeg message using image transport, but image transport won't let me publish the ffmpeg message. Any idea how to solve this?

berndpfrommer commented 2 years ago

Can you not just run a republish node on the remote side with input transport "ffmpeg" and output transport "raw"? See http://wiki.ros.org/image_transport

siddharthayedida commented 2 years ago

I tried republishing on the remote side with input transport "ffmpeg" and output transport "raw", and end up with the error below, even though I have all the image transports available:

image

berndpfrommer commented 2 years ago

You have ":" instead of ":=". I only have limited time on hand to support my software. Please do basic syntax checking before posting on GitHub!

siddharthayedida commented 2 years ago

I tried with the correct syntax, and it seems all the image transport topics are being republished, but with nothing (no data) in them.

SimiVoid commented 2 years ago

I have a problem building this package on a Jetson TX2: Screenshot from 2022-02-16 21-55-13. I don't know what I did wrong, because the day before I built this package on another Jetson TX2.

berndpfrommer commented 2 years ago

Sorry, I have not seen this one yet. No idea what's going on there.

berndpfrommer commented 2 years ago

Closing due to inactivity.