Dear Eduardo
I must admit, your question is rather one for the ROS community. I do not know how to use two cameras with ROS; until now we have only used a single camera.
Therefore, please ask the ROS community.
One question: Does
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-rgb, width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! ximagesink
show a live stream?
Stefan
Thank you for your reply Stefan,
the live stream is not shown with the following command (it prints the error message below):
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-rgb, width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! ximagesink
Setting pipeline to PAUSED ...
libv4l2: error set_fmt gave us a different result then try_fmt!
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Device '/dev/video0' cannot capture in the specified format
Additional debug info:
gstv4l2object.c(2218): gst_v4l2_object_set_format (): /GstPipeline:pipeline0/GstV4l2Src:v4l2src0: Tried to capture in RGB3, but device returned format GREY
Setting pipeline to NULL ...
Freeing pipeline ...
I understand that the question can also be appropriate for the ROS community, but maybe the TIS community has more experience on that.
Hi Eduardo
Supporting the outdated GStreamer 0.10 is a little bit hard, because some of the old modules are missing. Please try
gst-inspect-0.10 | grep bayer
If you do not get an output containing "bayer2rgb", you won't get color images from the camera; you will only get Bayer raw data, which shows a small checkerboard pattern. Unfortunately you need that bayer2rgb module, because the DFK 23 series does not perform the debayering on board.
The pipeline
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-gray, width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! ximagesink
should give a grayscale image.
In case you checked out the "gstreamer-0.10" branch, you get the GStreamer 0.10 tiscamera modules. After you have built and installed them, you can use
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-gray, width=640,height=480,framerate=30/1 ! tisvideobufferfilter ! tis_auto_exposure ! ffmpegcolorspace ! ximagesink
Now the exposure time is adjusted automatically, and the tisvideobufferfilter module drops incomplete frames. However, the image will still look grayscale.
As far as I know, there is a ROS module that supports GStreamer 1.0. I have never tested it myself, but you could try it.
Stefan
Hi Stefan,
gst-inspect-0.10 | grep bayer
does not return anything.
and
gst-launch-0.10 v4l2src device=/dev/video0 ! video/x-raw-gray, width=640,height=480,framerate=30/1 ! ffmpegcolorspace ! ximagesink
just opens a window with a black image (not the real image that I can see with ./tis_rosstarter)
I'm on the master branch of tiscamera.
My main goal is to record a synchronized sequence to disk. I started trying to do that with ROS because it seemed the easiest option after looking at the examples inside tiscamera, but I can use other alternatives if you think there are easier options.
Maybe the black image comes from a very low exposure time. So you only want to save AVI files or images to disk, and not control a robot?
Stefan
That's it. I need to record stereo sequences that I will later use for stereo calibration, visual odometry and semantic segmentation; that's all. But I need to do it on Linux.
Hello Eduardo,
Then ROS is not necessary. With the master branch you get the tcam-capture program. It handles one camera, but you can start the program twice, once for each camera, and capture AVI files with it. How to get matching image pairs out of the two video files is another matter.
I would start with a C++ program or Python script that creates a pipeline for each camera and starts the pipelines at more or less the same point in time. It may be a good idea to look at https://www.linuxtv.org/wiki/index.php/GStreamer first to get an idea of GStreamer.
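Very roughly, such a program could be sketched like this (an untested outline; the serial numbers, caps and the fixed run time are placeholders you would have to adapt):

// Sketch: one GStreamer pipeline per camera, both set to PLAYING back to back.
// Serial numbers, caps and the fixed 10 s run time are placeholders.
#include <gst/gst.h>

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstElement *pipe1 = gst_parse_launch(
        "tcambin serial=11111111 ! video/x-raw,format=BGRx,width=1280,height=720,framerate=30/1"
        " ! videoconvert ! ximagesink", nullptr);
    GstElement *pipe2 = gst_parse_launch(
        "tcambin serial=22222222 ! video/x-raw,format=BGRx,width=1280,height=720,framerate=30/1"
        " ! videoconvert ! ximagesink", nullptr);

    // Start both pipelines as close together in time as possible.
    gst_element_set_state(pipe1, GST_STATE_PLAYING);
    gst_element_set_state(pipe2, GST_STATE_PLAYING);

    // Run for a while; a real program would use a GMainLoop and appsink callbacks
    // to grab and save the frames instead of just displaying them.
    g_usleep(10 * G_USEC_PER_SEC);

    gst_element_set_state(pipe1, GST_STATE_NULL);
    gst_element_set_state(pipe2, GST_STATE_NULL);
    gst_object_unref(pipe1);
    gst_object_unref(pipe2);
    return 0;
}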
Unfortunately I do not have a complete, tested sample for this yet.
How do you read the AVI files for your calibration stuff?
Stefan
Hi Stefan, I will need to save the image sequences with timestamps (I prefer to avoid video files). What I have done before with other cameras is to save each camera's image sequence in a separate directory, using the timestamp to name the images. That way, the synchronized images share the same timestamp-based name.
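Roughly, the naming I have in mind works like this (directory names are only examples):

// Sketch of the naming scheme: both images of a pair are saved under the same
// timestamp-based file name; only the directory differs per camera.
#include <chrono>
#include <cstdio>
#include <string>

std::string timestamped_name(const std::string &camera_dir,
                             std::chrono::system_clock::time_point t)
{
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(
                  t.time_since_epoch()).count();
    char buf[128];
    std::snprintf(buf, sizeof(buf), "%s/%lld.jpg",
                  camera_dir.c_str(), static_cast<long long>(us));
    return std::string(buf);
}

// Usage: timestamped_name("cam_left", t) and timestamped_name("cam_right", t)
// differ only in the directory, so the pair can be matched by file name.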
I will take a closer look at GStreamer for that. But after browsing it, I think the main problem will still be getting synchronized sequences of images. It's a pity that the utilities and libraries provided by TIS for this only support Windows.
Eduardo
It's a pity that the utilities and libraries provided by TIS for this only support Windows.
Well, what you git-cloned is more or less Linux-only... And even IC Capture does not do what you want to do.
But the good news is, I now know what you want to do and have an idea how to solve it. For this I have two questions: Which frame rate do you want to achieve? (At the very least, that will be limited by the speed of your hard disc.) And which programming language do you prefer?
Thank you in advance!
Stefan
I'm looking for a frame rate of 30/1 and a resolution of 1280x720, with color images. I prefer C++, but python could work as well. Thank you very much for your help Stefan!
Eduardo
30 fps for two cameras could be a little bit too fast for your hard disc. How long are the sequences you want to capture? We should consider capturing them in memory and saving them after the sequence has been captured.
I was unclear about the image format; I meant still images such as BMP or JPEG.
C++ is fine for me.
Currently my Linux system is performing the 18.04 update, so I cannot start hacking right now.
Stefan
Actually I need to capture very long sequences of about 30 min, so the data needs to be saved almost directly to disk. I have an SSD and the data transfer should not be a problem for that resolution and frame rate. JPEG format is OK.
Eduardo
Well, in our experience, saving longer sequences at 60 frames per second (you have two cameras) will not work, even on fast SSDs. We tested that on Windows. Maybe the hard disc handling is different on Linux, but I cannot guarantee that all images will be saved. On my older Linux computer this won't work.
However, it is worth a try.
Stefan
OK, I could reduce the resolution or the frame rate to 20 fps if it does not work.
Hi Stefan, did you find out how I can do the synchronized recording?
Eduardo
Please excuse me for not being that fast at writing a program for you.
Stefan
Thank you for that Stefan. I didn't know that you were writing the program for me.
Hello Eduardo
Here we go: tcam-stereo-capture.zip. It is the source code of a command line application. You need to adapt serial numbers, video formats and frame rates before compiling. See the included Readme.html.
I hope that helps.
Stefan
Thank you for the code Stefan. I'm going to test it and I will let you know.
Eduardo
I am curious whether this will work for you. Do not hesitate to ask questions about it. At least it was fun writing the sample.
Stefan
Dear Stefan,
I tested your program and it seems to work :-) Thank you very much!
There are 2 points that I wanted to discuss with you:
1) I'm trying to use the highest possible resolution and thus I set the frame rate accordingly with
set_capture_format("BGRx", FrameSize{1920,1200}, FrameRate{15,1})
but the program fails for frame rates >= 20 fps with the following error (and the same happens for resolution {1920,1080}):
(tcamstereocapture:17800): GStreamer-CRITICAL **: gst_caps_get_structure: assertion 'index < GST_CAPS_LEN (caps)' failed
(tcamstereocapture:17800): GStreamer-CRITICAL **: gst_structure_get_value: assertion 'structure != NULL' failed
(tcamstereocapture:17800): GStreamer-CRITICAL **: gst_structure_set_value: assertion 'G_IS_VALUE (value)' failed
(tcamstereocapture:17800): GStreamer-CRITICAL **: gst_structure_get_value: assertion 'structure != NULL' failed
(tcamstereocapture:17800): GStreamer-CRITICAL **: gst_structure_set_value: assertion 'G_IS_VALUE (value)' failed
(tcamstereocapture:17800): GStreamer-CRITICAL **: gst_structure_get_value: assertion 'structure != NULL' failed
(tcamstereocapture:17800): GStreamer-CRITICAL **: gst_structure_set_value: assertion 'G_IS_VALUE (value)' failed
2) When there are frame drops (especially at higher resolutions) I get a message like:
2.Trigger Did not receive an image from all camera 1.
After checking the saved images, I see that the corresponding image exists (for a drop message), which seems to be a bug in the program (e.g. for the case above, cam1 at 2.Trigger, I get a saved image named 00003).
Hello Eduardo
If more than 15 fps does not work, I guess your cameras are connected to a USB 2.0 controller. You can check this with the tcam-ctrl program:
tcam-ctrl -c <serial>
If the camera is connected to USB 3.0, you will get
Available gstreamer-1.0 caps:
video/x-raw, format=(string)GRAY8, width=(int)1920, height=(int)1200, framerate=(fraction){ 54/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
which gives 54 fps max.
Connected to USB 2.0 you will get the output
Available gstreamer-1.0 caps:
video/x-raw, format=(string)GRAY8, width=(int)1920, height=(int)1200, framerate=(fraction){ 15/1, 10/1, 5/1 };
which indicates only 15 fps max
I connected my camera to a USB 3.0 controller. The maximum allowed frame rate for both cameras running on this controller is 30 fps.
Now to your second issue: it is a timeout problem. Getting the images is fast, but saving them to hard disc is slow; I guess I mentioned that somewhere above. For saving both images to hard disc the timeout loop seems to wait too briefly, so you get a saved image even though the message indicated there was none. On my old Core i5 I changed the timeout wait loop like this:
// Wait with timeout until we got images from both cameras.
int tries = 10;
while( !( CustomData1.ReceivedAnImage || CustomData2.ReceivedAnImage) && tries > 0)
{
    usleep(100000);
    tries--;
}
You may play with this timeout on your own.
Stefan
Hi Stefan,
You're right, I had connected one of my cameras to a USB 2.0 port; when I connect it to USB 3.0 it works at higher resolution/fps.
And thank you for the clarification about the timeout for receiving/saving images. Just one more question, about the line
while( !( CustomData1.ReceivedAnImage || CustomData2.ReceivedAnImage) && tries > 0)
shouldn't we use AND to check that we have both images, i.e. !( CustomData1.ReceivedAnImage && CustomData2.ReceivedAnImage )?
while( !( CustomData1.ReceivedAnImage || CustomData2.ReceivedAnImage) && tries > 0)
shouldn't we use AND to check that we have both images?
No, because it was
!CustomData1.ReceivedAnImage && !CustomData2.ReceivedAnImage
Which goes to
!(CustomData1.ReceivedAnImage || CustomData2.ReceivedAnImage )
It's Boolean algebra (De Morgan's law).
BTW, I am thinking about moving the saving part from the callback into the main loop, because the CustomData struct contains the cv::Mat. Then you could save the images after they have been received, and the save process itself would not affect the timeout.
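A minimal sketch of that idea (assuming a CustomData struct roughly like the one below; the real layout in the sample may differ):

// Sketch: the callback only stores the frame and sets the flag; the main loop
// calls save_pair() after both flags are set, so cv::imwrite() no longer runs
// inside the callback and cannot eat into the trigger/timeout budget.
#include <opencv2/imgcodecs.hpp>
#include <string>

struct CustomData
{
    bool    ReceivedAnImage = false;
    cv::Mat LastImage;          // copied from the GstSample in the callback
};

void save_pair(CustomData &cam1, CustomData &cam2, const std::string &name)
{
    cv::imwrite("cam1/" + name + ".jpg", cam1.LastImage);
    cv::imwrite("cam2/" + name + ".jpg", cam2.LastImage);
    cam1.ReceivedAnImage = false;
    cam2.ReceivedAnImage = false;
}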
Stefan
You're right, thank you for the clarification. I also think that saving the images outside of the callback makes sense.
Hello Eduardo
here comes the updated source: tcam-stereo-capture.zip
The image saving is now called from the main loop. Currently I am not sure whether the usleep() works fine on my computer, because it reports timeouts where none are expected, but I also do not see the program waiting.
The documentation is also updated in the zip file.
Have a nice weekend!
Stefan
Thank you very much Stefan! Have a nice weekend too!
Eduardo
I have to apologize. You are right about the "&&".
// Wait with timeout until we got images from both cameras.
int tries = 8;
while( !( CustomData1.ReceivedAnImage && CustomData2.ReceivedAnImage) && tries >= 0)
{
    usleep(10000);
    tries--;
}
This loop waits roughly 80 ms for images. At 30 fps that is somewhat more than twice the frame period, within which you can expect an image to be delivered.
I am very sorry for my error.
Stefan
Hi Stefan,
I have been using your program and so far it works well for getting pairs of synchronized images for stereo calibration, where I need a low frame rate like 5 Hz. Then I wanted to get color sequences at 1920x1200 and 20 Hz, but the real frame rate that I get is much lower (around 10 Hz). I have played with different frame rate values (10/20/25/30/40), but I always get less than half of the value that I set (except for 5 Hz, for which I get around 4 Hz). I tried switching to SoftwareTrigger, but the problem persists. I also stopped saving the images to disk, but the problem persists. Do you have any idea how to solve this? Besides, the time between subsequent images has high variability, as if the trigger is not doing its job.
Also, my colleagues from another lab have built a 2nd stereo camera like the one I have, and we are wondering if it's possible to make a ROS module for synchronized stereo acquisition. Is that possible?
Thank you for your help
Hello Eduardo
If both cameras are connected to the same USB 3.0 controller, the cameras will only run at half the frame rate. However, in your case that should still be 25 fps.
The other thing is the speed of your hard disc. You cannot save 50 single images per second at that resolution to the hard disk; that is far too much data.
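As a rough estimate (assuming uncompressed BGRx buffers before JPEG encoding): 1920 x 1200 x 4 bytes is about 9.2 MB per frame, so 50 frames per second means roughly 460 MB/s of raw data to move and encode. JPEG compression shrinks what ends up on disk considerably, but the encoding itself also costs CPU time.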
Usually AVI (video capture) is used for this task. But you must expect frame drops due to hard disc speed, so synchronizing the images of the two files becomes very hard.
Regarding the ROS module: I am the wrong person to ask. That is better asked in the ROS community, and it won't speed up saving to the hard disc.
Stefan
The cameras are actually connected to the same USB 3.0 controller, but as I told you, I'm checking the real frame rate without saving the images to disk (so, that's not the problem).
After some more checks, it looks like the real frame rate is around 10 Hz whenever I set it to a value equal to or above 20. That means I get a similar frame rate whether I set it to 25 or to 50. Do you have an idea why this happens?
Hmm. To be checked. What does the tcam-capture program say about the real frame rate?
I get the following output:
Available gstreamer-1.0 caps:
video/x-raw, format=(string)GRAY8, width=(int)1920, height=(int)1200, framerate=(fraction){ 54/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-raw, format=(string)GRAY8, width=(int)1920, height=(int)1080, framerate=(fraction){ 60/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-raw, format=(string)GRAY8, width=(int)1280, height=(int)720, framerate=(fraction){ 120/1, 100/1, 90/1, 80/1, 70/1, 60/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-raw, format=(string)GRAY8, width=(int)640, height=(int)480, framerate=(fraction){ 5000000/20833, 200/1, 150/1, 145/1, 120/1, 100/1, 90/1, 60/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg, width=(int)1920, height=(int)1200, framerate=(fraction){ 54/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg, width=(int)1920, height=(int)1080, framerate=(fraction){ 60/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg, width=(int)1280, height=(int)720, framerate=(fraction){ 120/1, 100/1, 90/1, 80/1, 70/1, 60/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg, width=(int)640, height=(int)480, framerate=(fraction){ 5000000/20833, 200/1, 150/1, 145/1, 120/1, 100/1, 90/1, 60/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-raw, format=(string)GRAY16_LE, width=(int)1920, height=(int)1200, framerate=(fraction){ 27/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-raw, format=(string)GRAY16_LE, width=(int)1920, height=(int)1080, framerate=(fraction){ 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-raw, format=(string)GRAY16_LE, width=(int)1280, height=(int)720, framerate=(fraction){ 60/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-raw, format=(string)GRAY16_LE, width=(int)640, height=(int)480, framerate=(fraction){ 120/1, 100/1, 80/1, 70/1, 60/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg16, width=(int)1920, height=(int)1200, framerate=(fraction){ 27/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg16, width=(int)1920, height=(int)1080, framerate=(fraction){ 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg16, width=(int)1280, height=(int)720, framerate=(fraction){ 60/1, 50/1, 40/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 };
video/x-bayer, format=(string)gbrg16, width=(int)640, height=(int)480, framerate=(fraction){ 120/1, 100/1, 80/1, 70/1, 60/1, 30/1, 25/1, 20/1, 15/1, 10/1, 5/1 }
Hello
I made a test with a similar program, video format 1920x1200. At 40 fps I got 347 frames in 10 seconds; removing the output with ximagesink, I got 400 frames in 10 seconds. At 54 fps I got 541 frames without ximagesink and 528 frames with ximagesink.
It does not matter whether you use GRAY8 or BGRx.
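The counting for such a test can be sketched roughly like this (an untested outline, not the exact test program; the pipeline string and format are placeholders):

// Sketch: count the buffers arriving at the sink over a fixed period to measure
// the frame rate the camera actually delivers.
#include <gst/gst.h>
#include <atomic>

static std::atomic<int> frame_count{0};

static GstPadProbeReturn count_buffer(GstPad *, GstPadProbeInfo *, gpointer)
{
    ++frame_count;
    return GST_PAD_PROBE_OK;
}

int main(int argc, char *argv[])
{
    gst_init(&argc, &argv);

    GstElement *pipeline = gst_parse_launch(
        "tcambin ! video/x-raw,format=GRAY8,width=1920,height=1200,framerate=40/1"
        " ! fakesink name=sink", nullptr);

    GstElement *sink = gst_bin_get_by_name(GST_BIN(pipeline), "sink");
    GstPad *pad = gst_element_get_static_pad(sink, "sink");
    gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_BUFFER, count_buffer, nullptr, nullptr);

    gst_element_set_state(pipeline, GST_STATE_PLAYING);
    g_usleep(10 * G_USEC_PER_SEC);          // measure for 10 seconds
    gst_element_set_state(pipeline, GST_STATE_NULL);

    g_print("frames in 10 s: %d\n", frame_count.load());

    gst_object_unref(pad);
    gst_object_unref(sink);
    gst_object_unref(pipeline);
    return 0;
}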
Maybe your USB 3 controller is too slow? Or you have too much CPU load, so you get damaged images and therefore frame drops?
Stefan
Hi Stefan, I actually have a significant CPU load when I run the program. If I disable the ximagesink image display I still get a large CPU load and a much lower frame rate than the one I set. I don't know how to verify the speed of my USB 3 controller. Do you have any idea how I can track down the problem?
Hello Eduardo
Which computer model do you use? I use an old Core i5.
Stefan
I'm using an HP ZBook 15 G3 laptop. It's 2 years old, but with high specifications (i7, 32 GB RAM, 512 GB SSD, ...).
Also, below is the USB information I get from lshw:
*-usb
   description: USB controller
   product: Sunrise Point-H USB 3.0 xHCI Controller
   vendor: Intel Corporation
   physical id: 14
   bus info: pci@0000:00:14.0
   version: 31
   width: 64 bits
   clock: 33MHz
   capabilities: xhci bus_master cap_list
   configuration: driver=xhci_hcd latency=0
   resources: irq:124 memory:e4420000-e442ffff
Hello Eduardo
"less than half" regardless the set frame rate. That remembers me to a problem on Jetson TX2 boards, where engineers limited the maximum USB transfer size. Images bigger than a particular size are received also with "less than half" speed. We solved this problem by specifying the USB Transfer Size in the camera with a script. However, that works in the USB 33U cameras, not in the 23U cameras. On the Jetson TX2 board is no way to specify a bigger USB Transfer size. Maybe this is possible on your laptop, but I do not think so.
What happens if you use a lower resolution, e.g. 1024x768? I guess you will receive the full frame rate.
Stefan
When I reduce the resolution the problem persists, even with 640x480. Might the problem come from the trigger?
Maybe. You could try without the trigger and see what happens.
The same thing happens without the trigger :-(
Then it is a computer problem. Did you try tcam-capture? It shows the current fps in the lower right corner.
Yes, I tried it and it also showed a deficient frame rate. But I know that when I tried it some time ago it worked perfectly. So I disconnected the cameras, connected them again, ran tcam-capture, and it worked fine at the right frame rate. Then I ran the stereo camera program again, and it still drops too many frames. And after that, running tcam-capture again also gives a deficient frame rate.
My conclusion: when I use the stereo capture program, it sets some options in the cameras that result in a very high frame drop rate. What do you think?
I am clueless. I have never heard of something like that, except when one program sets properties and another one unknowingly uses them and gets weird results. What about the exposure time? Which exposure time do you use?
Sorry, I have to guess around.
Hi Stefan, the slow frame rate came from my own mistake. I was playing around with different parameters and had mistakenly replaced "Softwaretrigger" with "TriggerMode" in the for loop. Then, since the behavior was not what I expected, I forced the program to exit without restoring the correct options in the cameras. That's probably why tcam-capture did not work properly.
After correcting this issue I'm getting a frame rate of around 25fps for maximum resolution, and that's enough for my application.
One more question about the trigger: in my camera setup I have an IMU from Xsens that produces the pulses to synchronize the cameras through the cameras' BNC connectors. In this case, do I still need the Softwaretrigger in order to obtain synchronized images?
Hi Eduardo
Good to read that you sorted the problem out.
If your IMU generates trigger pulses, you do not need to issue the software trigger. While the cameras run in trigger mode, the hardware trigger from your IMU will make the cameras expose images, so the software-trigger part of the loop in the source code is not needed. The only hard part will be finding the image pairs in case of a frame drop, but I would start with the IMU trigger first and see what you get.
Stefan
OK. Thank you very much for your help Stefan, I appreciate it very much! I think that this issue can be closed already.
Hi Stefan,
I'm re-opening this issue for a question that requires remembering our previous exchange. I'm recording a stereo sequence using a modified version of your code. I have an external trigger set at 20 Hz, so I have removed the instances of SoftwareTrigger. The program works without the trigger (but probably the images are not synchronized):
TriggerMode->set(cam1,0);
TriggerMode->set(cam2,0);
But it seems that the trigger is actually working, even though it is disabled in the code, because when I reduce the frequency of the external trigger (e.g. to 4 Hz) I get the images at 4 Hz, even with
cam1.set_capture_format("BGRx", FrameSize{1920,1200}, FrameRate{m_frequency,1});
cam2.set_capture_format("BGRx", FrameSize{1920,1200}, FrameRate{m_frequency,1});
That is strange, isn't it?
What is even stranger is that if I set
TriggerMode->set(cam1,1);
TriggerMode->set(cam2,1);
I stop receiving images (the callback is not called anymore).
I went through the documentation that you created for the stereo program, but I cannot find the reason for this. It looks like the trigger mode is activated in the cameras by default when a trigger signal is received, and that TriggerMode->set(cam2,1); does not work properly?
Do you have any suggestions for debugging this problem?
Eduardo
Hi Eduardo
First of all, I would try with one camera only. Set the frame rate to at most 25 Hz; I am not sure which value offered by V4L2 is closest to 25 Hz.
However, we know that this works fine with the software trigger, so there is no problem with the image data transfer.
What does your external trigger signal look like? Does the voltage change from 0 V to 5 V or higher? Is the signal clean and not noisy? Did you connect the correct pins, 12 to - and 11 to +?
If the trigger mode is enabled, then it does not make any difference whether you use software trigger or hardware trigger. For the camera this is the same.
Stefan
Hi Stefan, once again, thank you very much for your help! We found the problem: the cable connecting one of the cameras was broken. We have replaced it and now everything works. We just have too many frame drops at 1920x1200 and 20 Hz, but I realized that setting the automatic settings (like the gain) to fixed values greatly reduces the frame drops. If you have any advice on how to reduce the frame drops further, that would be very helpful for us.
Eduardo
I have built a stereo system with two DFK 23UX174 cameras and a synchronization unit, and I'm trying to use ROS to record a synchronized video sequence. I have compiled tiscamera and I can use all the examples. I would like to create a roslaunch file that saves a synchronized sequence to disk, which I can later use for calibration. Since I have little experience with ROS, I'm starting simple with:
But I get the following error messages:
[ INFO] [1534515426.603167215]: Using gstreamer config from env: "v4l2src device=/dev/video0 ! video/x-raw-rgb width=640,height=480,framerate=30/1 ! ffmpegcolorspace"
[ INFO] [1534515426.608749658]: using default calibration URL
[ INFO] [1534515426.608807766]: camera calibration URL: file:///home/efernand/.ros/camera_info/camera.yaml
[ INFO] [1534515426.608880960]: Unable to open camera calibration file [/home/efernand/.ros/camera_info/camera.yaml]
[ WARN] [1534515426.608920420]: Camera calibration file /home/efernand/.ros/camera_info/camera.yaml not found.
[ INFO] [1534515426.608946052]: Loaded camera calibration from
[ INFO] [1534515426.629491578]: Time offset: 1533113047,013
libv4l2: error set_fmt gave us a different result then try_fmt!
[FATAL] [1534515426.741726749]: Failed to PAUSE stream, check your gstreamer configuration.
[FATAL] [1534515426.741775569]: Failed to initialize gscam stream!
Could you help me with this, please?