PauloFavero opened this issue 3 years ago
Looking forward to getting feedback on this issue :)
@PauloFavero Did you try arv-camera-test?
Hi @EmmanuelP
I just tried arv-camera-test and it works. It is streaming at a maximum of 6 frames per second.
Does this test suggest that something else is wrong in the pipeline? Or is it something related to the arvsrc element?
Thanks
It is probably an issue with your pipeline. The simplest pipeline that is supposed to work is:
aravissrc ! videoconvert ! xvimagesink
You may want to use the helper script in the gst build directory:
./gst/gst-aravis-launch aravissrc ! videoconvert ! xvimagesink
This is the setup that I am using to trigger the camera:
The Arduino is generating an 8 Hz square wave, which is why arv-camera-test-0.8 shows 8 frames per second. arv-camera-test keeps streaming even if I disconnect the trigger or the trigger frequency drops to zero at runtime.
This is the output from arv-camera-test:
$ arv-camera-test-0.8
Looking for the first available camera
vendor name = BALLUFF GmbH
model name = BVS CA-GX0-0032AC
device serial number = GX301282
image width = 2064
image height = 1544
horizontal binning = 1
vertical binning = 1
exposure = 500 µs
gain = 0 dB
payload = 3186816 bytes
gv n_stream channels = 1
gv current channel = 0
gv packet delay = 0 ns
gv packet size = 576 bytes
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
7 frames/s - 22.3 MiB/s
9 frames/s - 28.7 MiB/s
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
8 frames/s - 25.5 MiB/s
The trigger in our application is not constant or continuous: the camera is triggered when something appears in front of it, either at a certain frequency or sporadically.
We did more testing with the suggested pipeline and we narrowed down the problem a little bit. We created the following pipeline:
gst-launch-1.0 aravissrc camera-name="BALLUFF GmbH-BVS CA-GX0-0032AC-GX301282" ! video/x-bayer, format=rggb ! bayer2rgb ! video/x-raw, format=RGBA ! videoconvert ! xvimagesink
When we have a constant frame rate being generated from the external trigger (Arduino square wave) it works perfectly. The problem appears when the pipeline starts without a buffer (no trigger) or if the trigger stops while the pipeline is running. The error output log below was generated when starting the pipeline without the trigger running. The error is the same when the trigger stops.
Test with GStreamer launcher with a frequency of 0Hz.
gst-launch-1.0 aravissrc camera-name="BALLUFF GmbH-BVS CA-GX0-0032AC-GX301282" ! video/x-bayer, format=rggb ! bayer2rgb ! video/x-raw, format=RGBA ! videoconvert ! xvimagesink
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
ERROR: from element /GstPipeline:pipeline0/GstAravis:aravis0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstAravis:aravis0:
streaming stopped, reason error (-5)
Execution ended after 0:00:02.004351405
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
It seems that there is a timeout somewhere that stops the pipeline when there is no buffer being streamed.
Do you know how to proceed in this case? Does it seem to be something related to the aravissrc element?
Hi,
Thanks for the analysis.
You are probably hitting the default 2 s timeout that aravissrc uses when trying to pull a buffer while operating without a fixed frame rate.
A quick fix would be to increase this default timeout, but that obviously does not sound like a proper solution.
The correct fix requires a better understanding of the expected behaviour of the GstPushSrc::create override.
Do we have any workaround?
The easiest workaround is to increase the timeout value and recompile Aravis.
But a proper solution needs a better understanding of GStreamer. Any help welcome.
Hi Emmanuel,
Maybe I can help. I suspect your problem is not the create function but the GStreamer latency handling. At least that is where we had problems when dealing with TriggerMode.
Basically, your source must be synchronized with the GStreamer clock. Since it is a live source, it cannot give reliably increasing timestamps and instead works with durations. Ideally that would mean: first timestamp (0) + duration of frame, new timestamp + duration of frame, and so on. Since we cannot know in advance what the duration of a frame will be (some come faster, some slower, so it is never 100% accurate), GStreamer wants to know a latency range. Aravis uses the default implementation, which is either min == max == 0 or min == max == one frame duration; I could not find out which.
I suspect that is the cause of your problems. Our cameras do not offer a 0/1 framerate, so I cannot verify this. A better implementation would be min = one frame duration and max = GST_CLOCK_TIME_NONE. That basically says: expect the next buffer anywhere between the average frame duration and the heat death of the universe, giving you more than enough time for triggers, extended exposure times, etc.
A latency implementation that can almost be copied would be ours: https://github.com/TheImagingSource/tiscamera/blob/master/src/gstreamer-1.0/gsttcammainsrc.cpp#L1500
video/x-bayer, format=rggb, width=1920, height=1080, framerate=0/1
I am surprised that that is even a legal value. Setting framerate=8/1 seems like a good idea. It would certainly enable the default implementation to work better.
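For reference, this is what the pipeline from earlier in the thread would look like with an explicit framerate in the caps, as suggested above (untested here; the camera name and the 8/1 rate are taken from this thread):

```
gst-launch-1.0 aravissrc camera-name="BALLUFF GmbH-BVS CA-GX0-0032AC-GX301282" ! video/x-bayer, format=rggb, framerate=8/1 ! bayer2rgb ! video/x-raw, format=RGBA ! videoconvert ! xvimagesink
```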
To get a better idea of what the source is doing some logging might help.
gst-launch-1.0 --gst-debug=aravissrc:5,basesrc:5 aravissrc camera-name="BALLUFF GmbH-BVS CA-GX0-0032AC-GX301282" ! video/x-bayer, format=rggb ! bayer2rgb ! video/x-raw, format=RGBA ! videoconvert ! xvimagesink
Does that help or am I overlooking an obvious bit of information?
@EmmanuelP tl;dr nobody from this project knows how to add a property to aravissrc and pull it up in code as a timeout value?
@turowicz Adding yet another property is not the right solution. @TIS-Edgar gave some thoughts on how to solve this issue properly, but nobody stepped up with a fix yet.
@EmmanuelP looks like this issue is not exactly the same as one I've reported. I do not need to change the framerate.
Sorry, I was unsure whether anyone wanted to give additional input, and I had some other projects to deal with.
@PauloFavero I created a branch that implements the latency feature I described. You can find it here https://github.com/TIS-Edgar/aravis/tree/issue/528/add-gst-query-latency
Would it be possible for you to test and verify that this solves this problem? I am unable to test that with our cameras and would like to know that before making an official merge request.
@EmmanuelP Since I received no comments on the feature branch from @PauloFavero, should I simply open a merge request so that this can keep moving?
Yes please.
Should be fixed by #686. Please reopen if the issue persists.
It looks like the issue is not completely fixed, as reported here: #704
We still use a 2s timeout when waiting for a buffer.
We cannot really increase this default timeout, and busy-waiting for 2 seconds is already wrong. A proper solution requires better knowledge of how we are supposed to implement this sort of intermittent source.
Any help appreciated !
My use case is four cameras, with a hardware trigger on each except one.
I previously used a Spinnaker gst pipeline, which seems to wait for the initial buffer.
I am looking at
which seems the right place to wait for a frame, possibly checking whether a trigger is enabled.
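A pseudocode sketch of that idea (the "TriggerMode" feature name is standard GenICam, but the exact calls and control flow here are hypothetical, not the actual Aravis code):

```
/* pseudocode: before treating a pull timeout as fatal, check whether
   the camera is running in triggered mode */
if (timeout_pop_buffer(stream, 2 s) returned no buffer) {
    if (camera feature "TriggerMode" is "On")
        retry the wait;        /* no trigger has fired yet: keep waiting */
    else
        return GST_FLOW_ERROR; /* free-running camera should have produced a frame */
}
```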
@EmmanuelP the PR above has just been tested in the lab with a hardware trigger configured. It works properly: streaming starts once the triggered camera streams its first frame. Thanks
Hello, we are very interested in this feature. Will this be merged into main anytime soon? Thank you for your time.
Describe the bug
Not able to use aravissrc as a source when triggering the camera by hardware. The pipeline does not send the buffer downstream. The goal is to generate snapshots when triggering the camera from the hardware.
To Reproduce
1 - Set the camera to generate frames on demand when triggered by hardware events:
2 - Launch gstd client:
3 - In another terminal, create and play the pipeline:
4 - Trigger the camera.
In our setup, the camera's pin 4 is connected to a laser sensor. When the laser beam is interrupted, a falling edge triggers the camera to capture an image. This behavior works very well in the wxPropView tool provided by the Balluff SDK.
Expected behavior
Store a JPEG image in the filesystem when a frame is present in the pipeline.
Camera description:
Platform description:
Additional context
GSTD Logs: gstd_logs.txt
Multifilesink without async prop:
When running the pipeline without setting the multifilesink async prop to false we get the following output:
Balluff settings (wxPropView):