Open Williangalvani opened 5 years ago
I think we should get some sort of live statistics in QGC and/or companion in order to explore this more under various conditions and with different cameras.
I think a good test would be to use a simple pipeline. I suspect some uv4l settings are probably to blame.
Although in the video that I attached in the BlueRobotics forum I did not do a good job of keeping the camera at the same field of view, I thought I had tested under the same conditions across repetitions of the experiment, but apparently not.
I did another test with the spare camera I have, and indeed it is the auto exposure that fails to constrain the maximum exposure time to the set framerate.
In particular, the experiment compares the camera with an unobstructed view against the camera with my hands covering the field of view to darken the scene. The framerate does indeed drop when the auto exposure is set to its default value, which according to the get command of v4l2-ctl is 3 (aperture-priority mode). Disabling auto exposure through this command
v4l2-ctl --set-ctrl=exposure_auto=1
works (for UVC cameras, a value of 1 means manual exposure mode). Note that this was tested with the exposure left at its default (156).
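For reference, the relevant control values can also be read back with v4l2-ctl, to confirm which mode the camera is actually in. A small sketch; the device path is an assumption, adjust as needed:

```shell
# Read back the exposure controls on the camera (device path is an assumption).
DEV=${DEV:-/dev/video0}
if command -v v4l2-ctl >/dev/null && [ -e "$DEV" ]; then
    # Current values of the two controls discussed above:
    v4l2-ctl -d "$DEV" --get-ctrl=exposure_auto,exposure_absolute
    # Or list all controls with their ranges, defaults, and current values:
    v4l2-ctl -d "$DEV" --list-ctrls
else
    echo "v4l2-ctl or $DEV not available; skipping"
fi
```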
And just to test this hypothesis, I also tried changing exposure_absolute: if we set it to the maximum, the stream drops to 1fps; if we set it to the minimum, it clearly works. Experimentally, the frame dropping seems to start around a value of 310. If I have time I will plot the relationship.
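If anyone wants to reproduce the sweep, a rough sketch of it is below. The list of candidate values and the device path are assumptions; the values bracket the ~310 threshold observed above:

```shell
# Sweep exposure_absolute in manual mode and measure the capture rate at each step.
DEV=${DEV:-/dev/video0}
EXPOSURES="50 100 200 300 310 320 400 800"   # candidate values around the ~310 threshold
for exp in $EXPOSURES; do
    echo "testing exposure_absolute=$exp"
    if command -v v4l2-ctl >/dev/null && [ -e "$DEV" ]; then
        v4l2-ctl -d "$DEV" --set-ctrl=exposure_auto=1            # manual exposure mode
        v4l2-ctl -d "$DEV" --set-ctrl=exposure_absolute="$exp"
        # Capture 100 frames straight from the driver and report the measured fps:
        v4l2-ctl -d "$DEV" --stream-mmap --stream-count=100
    fi
done
```

This measures at the V4L2 level, so it isolates the camera/driver behaviour from anything GStreamer does downstream.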
I suggest adding rtpjitterbuffer to the test pipeline:
gst-launch-1.0 -v udpsrc port=5600 caps='application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264' ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! videoconvert ! fpsdisplaysink
The framerate still drops in low light, but in bright light fpsdisplaysink now displays 30fps.
That is right, but note that the rtpjitterbuffer will likely add some latency to the stream, too.
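If the added latency matters, rtpjitterbuffer's buffer depth can be tuned via its latency property (in milliseconds; the default is 200). A sketch of the same receive pipeline with a smaller buffer — the 50 ms figure is just an assumption to experiment with:

```shell
# Receive pipeline with a 50 ms jitterbuffer instead of the 200 ms default.
# Running it requires GStreamer; the command is stored in a variable here so
# the sketch itself runs anywhere.
PIPELINE="udpsrc port=5600 caps=application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 \
 ! rtpjitterbuffer latency=50 ! rtph264depay ! avdec_h264 ! videoconvert ! fpsdisplaysink"
echo "gst-launch-1.0 -v $PIPELINE"
```

Lower latency values leave less room to reorder late packets, so there is a trade-off between glitch tolerance and lag.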
I just checked whether videorate could help with this by adding it to the stream pipeline, and it didn't seem to make a difference. Annoying, because that seems to be exactly the kind of functionality we want, although in this case the source and sink presumably agree on the framerate and the source is simply failing to meet that agreement (so maybe it doesn't apply?).
Perhaps also worth noting that videorate's sink/src pads don't accept video/x-h264, just video/x-raw(ANY) and video/x-bayer(ANY), so it's possible it doesn't work for that reason.
I'm at least somewhat heartened by this line from the videorate documentation: "A conversion to a specific framerate can be forced by using filtered caps on the source pad."
Not sure how to add filtered caps, or whether it's feasible to do so, but it might be worth looking into as a next step.
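For the filtered-caps idea, it would presumably look something like the sketch below (untested). Since videorate only accepts raw video, it has to go after the decoder; the 30/1 value is the target framerate we expect from the camera:

```shell
# videorate followed by a capsfilter forcing 30 fps: videorate duplicates or
# drops frames as needed so downstream sees a constant rate.
# Running it requires GStreamer; the command is stored in a variable here so
# the sketch itself runs anywhere.
PIPELINE="udpsrc port=5600 caps=application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 \
 ! rtpjitterbuffer ! rtph264depay ! avdec_h264 ! videoconvert \
 ! videorate ! video/x-raw,framerate=30/1 ! fpsdisplaysink"
echo "gst-launch-1.0 -v $PIPELINE"
```

Note that if the source genuinely stalls (long exposures), videorate can only pad the output by duplicating the last frame, so fpsdisplaysink would read 30fps without the stream actually carrying new images at that rate.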
On an unrelated note, I had a bit more of a play, and it seems plausible that this might be a gstreamer-specific issue. I tried the V4L2 stream rate testing described here and it's very consistent: either at or just above 30fps, or at the highest rate the set exposure level allows.
In comparison, when I stream with gstreamer to my surface computer, the framerate shown in fpsdisplaysink is consistently lower, and fluctuates a decent amount regardless of lighting (although low light still generally reduces the received framerate). It doesn't seem to be dropped packets, as very few dropped frames are reported (though that may be an invalid conclusion depending on how the dropped-frame count is implemented).
The issue was previously open in https://github.com/bluerobotics/qgroundcontrol/issues/217.
Discussion was started here.
The framerate can be checked by playing the video with this gstreamer pipeline:
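The pipeline itself did not survive into this copy of the thread; presumably it is the same fpsdisplaysink receive pipeline quoted in the comments, roughly:

```shell
# Receive the UDP H264 stream and display it with an on-screen fps counter.
# Running it requires GStreamer; the command is stored in a variable here so
# the sketch itself runs anywhere.
PIPELINE="udpsrc port=5600 caps=application/x-rtp,media=video,clock-rate=90000,encoding-name=H264 \
 ! rtph264depay ! avdec_h264 ! videoconvert ! fpsdisplaysink"
echo "gst-launch-1.0 -v $PIPELINE"
```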
I was unable to fix it by playing with the camera settings. Also, while increasing the exposure time lowers the framerate, I was unable to get 30fps even with the minimum manual exposure.