Open MV10 opened 3 years ago
This line pads the horizontal resolution to a 16-byte boundary; should it be 32?
https://github.com/techyian/MMALSharp/blob/dev/src/MMALSharp/Components/MMALCameraComponent.cs#L321
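For reference, rounding up to an alignment boundary is the usual "align up" computation. A minimal sketch in Python (the function name is mine, not MMALSharp's) showing how the two boundaries differ for this mode's width:

```python
def align_up(value, boundary):
    """Round value up to the nearest multiple of boundary."""
    return (value + boundary - 1) // boundary * boundary

# With a 32-byte boundary, 1296 rounds up to 1312:
print(align_up(1296, 32))  # 1312
# With a 16-byte boundary, 1296 is already aligned:
print(align_up(1296, 16))  # 1296
# The vertical resolution pads 972 up to 976 either way:
print(align_up(972, 16))   # 976
```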
This is the RPi forum post I found that says horizontal should be padded to 32 (it even mentions the exact mode I'm using):
https://www.raspberrypi.org/forums/viewtopic.php?t=62364&start=275#p522410
> Because of the way the HW works, the width needs to be expanded to the nearest 32 bytes. So 1296 gets rounded up to 1312. The amount of valid data is still the same.
Hi Ian, while we were working on the convolution changes, you commented (here) that horizontal resolution is padded to 32 bytes and vertical is padded to 16 bytes. At the time, I also found comments in the Pi forums to that effect. However, I'm running into a problem with raw frames using a v1 camera in mode 4, which has an output resolution of 1296 x 972.
Today I started playing with that image-effects delegate idea, and when I switched my test to raw input, it failed to invoke the delegate -- the same problem I had during the convolution PR. So I turned on debug logs, and I saw the same error message that led me to the padding issue originally:
The vertical resolution is padded (972 to 976), but 1296 isn't evenly divisible by 32; it is only divisible by 16. The cell-count lookup table only lists 32-byte offsets, so my table has 1312 x 976. To be clear, this does work properly for encoded images (JPEG input rather than RGB24), but there is no match for 1296 x 976 in the table, so it throws the exception. (This is also the exception handler in `StreamCaptureHandler` that I pointed out makes a non-logging app fail silently, and I still question whether it should be swallowing exceptions when there is no logging.) I think we discussed that it might be a bug, but then it got lost in the shuffle.
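The mismatch between the table and the actual raw buffer can be sketched like this (a Python illustration under my reading of the behavior; `padded_size` is a hypothetical helper, not part of MMALSharp):

```python
def padded_size(width, height, h_align, v_align):
    """Pad an output resolution to the given horizontal/vertical boundaries."""
    align = lambda v, b: (v + b - 1) // b * b
    return align(width, h_align), align(height, v_align)

# The cell-count lookup table assumes 32-byte horizontal alignment:
print(padded_size(1296, 972, 32, 16))  # (1312, 976)
# The raw buffer I'm seeing appears to use 16-byte horizontal alignment,
# so the lookup for (1296, 976) finds no entry and the exception is thrown:
print(padded_size(1296, 972, 16, 16))  # (1296, 976)
```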
It seems horizontal reflects the output resolution and vertical reflects the padded resolution. For processing purposes, we need the padded resolutions, but probably most library clients will want the output resolution, so it seems to me that both should be stored to `ImageContext`. (See below, it really is padded to 16 ... and I suppose `ImageContext` ought to reflect the buffer; a library client can always get the requested resolution from the camera config.)