Closed nobre84 closed 11 months ago
Thank you for the information! Actually, I was a total iOS beginner when I took up this project, and we were under an urgent deadline at the time, so you will likely find some inappropriate operations in it. Feel free to change the code as necessary. Sorry that I am not able to modify it myself, since I have not had an OSX laptop at hand for a while.
in the configuration, I see that
#define ICP_MAX_DISTANCE 8.0
#define ICP_MAX_ANGLE 20.0
In the shader, outDepthMapBuffer[outvid]=1000.0/inDisparityValue;
seems to imply that the disparity values are being normalized in a way that this conversion would result in depth in millimeters. I'm guessing ICP_MAX_DISTANCE
is given in the same units as the depth values in your depth map. Could you quickly clarify the units used in the depth conversion and ICP thresholds?
Thanks!
That is right, the values are in millimeters.
When reading frames from iPhone cameras with TrueDepth sensors, one can request depth data as kCVPixelFormatType_DepthFloat16, which outputs float16 depth values in meters. Maybe FusionProcessor could bypass this initial conversion in that case, depending on the pixel buffer's OSType format, or even via a dedicated parameter for when it reads opaque binary buffers.