mrousavy opened 8 months ago
Hey 👋 I have an updated depth implementation in a Vision Camera fork so I can add this here.
One important API consideration though: the depth map returned from the LiDAR or TrueDepth camera on iOS is just a `CVPixelBuffer` - there is no `CMSampleBuffer` container. I've tried to create one, but it never succeeds because there is no compatible format description for a `CMSampleBuffer` with the depth pixel format. This means the depth data can't be wrapped in a `Frame` HostObject itself; instead I just have a `depthMap` property on the native `Frame` class and report a `hasDepthMap` flag on the `Frame` HostObject. Maybe there is a better approach here, e.g. an `ImageBuffer` HostObject that wraps the `CVPixelBuffer` and exposes all of its properties, similar to how `Frame` wraps a `CMSampleBuffer` - wdyt?
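As a rough sketch of what the JS-facing side of that `ImageBuffer` idea could look like (all of these names are hypothetical proposals from this thread, not existing VisionCamera API):

```typescript
// Hypothetical shapes only: an ImageBuffer-style HostObject wrapping
// a CVPixelBuffer, plus the depthMap/hasDepthMap accessors discussed
// above. None of this exists in VisionCamera today.
interface ImageBuffer {
  readonly width: number;
  readonly height: number;
  readonly pixelFormat: string; // e.g. a depth format like 'kCVPixelFormatType_DepthFloat16'
}

interface FrameWithDepth {
  readonly hasDepthMap: boolean;
  // Only populated when the device actually streams depth data
  readonly depthMap?: ImageBuffer;
}

// Example consumer: always guard on hasDepthMap before touching depthMap
function describeDepth(frame: FrameWithDepth): string {
  if (!frame.hasDepthMap || frame.depthMap == null) {
    return 'no depth data';
  }
  const { width, height } = frame.depthMap;
  return `depth map ${width}x${height}`;
}
```

The upside of a separate `ImageBuffer` HostObject is that JS code gets the same lazy, zero-copy access pattern for the depth buffer that `Frame` already provides for the color buffer.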
Yup, good idea - a minimal Frame instance that just holds pixels. I can reshape the API for that
any updates on this?
Hey @thomas-coldwell could you link your updated fork? 🙇
@thomas-coldwell A link to the updated fork would be great :)
Hi @mrousavy, what is the status of this? Is this on the roadmap yet?
Love your work on this repo!
Hey - thanks!
nope, not on the roadmap. I'd build this in consultancy, but not in my free time
What feature or enhancement are you suggesting?
It would be great to have depth data streaming support in VisionCamera Frame Processor plugins. Something like this:
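(The original code snippet is not preserved in this thread. As a sketch of the requested API - property names like `hasDepthMap` and `depthMap` are proposals from this discussion, not existing VisionCamera API - a frame processor consuming depth might look like:)

```typescript
// Sketch only: these types mirror the proposed depth API from this
// thread. VisionCamera's real Frame does not expose depth today.
type DepthMap = { readonly width: number; readonly height: number };
type Frame = {
  readonly width: number;
  readonly height: number;
  readonly hasDepthMap: boolean;
  readonly depthMap?: DepthMap;
};

// In a component this would be wrapped in useFrameProcessor(...)
// and run as a worklet for every camera frame.
function processFrame(frame: Frame): string {
  'worklet';
  if (frame.hasDepthMap && frame.depthMap != null) {
    // e.g. hand the depth buffer off to a native plugin here
    return `frame ${frame.width}x${frame.height} with depth ${frame.depthMap.width}x${frame.depthMap.height}`;
  }
  return `frame ${frame.width}x${frame.height} without depth`;
}
```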
...assuming the `CameraDevice` supports depth data streaming (`supportsDepthData`).

What Platforms would this feature/enhancement affect?
iOS, Android
Alternatives/Workarounds
Currently I need to create a custom fork of VisionCamera to add depth data streaming and synchronization (`AVCaptureSynchronizedDepthData`) to it.

Also, @thomas-coldwell created a public PR for this here: https://github.com/mrousavy/react-native-vision-camera/pull/745
...but it is quite old already and would need an update.
On Android, there is no such thing as depth data streaming I think.
Additional information