Closed jimzou closed 5 years ago
Hi! Do you mean rendering depth frames to a texture similarly to how the color stream is rendered in provided Unity sample? Or you want to render 3D point cloud obtained from depth data?
I want to render depth frames to a texture!
We don't have ready-made code for this. The difference from rendering a color stream is that you cannot just take the raw data and copy it into a texture to get a color image. You need to transform each pixel from a 16-bit depth value in millimeters to a color (usually using some palette). You can find an example of such a transformation done on the CPU in the DepthImageVisualizer class. Note that you'll need to allow unsafe code in Player Settings to use that code in Unity scripts.
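To illustrate the idea, here is a minimal sketch of that per-pixel transformation. It is written in Python for clarity (the actual Unity code would be C#), and the function names, palette, and clipping range are illustrative assumptions, not part of any SDK:

```python
# Hypothetical sketch (not SDK code): mapping 16-bit depth samples in
# millimeters to RGB colors, analogous to what a CPU-side visualizer
# does before copying pixels into a texture.

MIN_DEPTH_MM = 500    # assumed near clipping distance
MAX_DEPTH_MM = 4000   # assumed far clipping distance

def depth_to_color(depth_mm: int) -> tuple:
    """Map one 16-bit depth sample (mm) to an RGB triple (0-255).

    Invalid samples (0) and out-of-range values are rendered black;
    valid depths are mapped onto a simple red-to-blue ramp.
    """
    if depth_mm == 0 or depth_mm < MIN_DEPTH_MM or depth_mm > MAX_DEPTH_MM:
        return (0, 0, 0)
    # Normalize depth into [0, 1] over the configured range.
    t = (depth_mm - MIN_DEPTH_MM) / (MAX_DEPTH_MM - MIN_DEPTH_MM)
    # Near pixels red, far pixels blue (one of many possible palettes).
    return (int(255 * (1 - t)), 0, int(255 * t))

def convert_frame(depth_frame):
    """Convert a flat list of uint16 depth samples to a list of RGB triples."""
    return [depth_to_color(d) for d in depth_frame]
```

The same logic ports directly to a C# loop over the raw frame buffer, or to a fragment shader where the normalization and palette lookup run on the GPU.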
However, if you're looking to implement this in production, then it's better to perform the depth-to-color transformation with a custom shader. I suppose you can use the Graphics.Blit method for this, with the depth data loaded into a source texture in R16 format and the custom shader assigned to a material passed to the method.
@jimzou Check out the updated Unity sample. I've implemented rendering of the depth stream using the shader-based approach outlined above. The most important parts are the new DepthStreamRenderer class and the DepthToColor shader.
Thank you very much for your work.
Thank you very much for your amazing work. Can you share an example of a depth image stream demo in Unity3D?