wanmeihuali / taichi_3d_gaussian_splatting

An unofficial implementation of the paper "3D Gaussian Splatting for Real-Time Radiance Field Rendering" in Taichi Lang.
Apache License 2.0

Visualization #116

Closed: hardikdava closed this issue 11 months ago

hardikdava commented 1 year ago

Thank you for your amazing work. I have 2 questions regarding visualization.

wanmeihuali commented 1 year ago

A web visualizer based on Vulkan is planned for the future, but it is not available yet. I'm still trying to implement the radix sort in Taichi, but I've run into some problems with int64 support. The parquet file is just a pandas table, and the columns are straightforward: you can take the x, y, z, and alpha columns to build any point cloud file. However, most of the points in the point cloud are almost transparent. I've tried to build a mesh from the point cloud, but the result is not very good. The Marching Cubes method might still work, but it requires a lot more code.
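For reference, here is a minimal sketch of pulling the point cloud out of the parquet file with pandas and writing it to an ASCII PLY. The file path and the exact column names (x, y, z, alpha) are assumptions based on the description above, so check the columns of your own output file first.

```python
import pandas as pd

# Hypothetical path; use the parquet produced by your training run.
df = pd.read_parquet("output/scene.parquet")

# Column names assumed from the discussion above.
xyz = df[["x", "y", "z"]].to_numpy()
alpha = df["alpha"].to_numpy()

# Write a simple ASCII PLY, keeping alpha as an extra per-vertex property.
with open("scene_points.ply", "w") as f:
    f.write("ply\nformat ascii 1.0\n")
    f.write(f"element vertex {len(xyz)}\n")
    f.write("property float x\nproperty float y\nproperty float z\n")
    f.write("property float alpha\n")
    f.write("end_header\n")
    for (x, y, z), a in zip(xyz, alpha):
        f.write(f"{x} {y} {z} {a}\n")
```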

hardikdava commented 1 year ago

@wanmeihuali Thanks for your reply. I was able to extract the point cloud from the .parquet file. Is there any way to get a textured mesh output, e.g. by converting the feature vectors to meaningful RGB values?

wanmeihuali commented 1 year ago

@hardikdava Currently, there isn't an efficient way to obtain a textured mesh output. The difficulties stem from two aspects:

You might have noticed that the alpha value in the parquet represents the opacity of each point. Algorithms like the one used here, akin to NeRF, assume all 3D objects are semi-transparent and render them through volume rendering. The issue arising from this is that, unlike traditional point clouds where points are distributed on the surface of objects, the point cloud generated by this algorithm contains numerous near-transparent points distributed near the surface of objects. The color stacking of these points accurately portrays the surface of the object. Conventional point-cloud-to-mesh algorithms struggle with this scenario, often not taking into account the opacity of the points at all. My previous experiments in this area haven't been very successful. You might need a customized point-cloud-to-mesh conversion algorithm.
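If you want to try a conventional reconstruction anyway, one workaround is to drop the nearly transparent points first and then hand the rest to an off-the-shelf method such as Poisson reconstruction. Below is a rough sketch using Open3D; the file path, column names, and the alpha threshold are assumptions, and as noted above the result may still be poor because the meshing step ignores opacity.

```python
import open3d as o3d
import pandas as pd

df = pd.read_parquet("output/scene.parquet")  # hypothetical path

# Keep only reasonably opaque points; 0.5 is an arbitrary threshold to tune.
mask = df["alpha"].to_numpy() > 0.5
xyz = df.loc[mask, ["x", "y", "z"]].to_numpy()

pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(xyz)
pcd.estimate_normals(
    search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=0.1, max_nn=30)
)

# Poisson reconstruction does not use opacity at all, which is exactly the
# limitation described above; treat the output as a rough starting point.
mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(pcd, depth=9)
o3d.io.write_triangle_mesh("scene_mesh.ply", mesh)
```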

Even if you manage to extract a mesh from this point cloud, obtaining the corresponding colors remains imprecise. This is because the current algorithm employs spherical harmonic functions to represent colors, meaning that the color of the same point changes depending on the camera angle. For approximate color values, I would suggest taking the r_sh0, g_sh0, and b_sh0 columns, applying the sigmoid to each, and then mapping the result to an integer between 0 and 255. This color estimate isn't entirely accurate, but it shouldn't be too far off.


$$(r', g', b') = \bigl(\operatorname{sigmoid}(r_{\text{sh0}}),\ \operatorname{sigmoid}(g_{\text{sh0}}),\ \operatorname{sigmoid}(b_{\text{sh0}})\bigr) \times 255$$
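A small sketch of that formula in Python, assuming the same r_sh0, g_sh0, b_sh0 column names and a hypothetical parquet path:

```python
import numpy as np
import pandas as pd

df = pd.read_parquet("output/scene.parquet")  # hypothetical path

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Approximate, view-independent color from the zeroth-order SH coefficients.
rgb = np.stack(
    [sigmoid(df["r_sh0"].to_numpy()),
     sigmoid(df["g_sh0"].to_numpy()),
     sigmoid(df["b_sh0"].to_numpy())],
    axis=1,
)
rgb_u8 = (rgb * 255).round().astype(np.uint8)
```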
hardikdava commented 11 months ago

Is it possible to use this WebGL-based viewer for visualizing the data? https://antimatter15.com/splat/

wanmeihuali commented 11 months ago

@hardikdava It should be possible, but you might need to modify the color computation by adding a sigmoid after the SH function.
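As a rough illustration of the difference for the zeroth-order (DC) term only: the official 3D Gaussian Splatting exporters (which viewers like the one linked above assume, to my understanding) compute a clamped `0.5 + C0 * sh0`, whereas this repository applies a sigmoid to the SH output. A hedged sketch, not this project's actual rendering code:

```python
import numpy as np

SH_C0 = 0.28209479177387814  # zeroth-order SH basis constant

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Convention assumed for the reference 3DGS exporters / .splat viewers.
def color_reference(sh0):
    return np.clip(0.5 + SH_C0 * sh0, 0.0, 1.0)

# Convention used here according to the comment above: sigmoid after SH.
def color_this_repo(sh0):
    return sigmoid(sh0)

# To use an external viewer you would either bake color_this_repo() into the
# exported RGB values, or patch the viewer's color formula accordingly.
```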