Open dlazares opened 1 year ago
Thanks for your interest.
We do something different from the existing visualizers. I left `lib_gart/model_utils.py` uncleaned in the release for sanity-check purposes; it is never used in our main code. As in Eq. 16 and Eq. 5, like many other dynamic Gaussian rendering methods, we compute color manually outside the rasterizer (https://github.com/JiahuiLei/GART/blob/16c11f8a5bb3ae249a9d04dc9d98c316e10f1126/lib_render/gauspl_renderer.py#L104C10-L104C10), where the view angle is expressed in each Gaussian's local coordinate frame. Most visualizers, however, are designed for static scenes, where using view angles in a global world coordinate frame is sufficient. That's why you observe a discrepancy.
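The difference can be illustrated with a minimal degree-1 spherical-harmonics evaluation. This is a sketch, not GART's actual code: the constants follow the standard 3DGS SH convention, while the coefficients, rotation, and view direction below are made-up illustration values. The same coefficients evaluated with a world-frame view direction (what a generic static viewer uses) versus the same ray expressed in the Gaussian's local frame generally give different colors:

```python
import numpy as np

# SH constants from the standard 3DGS convention (degrees 0 and 1)
C0 = 0.28209479177387814
C1 = 0.4886025119029199

def sh_to_color(sh, view_dir):
    """Evaluate a degree-1 SH color for one Gaussian.
    sh: (4, 3) coefficients (1 DC row + 3 degree-1 rows), view_dir: unit (3,)."""
    x, y, z = view_dir
    rgb = C0 * sh[0] - C1 * y * sh[1] + C1 * z * sh[2] - C1 * x * sh[3]
    return np.clip(rgb + 0.5, 0.0, 1.0)

# Made-up coefficients and local-frame rotation, for illustration only
sh = np.array([[ 0.2, -0.1,  0.3],
               [ 0.4,  0.0, -0.2],
               [-0.3,  0.5,  0.1],
               [ 0.1,  0.2, -0.4]])
theta = np.pi / 2
R = np.array([[1.0, 0.0, 0.0],                       # local frame rotated
              [0.0, np.cos(theta), -np.sin(theta)],  # 90 degrees about x
              [0.0, np.sin(theta),  np.cos(theta)]])

world_dir = np.array([0.0, 0.0, 1.0])  # viewing ray in the world frame
local_dir = R.T @ world_dir            # the same ray in the Gaussian's frame

color_static_viewer = sh_to_color(sh, world_dir)  # what a generic viewer shows
color_local_frame = sh_to_color(sh, local_dir)    # color with a local-frame angle
print(color_static_viewer, color_local_frame)
```

A static `.ply` viewer only ever computes the first variant, so any per-Gaussian local-frame rotation is ignored and the view-dependent appearance shifts.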
I used the `save_gauspl_ply` function in `lib_gart/model_utils.py` to export the Gaussians for the training view, but the results don't look the same as the exported renders and I'm trying to figure out why. Is it a bug in the save function, or something you do differently in the renderer?
Here's an example using the lab scene from the NeuMan dataset in InstantAvatar format.
It's not the best looking, but you can still see his facial details in the GIF; they seem to get lost somehow in the `.ply` export. Here's a copy of the `.ply` visualized in antimatter's web viewer.
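One way to make a static viewer match a single chosen view is to bake the already-evaluated per-view colors into the DC term before export, inverting the standard 3DGS mapping `color = C0 * dc + 0.5`. This is a workaround sketch, not something GART provides; `bake_dc_only` and the color values are hypothetical. View dependence is discarded, but that one view then renders identically in any viewer:

```python
import numpy as np

C0 = 0.28209479177387814  # degree-0 SH constant in the 3DGS convention

def bake_dc_only(view_colors):
    """Invert color = C0 * dc + 0.5 so a static viewer reproduces the given
    per-Gaussian RGB exactly (all higher-degree SH would be zeroed on export)."""
    return (np.asarray(view_colors) - 0.5) / C0

# Hypothetical per-Gaussian colors already evaluated in each local frame
colors = np.array([[0.8, 0.2, 0.1],
                   [0.5, 0.5, 0.5]])
dc = bake_dc_only(colors)
# Round-trip check: a DC-only evaluation recovers the baked colors
assert np.allclose(C0 * dc + 0.5, colors)
```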