yihua7 / SC-GS

[CVPR 2024] Code for SC-GS: Sparse-Controlled Gaussian Splatting for Editable Dynamic Scenes
https://yihua7.github.io/SC-GS-web/
MIT License

Question for paper #2

Closed: SYSUykLin closed this issue 6 months ago

SYSUykLin commented 6 months ago

Hello: Great paper. I have two questions. 1) In Sec. 4.2, Dynamic Scene Rendering, you say:

We derive the dense motion field of Gaussians using linear blend skinning (LBS) [37] by locally interpolating the transformations of their neighboring control points.

What I understand is that the current number of Gaussians cannot support training the keypoints, so we need to generate more points via LBS for training. Is that right?

2) Another paper, "EditableNeRF: Editing Topologically Varying Neural Radiance Fields by Key Points", is similar to yours. Is there any difference at the method level?

Thanks

yihua7 commented 6 months ago

Thank you for your interest in our work.

  1. I am not completely clear on your statement. The sentence simply means that LBS is introduced as a way to determine the deformation of the dense Gaussians from the sparse control points: each Gaussian's motion is interpolated from the transformations of its neighboring control points, rather than new points being generated for training.

  2. Our paper aims to synthesize high-quality novel views of dynamic scenes while also enabling motion manipulation through an ARAP deformation strategy and LBS. Gaussian splatting is employed as the 3D representation to ensure fast, high-quality rendering. Once the user selects control points and drags them, motion editing can be achieved in real time. EditableNeRF is an impressive piece of work that had not caught my attention before; thank you for pointing it out. While it appears to focus on editing only and lacks real-time interactive editing, it is nonetheless a remarkable achievement that we plan to cite in our paper!
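To make the LBS idea in point 1 concrete, here is a minimal NumPy sketch of blending sparse control-point transforms into a dense motion field. The Gaussian-RBF weighting over k nearest control points and the per-control-point rigid transform parameterization are illustrative assumptions, not the paper's exact (learned) formulation:

```python
import numpy as np

def lbs_deform(points, ctrl_pts, ctrl_R, ctrl_t, sigma=0.5, k=4):
    """Deform dense points by locally blending the rigid transforms
    (rotation ctrl_R[j], translation ctrl_t[j]) of their k nearest
    control points. Weights are an assumed Gaussian RBF of distance;
    in the paper these quantities are learned, not hand-set."""
    deformed = np.empty_like(points)
    for i, p in enumerate(points):
        d = np.linalg.norm(ctrl_pts - p, axis=1)    # distances to all control points
        nn = np.argsort(d)[:k]                      # indices of k nearest control points
        w = np.exp(-d[nn] ** 2 / (2 * sigma ** 2))  # RBF weights (assumption)
        w /= w.sum()                                # normalize to a partition of unity
        acc = np.zeros(3)
        for j, wj in zip(nn, w):
            # apply control point j's rigid transform to p, then blend
            acc += wj * (ctrl_R[j] @ (p - ctrl_pts[j]) + ctrl_pts[j] + ctrl_t[j])
        deformed[i] = acc
    return deformed
```

Note that the input points are the existing Gaussian centers; LBS only assigns them motion, it does not create additional points.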

SYSUykLin commented 6 months ago

Thanks for your reply. I see.