CUHK-AIM-Group / EndoGaussian

EndoGaussian: Real-time Gaussian Splatting for Dynamic Endoscopic Scene Reconstruction
https://yifliu3.github.io/EndoGaussian/
MIT License

About the number of rendered images #19

Closed darthandvader closed 3 months ago

darthandvader commented 4 months ago

Hi, I found that the number of rendered images is based on the camera views from COLMAP, but it does not match the number of training images. I wonder whether you generated a camera view for each frame or did a train/test split somewhere. Thank you!

yifliu3 commented 4 months ago

Hi, thanks for your interest. The number of rendered images is based on the training/testing split. We follow EndoSurf to obtain the dataset split; for example, we divide the pulling and cutting datasets into an 8:1 training/testing split. You can find the detailed split strategy in the EndoNeRF_Dataset function of scene/endo_loader.py.
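
For intuition, here is a minimal sketch of what an interleaved 8:1 split can look like, assuming one test frame is held out for every 8 training frames; the frame count and the exact indices are hypothetical, and the authoritative logic is the EndoNeRF_Dataset function in scene/endo_loader.py.

```python
# Hypothetical 8:1 interleaved split: every 9th frame is held out for testing.
# Illustration only; EndoGaussian's real split lives in scene/endo_loader.py.
def split_frames(num_frames, train_per_test=8):
    train_ids, test_ids = [], []
    for i in range(num_frames):
        # every (train_per_test + 1)-th frame becomes a test view
        (test_ids if i % (train_per_test + 1) == 0 else train_ids).append(i)
    return train_ids, test_ids

train_ids, test_ids = split_frames(90)   # 90 frames is a made-up example count
print(len(train_ids), len(test_ids))     # -> 80 10, i.e. an 8:1 ratio
```

This is why the number of rendered (test) images is smaller than the number of training images: rendering is done on the held-out views, not on every frame.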

darthandvader commented 4 months ago

Thanks for your reply! I'm also trying to figure out whether you've ever tried converting the final SHs back to RGB. Since the final SH tensor is n×16×3 and the SH2RGB function is linear, applying it directly would just give 16×3 values per Gaussian rather than a single color.

yifliu3 commented 4 months ago

Hi, the final SH tensor (n×16×3) actually stores features (a combination of base colors and SH coefficients), so it cannot be directly converted to RGB. Its main role is to participate in the rendering process to achieve view-dependent color modeling (render.py). If you want the pure RGB of the 3D Gaussians, you can take the first 3 channels of the features (n×3) and then convert them to color.
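
A minimal sketch of that last step, assuming the vanilla 3DGS feature layout (the 0th SH band in channel 0 of the n×16×3 tensor) and the usual SH2RGB convention rgb = sh * C0 + 0.5 with C0 ≈ 0.2821; the array here is random stand-in data, not EndoGaussian's actual Gaussians.

```python
import numpy as np

C0 = 0.28209479177387814  # 0th-order SH basis constant used in the 3DGS convention

def SH2RGB(sh):
    # inverse of RGB2SH(rgb) = (rgb - 0.5) / C0
    return sh * C0 + 0.5

features = np.random.randn(1000, 16, 3)   # stand-in for the final features (n x 16 x 3)
dc = features[:, 0, :]                    # first 3 channels = DC (0th-order) band -> (n, 3)
rgb = np.clip(SH2RGB(dc), 0.0, 1.0)       # view-independent base color per Gaussian
print(rgb.shape)                          # (1000, 3)
```

The higher-order coefficients (channels 1..15) only add view-dependent variation on top of this base color, which is why dropping them still yields a reasonable per-Gaussian RGB.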