Closed: smandava98 closed this issue 1 year ago
I have a similar question: I am looking to embed a renderer for these outputs in a real-time engine (C++/DX12).
Would you be able to outline how the data is stored in the .ply file, please?
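For reference, here is a minimal sketch of inspecting the stored per-Gaussian attributes with plyfile. The property names below are those written by the reference implementation's point_cloud.ply export at the time of writing; treat them as an assumption rather than a spec, since other forks or versions may differ:

```python
# Sketch, not an official spec: read the per-Gaussian attributes from the
# training output's point_cloud.ply. Property names follow the reference
# implementation's export and are an assumption for other forks/versions.
import numpy as np
from plyfile import PlyData

vertex = PlyData.read("point_cloud.ply")["vertex"]

xyz     = np.stack([vertex["x"], vertex["y"], vertex["z"]], axis=-1)       # Gaussian centers
f_dc    = np.stack([vertex["f_dc_0"], vertex["f_dc_1"], vertex["f_dc_2"]],
                   axis=-1)                                                 # DC spherical-harmonics terms (one per RGB channel)
opacity = np.asarray(vertex["opacity"])                                     # pre-sigmoid opacity logits
scales  = np.stack([vertex[f"scale_{i}"] for i in range(3)], axis=-1)       # log-space axis scales
rots    = np.stack([vertex[f"rot_{i}"] for i in range(4)], axis=-1)         # unnormalized rotation quaternions
# Higher-order SH coefficients are stored as f_rest_0 ... f_rest_N.

print(xyz.shape, f_dc.shape, opacity.shape, scales.shape, rots.shape)
```

In that layout, each vertex record is one Gaussian: position, an (unused) normal, spherical-harmonics color coefficients, an opacity logit, log-space scales, and a rotation quaternion; a renderer needs to apply the corresponding activations (exp for the scales, sigmoid for the opacity, normalization for the quaternion) before splatting.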
Hi,
1) Sorry, we can't support Mac systems or specific cloud-computing setups.
2) The point cloud should resemble the scene, at least in close proximity to the cameras. The further away you get, the more the geometry will be off, since the system is free to optimize in any way it wants as long as the result looks good; the geometry close to the cameras should therefore be fine. Colors are a different issue; we can look into exporting a version of the point cloud with colors, which should be easy enough (a rough do-it-yourself version is sketched after this list).
3) The cameras need not be centered around an object, though it does work best if that is the case.
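In the meantime, a rough sketch of doing that color export yourself; this is not the authors' exporter, it assumes the reference export's attribute names, and it uses only the DC spherical-harmonics term, so the colors are view-independent approximations:

```python
# Rough sketch (not the official exporter): write a colored point cloud that
# Meshlab can display, approximating each Gaussian's color from its DC
# spherical-harmonics coefficients. Attribute names are an assumption based on
# the reference implementation's point_cloud.ply layout.
import numpy as np
from plyfile import PlyData, PlyElement

C0 = 0.28209479177387814  # zeroth-order SH basis constant, 1 / (2 * sqrt(pi))

src = PlyData.read("point_cloud.ply")["vertex"]
xyz  = np.stack([src["x"], src["y"], src["z"]], axis=-1)
f_dc = np.stack([src["f_dc_0"], src["f_dc_1"], src["f_dc_2"]], axis=-1)

rgb  = np.clip(0.5 + C0 * f_dc, 0.0, 1.0)      # view-independent base color
rgb8 = (rgb * 255).astype(np.uint8)

out = np.empty(len(xyz), dtype=[("x", "f4"), ("y", "f4"), ("z", "f4"),
                                ("red", "u1"), ("green", "u1"), ("blue", "u1")])
out["x"], out["y"], out["z"] = xyz.T
out["red"], out["green"], out["blue"] = rgb8.T
PlyData([PlyElement.describe(out, "vertex")]).write("point_cloud_colored.ply")
```

The training renders also include higher-order SH and per-view effects, so these colors will only roughly match what you see in the viewer.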
Hi, I implemented a Gaussian Splatting viewer on nerfstudio; maybe you can use it on your macOS machine. Here is the repository: https://github.com/yzslab/nerfstudio/tree/gaussian_splatting
You need to install the nerfstudio dependencies first, then plyfile==0.8.1, submodules/diff-gaussian-rasterization and submodules/simple-knn. Then, by running python nerfstudio/scripts/gaussian_splatting/run_viewer.py --model-path GAUSSIAN_TRAINING_OUTPUT_DIR, you can use nerfstudio's web-based viewer to view your scene.
It has not been well tested yet, it is not as fast as SIBR_viewer, and only rendering is supported (no training).
Preview:
@yzslab very cool!!!! I just want to give a pointer to the new Nerfstudio Template https://github.com/nerfstudio-project/nerfstudio-method-template (edit: link to official repo), which can perhaps provide an affordance for re-using much of Nerfstudio without actually forking the whole repo. Looking at your diff, it strikes me that the template API would be enough for this interop, with perhaps some small nerfstudio changes to accommodate the GaussianSplattingCamera camera.
Thanks for your recommendation. I will try it later.
Hi @yzslab,
we have now discussed this internally; thank you for your interest in our work. But please note that the Gaussian splatting code cannot be distributed under an Apache license. This should be clarified in the LICENSE file of your fork, stating clearly that the files in the "submodule" directory are subject to a different license; please include our license and a clear pointer to the original repo.
The initial page of your repo should clearly state this (at the top of the page) so there is no misunderstanding.
Hi, sorry about this, I will fix it soon.
Hi. I have a couple of questions about rendering.
Firstly, I'm working on a cloud GPU that I SSH into, and my local machine is a MacBook Pro. I'm a bit confused about how to set things up so I can use SIBR locally, since my cloud GPU does not have a display; and looking at the SIBR repo, it is only available on Windows and Linux, so I can't run it on my MacBook.
Secondly, I viewed the image renderings in my output folder and they look fine. But when I import the point_cloud.ply file as a mesh in Meshlab, the result does not resemble a 3D representation of those images at all. Is there a better way to get a 3D point cloud representation of my scene along with color and texture?
One note: the video I took is not particularly centered around an object; it covers an entire scene, and the camera moves around slightly (enough to render new views laterally, but not enough that the entire scene can be represented). Please let me know if this is an issue.