Open asalan570 opened 1 week ago
Maybe I don't have enough GPU memory?
Hello @asalan570,
Indeed, it's possible that 8GB of VRAM is not enough. As this is research code, it's not fully optimized and probably requires a bit more memory than what you have (I think 12GB should be enough). I'm sorry for that.
However, I see that you get an error during the mesh projection part, which is not supposed to take that much memory, so I'm a bit surprised... Let's investigate that!
Could you try rerunning the `extract_mesh.py` script with the option `--project_mesh_on_surface_points False`? This will skip the mesh projection and probably remove your error.
However, please note that the mesh projection greatly improves the quality of the mesh: it consists of reprojecting the vertices of the Poisson mesh onto the dense surface point cloud sampled from the Gaussians. This is very helpful and reduces the number of artifacts in the Poisson mesh.
If the problem comes from the mesh projection, then I will try to update the code and propose a mesh projection done "chunk by chunk" so that it uses less memory.
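For reference, the chunk-by-chunk projection could look something like the sketch below. This is not the actual SuGaR code, just a minimal NumPy illustration of the idea (the function name and chunk size are assumptions): snap each Poisson-mesh vertex to its nearest surface point, but only ever materialize one chunk of the vertex-to-point distance matrix at a time, which bounds peak memory.

```python
import numpy as np

def project_vertices_chunked(verts, surface_pts, chunk_size=10_000):
    """Snap each mesh vertex to its nearest surface point, one chunk at a time.

    Computing the full (n_verts x n_points) distance matrix at once is what
    blows up memory; chunking keeps the peak allocation bounded.
    """
    out = np.empty_like(verts)
    for start in range(0, len(verts), chunk_size):
        chunk = verts[start:start + chunk_size]
        # Squared distances for this chunk only: shape (chunk, n_points)
        d2 = ((chunk[:, None, :] - surface_pts[None, :, :]) ** 2).sum(-1)
        # Replace each vertex with its nearest surface point
        out[start:start + chunk_size] = surface_pts[d2.argmin(axis=1)]
    return out
```

The same pattern applies on GPU with `torch.cdist` over chunks instead of the full matrix.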
Thanks for your reply!
Just yesterday, when I tried to extract the mesh on another computer (RTX 4070, 12GB), I also got a Segmentation fault (core dumped).
I saw your reply this morning, tried the `--project_mesh_on_surface_points False` parameter, and successfully extracted the mesh!
Thank you again!
@Anttwo Hello, author!
I always encounter a Segmentation fault when I execute `train.py` or `train_full_pipeline.py`.
The `extract_mesh.py` script ran successfully with the `--project_mesh_on_surface_points False` parameter from your comments. However, only the `.ply` file is generated; the `.obj` file is not produced.
When I try to run `python extract_refined_mesh_with_texture.py -s /mnt/d/Project/res-gaussian/data/penhu/ -c /opt/project/SuGaR/output/vanilla_gs/penhu/ -m /opt/project/SuGaR/output/coarse/penhu/sugarcoarse_3Dgs7000_densityestim02_sdfnorm02/15000.pt`, I get this error:
```
Traceback (most recent call last):
  File "/opt/project/SuGaR/extract_refined_mesh_with_texture.py", line 45, in <module>
    extract_mesh_and_texture_from_refined_sugar(args)
  File "/opt/project/SuGaR/sugar_extractors/refined_mesh.py", line 32, in extract_mesh_and_texture_from_refined_sugar
    n_gaussians_per_surface_triangle = int(refined_model_path.split('/')[-2].split('_gaussperface')[-1])
ValueError: invalid literal for int() with base 10: 'sugarcoarse_3Dgs7000_densityestim02_sdfnorm02'
```
How do I use SuGaR to generate the `.obj` model correctly?
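For what it's worth, the traceback shows where the error comes from: the script infers the number of Gaussians per surface triangle from the checkpoint's parent directory name, which must end in `_gaussperface<N>`. The coarse-model path passed to `-m` has no such suffix, so `int()` fails. A small illustration (the "refined" path below is a hypothetical example of the expected naming pattern, not a real output of this run):

```python
# Reproduces the parsing expression from the traceback (refined_mesh.py, line 32):
def parse_gaussians_per_face(refined_model_path: str) -> int:
    # Take the parent directory name and read the integer after '_gaussperface'.
    return int(refined_model_path.split('/')[-2].split('_gaussperface')[-1])

# A coarse-model path has no '_gaussperface<N>' suffix, so int() raises ValueError:
coarse = ("/opt/project/SuGaR/output/coarse/penhu/"
          "sugarcoarse_3Dgs7000_densityestim02_sdfnorm02/15000.pt")
try:
    parse_gaussians_per_face(coarse)
except ValueError as e:
    print("fails:", e)

# A directory name ending in '_gaussperface<N>' (hypothetical example) parses fine:
refined = "/opt/project/SuGaR/output/refined/penhu/sugarfine_gaussperface1/15000.pt"
print(parse_gaussians_per_face(refined))  # -> 1
```

So the fix is to point `-m` at a refined checkpoint (whose directory name contains `_gaussperface<N>`), not at the coarse one.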
Hi, I always encounter a Segmentation fault when executing `train_full_pipeline.py` and `extract_mesh.py`.
WSL-Ubuntu 2204
i7-12700KF
32GB RAM
4060 8G
commands:
info: