zju3dv / EasyVolcap

[SIGGRAPH Asia 2023 (Technical Communications)] EasyVolcap: Accelerating Neural Volumetric Video Research

vram question #29

Closed strix214 closed 6 months ago

strix214 commented 6 months ago

Hello, I am currently using an RTX 2060 Super with 8 GB of VRAM and encountering a GPU out-of-memory issue when running the demo inference. Is there a solution? If not, which graphics card with 12 GB or 16 GB of VRAM would you recommend for development on EasyVolcap?

dendenxu commented 6 months ago

Hi @prettycrazy, thanks for using our code! Could you provide the exact demo script you used?

The EasyVolcap framework itself has no specific VRAM requirement; usage depends on the particular algorithm you want to run. For example, training an NGP model (configs beginning with `l3mhet`) shouldn't take more than 5 GB, and training a standard-resolution 4K4D model shouldn't take more than 6 GB either.

One possible solution for the OOM is to lower the rendered image size:

- For offline rendering, set `val_dataloader_cfg.dataset_cfg.ratio` (e.g. append `val_dataloader_cfg.dataset_cfg.ratio=0.1` to your command).
- For the GUI, set `viewer_cfg.render_ratio` (e.g. append `viewer_cfg.render_ratio=0.15` to your command).
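To see why lowering the ratio helps so much: the ratio scales both image dimensions, so per-frame buffer memory falls roughly quadratically with it. A minimal back-of-the-envelope sketch (the float32 RGBA buffer here is an illustrative assumption, not EasyVolcap's actual allocation scheme):

```python
def buffer_mb(width, height, ratio, channels=4, bytes_per_channel=4):
    """Approximate size in MiB of one float32 RGBA framebuffer at a given render ratio."""
    w, h = int(width * ratio), int(height * ratio)
    return w * h * channels * bytes_per_channel / 2**20

# A single 4K frame at full resolution vs. ratio=0.15:
full = buffer_mb(3840, 2160, 1.0)    # ~126.6 MiB per buffer
small = buffer_mb(3840, 2160, 0.15)  # ~2.8 MiB per buffer
```

Intermediate activations and ray batches scale with pixel count in the same way, which is why even a modest ratio reduction can bring a run back under an 8 GB budget.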