graphdeco-inria / gaussian-splatting

Original reference implementation of "3D Gaussian Splatting for Real-Time Radiance Field Rendering"
https://repo-sam.inria.fr/fungraph/3d-gaussian-splatting/

Changed gpu, now training slow(er) #872

Open DuVogel87 opened 4 months ago

DuVogel87 commented 4 months ago

Hello. I changed my GPU: I had a GTX 1070 Ti and now I am on an RTX 2080 Ti. When I run `python convert.py`, everything seems pretty normal speed-wise; I feel a boost in time there. But when it comes to training (`python train.py`), the process is veeeery slow, even slower than before on the GTX 1070 Ti. Have other people experienced this issue as well, and is there a workaround? Do I have to re-install everything? I have no idea!

For the record: when I use Jawset Postshot, I have no issues and everything runs faster compared to the old GPU.

thanks in advance!

jaco001 commented 4 months ago

The GTX 1070 Ti and RTX 2080 Ti are different architectures (Pascal vs. Turing). The CUDA extensions were likely compiled for the old card, so it's better to recompile the project's CUDA submodules for the proper compute capability.
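A sketch of what a rebuild could look like, assuming the standard repo layout with the `diff-gaussian-rasterization` and `simple-knn` submodules and a conda environment named `gaussian_splatting` (your environment name may differ). `TORCH_CUDA_ARCH_LIST` is the PyTorch extension-build variable that selects the target compute capability; `7.5` is Turing (RTX 2080 Ti):

```shell
# Activate the project environment (name assumed from the repo's README).
conda activate gaussian_splatting

# Target Turing (sm_75) for the RTX 2080 Ti. For a Pascal GTX 1070 Ti
# this would have been 6.1 instead.
export TORCH_CUDA_ARCH_LIST="7.5"

# Force a clean rebuild of the two CUDA submodules so no stale
# Pascal-era binaries are reused.
pip uninstall -y diff-gaussian-rasterization simple-knn
pip install --no-build-isolation submodules/diff-gaussian-rasterization
pip install --no-build-isolation submodules/simple-knn
```

You can also sanity-check what PyTorch sees with `python -c "import torch; print(torch.cuda.get_device_capability())"`, which should report `(7, 5)` on a 2080 Ti.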