google-research / jax3d

Apache License 2.0

MobileNeRF Inference on server side GPU #185

Open DWhettam opened 1 year ago

DWhettam commented 1 year ago

Hi,

I'm able to do inference successfully with MobileNeRF, and I'm now trying to benchmark its performance on a few different devices. However, many of these are remote servers to which I only have terminal access. I've set up an SSH tunnel so I can access the interface remotely, but the code runs on my local machine's GPU, not the GPU of the server I'm trying to benchmark. Is there any way I can do inference on the server side, either by running inference outside the browser, or by running WebGL on the server instead of the client?
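For context, the setup described might look something like the following. The port number, hostname, and the headless-Chromium route are illustrative assumptions, not something confirmed in this thread; the `--use-gl=egl` flag is a commonly used way to get hardware-accelerated WebGL in headless Chromium on a server with no display attached.

```shell
# SSH tunnel: forward the MobileNeRF viewer's local port to the remote server
# (port 8000 and the hostname are hypothetical placeholders).
ssh -L 8000:localhost:8000 user@remote-server

# On the server itself, one possible route to run WebGL on the *server's* GPU
# instead of the client's: headless Chromium with the EGL backend, so the page
# (and its WebGL context) is rendered server-side rather than in a local browser.
google-chrome --headless --use-gl=egl --no-sandbox \
    "http://localhost:8000/viewer.html"
```

Note that with a plain SSH tunnel the browser still runs locally, so the WebGL context is created on the client GPU; only a server-side browser (headless or under Xvfb/VirtualGL) would exercise the server's GPU.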

Thanks!