Closed — asbeg closed this issue 3 years ago
By default the code expects one GPU to train (and evaluate) on, so it won't work if you don't have a GPU. https://github.com/kwea123/nerf_pl/blob/19a290103fd8df211a85a150daff861b53d59942/train.py#L173 If you really want to train without a GPU (not at all recommended), please look at the PyTorch Lightning documentation on how to train on CPU; it should be as easy as specifying a different argument.
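As a minimal sketch of what "specify another argument" could look like: the linked line passes `hparams.num_gpus` to `Trainer`, and the helper below (hypothetical, not part of the repo) shows how such a flag might map onto the `Trainer` keyword arguments in both the older `gpus=` API and the newer `accelerator=`/`devices=` API. Which form applies depends on the PyTorch Lightning version you have installed.

```python
def trainer_device_kwargs(num_gpus: int, new_api: bool = False) -> dict:
    """Hypothetical helper: map a --num_gpus flag to Trainer kwargs.

    Older PyTorch Lightning: Trainer(gpus=0) falls back to CPU training.
    Newer releases: Trainer(accelerator="cpu") for CPU, or
    Trainer(accelerator="gpu", devices=N) for N GPUs.
    """
    if new_api:
        if num_gpus == 0:
            return {"accelerator": "cpu"}  # CPU-only: very slow for NeRF
        return {"accelerator": "gpu", "devices": num_gpus}
    # Older API: gpus=0 means no GPU, i.e. train on CPU
    return {"gpus": num_gpus}


# e.g. Trainer(**trainer_device_kwargs(0)) would request CPU training
```

This is only a sketch of the idea; in practice you would adapt `train.py` so the value it forwards to `Trainer` matches the API of your installed Lightning version.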
Hello, I have followed your example to train NeRF on my own data. I have seen that you and others have had some success with single-object scenes, but I get this error.
I'm using macOS on an M1 chip. Does anyone have any idea what this might be related to? I can't figure out what's going on. Can you explain what the problem might be and how to fix it? Thank you in advance for any help.
I am trying to run