nerfstudio-project / nerfacc

A General NeRF Acceleration Toolbox in PyTorch.
https://www.nerfacc.com/

proposal net estimator - low reconstruction quality #199

Closed: yyashpatel closed this issue 1 year ago

yyashpatel commented 1 year ago

I was testing the proposal network estimator to train a NeRF, and the reconstruction quality is not good.

What could be the reason? When I trained the same data using nerfstudio, it performed well.

I was just curious whether there is any implementation difference.

Thanks

liruilong940607 commented 1 year ago

There is no major difference that we know of. One thing is that we got rid of the annealing because we didn’t find it useful, and it is not quite compatible with our implementation.

Actually, how were you using it? Using the script in this repo, or porting it into another repo?

yyashpatel commented 1 year ago

I am using the Python APIs in my code, not the script provided directly in the nerfacc repo.

liruilong940607 commented 1 year ago

Is there any more information you can provide, like how exactly you are using it in your code? For your reference, there is another example where we plug the prop net into TiNeuVox's code and that works fine:

https://github.com/liruilong940607/tineuvox/blob/a4feb27444d86c0ced5952930f7a41d5345d2b59/lib/tineuvox.py#L563
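For context, here is a minimal sketch of the usual pattern for plugging the estimator into a training step, loosely following the prop-net example in this repo's examples folder. The import path assumes a recent nerfacc release, and the `TinyDensityField` class and every numeric value below are placeholders for illustration, not the code under discussion:

```python
import torch
from nerfacc import PropNetEstimator


class TinyDensityField(torch.nn.Module):
    """Placeholder proposal density field (a small MLP stands in for a hash grid)."""

    def __init__(self):
        super().__init__()
        self.mlp = torch.nn.Sequential(
            torch.nn.Linear(3, 64), torch.nn.ReLU(), torch.nn.Linear(64, 1)
        )

    def forward(self, x):
        # Non-negative density per sample point.
        return torch.nn.functional.softplus(self.mlp(x)).squeeze(-1)


proposal_networks = [TinyDensityField(), TinyDensityField()]
prop_optimizer = torch.optim.Adam(
    [p for net in proposal_networks for p in net.parameters()], lr=1e-2
)
estimator = PropNetEstimator(prop_optimizer)


def sample_along_rays(rays_o, rays_d, step):
    n_rays = rays_o.shape[0]

    def prop_sigma_fn(t_starts, t_ends, net):
        # Query proposal densities at the midpoints of the sample intervals.
        t_mid = ((t_starts + t_ends) / 2.0)[..., None]
        positions = rays_o[:, None, :] + rays_d[:, None, :] * t_mid
        return net(positions)

    requires_grad = step % 16 == 0  # backprop through the proposals only every n steps
    t_starts, t_ends = estimator.sampling(
        prop_sigma_fns=[
            lambda t0, t1, net=net: prop_sigma_fn(t0, t1, net)
            for net in proposal_networks
        ],
        prop_samples=[128, 96],  # samples per proposal level (placeholder values)
        num_samples=48,          # samples for the final radiance field (placeholder)
        n_rays=n_rays,
        near_plane=0.2,
        far_plane=1e3,
        sampling_type="lindisp",
        stratified=True,
        requires_grad=requires_grad,
    )
    # After rendering, the per-sample transmittance is fed back so the proposal
    # networks learn to match the final sample distribution, e.g.:
    # estimator.update_every_n_steps(trans, requires_grad, loss_scaler=1024)
    return t_starts, t_ends
```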

yyashpatel commented 1 year ago

Thanks for the reference. Based on it, and on the code provided in the examples folder of this nerfacc repo, I have a similar implementation.

On one dataset it works well, but on another dataset it produces bad quality, so I don't know what exactly is going wrong.

I have attached the bad results below -

[attached renders: 000004, 000009]

Are there any hyperparameters that need to be tuned, or could it be a data capture issue?

liruilong940607 commented 1 year ago

Are those two images both with nerfacc’s proposal? The first one seems to be much worse than the second one.

One thing I can share: from my experiments, the proposal approach seems to be quite sensitive to the network you set as the proposal network. Your first image reminds me of some previous failure experiments, which I resolved by tuning the hyperparameters of the proposal network (the sketch below shows the kind of settings I mean).

Are you using tinycudann as the underlying network, or your own implementation?
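By hyperparameters I mean things like the capacity and resolution of the proposal field itself, not only the sample counts. As a rough illustration (an assumed tinycudann config with placeholder values, not a recommended setting):

```python
import tinycudann as tcnn

# Illustrative proposal density network: a coarser, smaller hash grid than the
# main radiance field. All values are placeholders to show which knobs tend to
# matter, not tuned settings.
proposal_net = tcnn.NetworkWithInputEncoding(
    n_input_dims=3,   # xyz
    n_output_dims=1,  # density only
    encoding_config={
        "otype": "HashGrid",
        "n_levels": 5,              # fewer levels than the main field
        "n_features_per_level": 2,
        "log2_hashmap_size": 17,    # smaller hash table
        "base_resolution": 16,
        "per_level_scale": 1.68,    # caps the max resolution around 128
    },
    network_config={
        "otype": "FullyFusedMLP",
        "activation": "ReLU",
        "output_activation": "None",
        "n_neurons": 64,
        "n_hidden_layers": 1,
    },
)
```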

liruilong940607 commented 1 year ago

Also, when you say you were training the same data with nerfstudio, does that mean you were using the nerfacto model? And when you use nerfacc in your own repo, do you still use the nerfacto model?

yyashpatel commented 1 year ago

1) Yes, the outputs are both from nerfacc's proposal network, and I am using tinycudann as the underlying network. By hyperparameter tuning of the proposal network, do you mean the number of proposal samples?

2) When training with nerfstudio, yes, I use the nerfacto model. When I use nerfacc in my repo, I use the network provided in the nerfacc repo. The only differences I see in the network are the appearance embeddings, which are present in nerfacto, and the scene contraction modification.

yyashpatel commented 1 year ago

Hi, changing the parameters of the proposal network and also the number of samples gives considerably better results (the knobs involved are roughly the ones sketched below).
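For anyone who lands here later, the kinds of knobs involved look roughly like this (the values are illustrative placeholders, not my actual settings):

```python
# Illustrative only: the parameters that typically affect proposal quality.
prop_samples = [256, 96]   # samples drawn at each proposal level
num_samples = 64           # samples for the final radiance field
# plus the capacity/resolution of the proposal hash grid itself (see above)
```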

I am closing the issue for now. Thanks again!