nerfstudio-project / nerfstudio

A collaboration friendly studio for NeRFs
https://docs.nerf.studio
Apache License 2.0

Adding a New Method: Different (bad) Results than nerfacto while trying to inherit from nerfacto #3238

Closed aalolexx closed 1 week ago

aalolexx commented 1 week ago

Hello Everybody!

As part of a university project, I am trying to implement a new method for nerfstudio. The new method should be able to generate NeRFs from binarized silhouette images faster, but that's not relevant for this issue. Unfortunately, I am already facing problems at the first step, which is setting up a new custom method.

I followed the "Adding a New Method" guide in the nerfstudio documentation and created a new method based on the provided template. I made everything inherit from the nerfacto model, just as the provided template does.
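For reference, the inheritance set-up from the template looks roughly like this (a simplified sketch; the Custom* names match my config below):

from dataclasses import dataclass, field
from typing import Type

from nerfstudio.models.nerfacto import NerfactoModel, NerfactoModelConfig


@dataclass
class CustomModelConfig(NerfactoModelConfig):
    """Config for the custom model; anything not overridden falls back to nerfacto's defaults."""

    _target: Type = field(default_factory=lambda: CustomModel)


class CustomModel(NerfactoModel):
    """Inherits nerfacto's fields, samplers, and losses unchanged for now."""

    config: CustomModelConfig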

Unfortunately, when training the model, the results look essentially random, like one big fractal of noise:

One of the Input Images:

The results of the custom method:

The results of the original nerfacto method with the same training dataset:

Since the model class etc. inherits from nerfacto, I am assuming the error is somewhere in the model configuration. So here is my custom model configuration:

# Imports for the nerfstudio pieces used below; the Custom* configs come from
# this method's own package.
from nerfstudio.configs.base_config import ViewerConfig
from nerfstudio.engine.optimizers import AdamOptimizerConfig, RAdamOptimizerConfig
from nerfstudio.engine.schedulers import ExponentialDecaySchedulerConfig
from nerfstudio.engine.trainer import TrainerConfig
from nerfstudio.plugins.types import MethodSpecification

alex_silhouette_model = MethodSpecification(
    config=TrainerConfig(
        method_name="alex-silhouette-model",
        steps_per_eval_batch=500,
        steps_per_save=2000,
        max_num_iterations=30000,
        mixed_precision=True,
        pipeline=CustomPipelineConfig(
            datamanager=CustomDataManagerConfig(
                dataparser=CustomDataParserConfig(),
                train_num_rays_per_batch=4096,
                eval_num_rays_per_batch=4096,
            ),
            model=CustomModelConfig(
                eval_num_rays_per_chunk=1 << 15,
            ),
        ),
        optimizers={
            "proposal_networks": {
                "optimizer": AdamOptimizerConfig(lr=1e-2, eps=1e-15),
                "scheduler": ExponentialDecaySchedulerConfig(lr_final=0.0001, max_steps=200000),
            },
            "fields": {
                "optimizer": RAdamOptimizerConfig(lr=1e-2, eps=1e-15),
                "scheduler": ExponentialDecaySchedulerConfig(lr_final=1e-4, max_steps=50000),
            },
            "camera_opt": {
                "optimizer": AdamOptimizerConfig(lr=1e-3, eps=1e-15),
                "scheduler": ExponentialDecaySchedulerConfig(lr_final=1e-4, max_steps=5000),
            },
        },
        viewer=ViewerConfig(num_rays_per_chunk=1 << 15),
        vis="viewer",
    ),
    description="Custom description",
)

The training data is the hook dataset from D-NeRF. I binarized the images to fit my needs, but the camera positions etc. are copied exactly.
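The binarization was essentially per-image thresholding; a sketch of what that kind of preprocessing looks like (illustrative only, with made-up threshold and paths):

from pathlib import Path

import numpy as np
from PIL import Image


def binarize(path: Path, threshold: int = 10) -> None:
    """Overwrite an image with its black/white silhouette."""
    gray = np.array(Image.open(path).convert("L"))
    silhouette = ((gray > threshold) * 255).astype(np.uint8)
    Image.fromarray(silhouette).save(path)


# Only the renders are touched; transforms.json (the camera poses) stays as-is.
for image_path in Path("data/dnerf/hook/train").glob("*.png"):
    binarize(image_path)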

So my questions are:

Thanks in advance.

maturk commented 1 week ago

Hi, one question to start off: I made the template repo a long time ago, and I am wondering if it still works. Basically, does the template repo train properly on any of the default sample datasets provided in nerfstudio (the poster scene or other easy examples)?

aalolexx commented 1 week ago

Hi @maturk, thanks for your quick response and for providing the template! Well, it works in the sense that it doesn't throw any runtime errors, but the results are as shown in my issue. So maybe some newer configuration options are missing?
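A quick way to spot such differences would be to diff the config dataclasses field by field (a sketch; the CustomModelConfig import path is specific to my package):

from dataclasses import fields

from nerfstudio.models.nerfacto import NerfactoModelConfig

from my_method.custom_config import CustomModelConfig  # hypothetical module path


def diff_configs(custom, reference) -> None:
    """Print every field where the custom config disagrees with the reference."""
    for f in fields(reference):
        if not hasattr(custom, f.name):
            print(f"{f.name}: missing from custom config")
        elif getattr(custom, f.name) != getattr(reference, f.name):
            print(f"{f.name}: custom={getattr(custom, f.name)!r} "
                  f"nerfacto={getattr(reference, f.name)!r}")


diff_configs(CustomModelConfig(), NerfactoModelConfig())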

aalolexx commented 1 week ago

I have found the issue. The problem is that the custom config file provided in the template was missing the average_init_density=0.01 parameter. After setting it to 0.01, the model produces the same results as nerfacto.
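In code, the fix amounts to adding one field to the template's model config (sketch; everything except average_init_density is as in the template):

from dataclasses import dataclass, field
from typing import Type

from nerfstudio.models.nerfacto import NerfactoModelConfig


@dataclass
class CustomModelConfig(NerfactoModelConfig):
    _target: Type = field(default_factory=lambda: CustomModel)  # CustomModel as before
    # The parameter the template was missing. Nerfacto's own config sets 0.01;
    # without it the model trained with a different initial density and
    # produced the noisy renders shown above.
    average_init_density: float = 0.01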

maturk commented 1 week ago

> I have found the issue. The problem is that the custom config file provided in the template was missing the average_init_density=0.01 parameter. After setting it to 0.01, the model produces the same results as nerfacto.

@aalolexx, thanks for finding the issue. Would you like to make a pull request in the template repository and add this missing average_init_density=0.01 parameter to the config? It would indeed be very good for the template to work out of the box... and not cause other users any annoying issues. Let me know if you are able to make a PR for it and I can merge it.

Thanks, maturk

aalolexx commented 1 week ago

Hi @maturk. Sure, I added a PR to the repo here

maturk commented 1 week ago

Great @aalolexx, thanks for this :)