creiser / kilonerf

Code for KiloNeRF: Speeding up Neural Radiance Fields with Thousands of Tiny MLPs

Using a multi-network for the distillation procedure #16

Open lukeju opened 2 years ago

lukeju commented 2 years ago

Hi! Firstly, I would like to thank you for your outstanding work on speeding up NeRF.

I have trained a multi-network in the pretraining procedure (consisting of 27 middle-sized MLPs). I would like to know whether I can use this model as the pretrained teacher in your code during the distillation procedure.

creiser commented 2 years ago

Hi :) You trained a multi-network consisting of 27 middle-sized MLPs from scratch and want to use these for bootstrapping a multi-network with a higher number of small MLPs? Sounds like a cool idea. The current code only supports distillation from a single network to a multi-network, but it should not be hard to adapt it to suit your use case: multi-network -> multi-network.
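For illustration, a minimal sketch of what the teacher query could look like when the teacher is itself a multi-network: each sample point is routed to the middle-sized MLP that owns its grid cell, and the gathered outputs serve as the distillation targets. All names below (`query_multi_network_teacher`, the MLP call signature, the bounding-box/grid arguments) are hypothetical placeholders, not the actual KiloNeRF API.

```python
import torch

def assign_points_to_cells(points, bbox_min, bbox_max, grid_res):
    """Map each 3D point to the index of the grid cell (and hence the
    teacher MLP) responsible for it. grid_res is a length-3 int tensor."""
    normalized = (points - bbox_min) / (bbox_max - bbox_min)  # in [0, 1]
    cell_coords = (normalized * grid_res).long()
    cell_coords = torch.minimum(cell_coords, grid_res - 1)    # clamp boundary points
    # Flatten 3D cell coordinates into a single linear index per point.
    return (cell_coords[:, 0] * grid_res[1] + cell_coords[:, 1]) * grid_res[2] + cell_coords[:, 2]

def query_multi_network_teacher(teacher_mlps, points, dirs,
                                bbox_min, bbox_max, grid_res):
    """Evaluate a multi-network teacher: route every sample point to its
    responsible middle-sized MLP and gather the outputs back in order.
    Each placeholder MLP is assumed to map (positions, directions) -> RGB + density."""
    cell_idx = assign_points_to_cells(points, bbox_min, bbox_max, grid_res)
    outputs = torch.empty(points.shape[0], 4, device=points.device)
    for i, mlp in enumerate(teacher_mlps):
        mask = cell_idx == i
        if mask.any():
            outputs[mask] = mlp(points[mask], dirs[mask])
    return outputs
```

In the distillation loop, the call that currently queries the single pretrained network for target colors/densities would be swapped for a routed query like the one above, with the 27-cell grid of the middle-sized teacher MLPs; the rest of the distillation to the fine-grained student grid should stay unchanged.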

lukeju commented 2 years ago

Okay, thank you for your reply. I'm adapting the code now.