theialab / banf

BANF: Band-limited Neural Fields for Levels of Detail Reconstruction

Interpolation is simply for upsampling MLP result? #2

Open DennisInTw opened 1 month ago

DennisInTw commented 1 month ago

Hello, thanks for your great work. I have two questions about the interpolation operation; could you please provide more information about it?

Q1. Is the interpolation only for upsampling the MLP output? Why can it propagate features? "...then the convolution above amounts to typical linear interpolation that is used to propagate features within voxels..."

Q2. Why didn't you use the interpolation function from PyTorch? Instead, you implemented your own interpolation functions.

Thanks.

Ahanio commented 1 month ago

Hello @DennisInTw,

Thank you for your questions and for taking an interest in our work.

Q1: Interpolation is used to make the output signal continuous from discrete samples, allowing the model to query any point in space, not just at discrete positions. In terms of feature propagation, this means you can store features (such as color, density, or neural network features) at the grid nodes, and by using an interpolation kernel, you can obtain feature values continuously in space.
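To illustrate the idea of propagating node features into a continuous field, here is a minimal sketch (illustrative names only, not code from the BANF repository) of linearly interpolating features stored at 1D grid nodes so any continuous coordinate can be queried:

```python
import numpy as np

def interp_features(grid_feats, x):
    """Linearly interpolate per-node features at continuous coords x in [0, N-1].

    grid_feats: (N, C) array of features stored at the N grid nodes.
    x: continuous query positions; values between nodes blend the
       two neighboring node features with linear weights.
    """
    grid_feats = np.asarray(grid_feats, dtype=float)
    x = np.asarray(x, dtype=float)
    i0 = np.clip(np.floor(x).astype(int), 0, len(grid_feats) - 2)
    t = (x - i0)[..., None]  # fractional offset inside the cell
    return (1 - t) * grid_feats[i0] + t * grid_feats[i0 + 1]

feats = np.array([[0.0], [10.0], [20.0]])  # features at nodes 0, 1, 2
print(interp_features(feats, [0.5, 1.25]))  # -> [[5.0], [12.5]]
```

The same blending generalizes to 2D/3D (bilinear/trilinear), and the features can be colors, densities, or intermediate neural-network features.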

Q2: We did use linear interpolation for the 3D reconstruction problem, as seen here. For lower-dimensionality problems, we implemented the interpolation functions ourselves. Our implementations are already efficient in 2D or 1D, and we also explored different interpolation kernels not available in PyTorch, such as the sinc kernel.
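As a sketch of what a custom kernel enables (illustrative code, not from the repository), the following evaluates a signal at continuous positions using a sinc kernel, the ideal band-limited interpolator, which `torch.nn.functional.interpolate` does not provide:

```python
import numpy as np

def sinc_interp(samples, x):
    """Whittaker-Shannon interpolation of unit-spaced samples
    at continuous positions x, using the normalized sinc kernel."""
    samples = np.asarray(samples, dtype=float)
    n = np.arange(len(samples))
    # np.sinc(t) = sin(pi*t) / (pi*t); weight each sample by its kernel value
    weights = np.sinc(np.asarray(x, dtype=float)[..., None] - n)
    return weights @ samples

sig = np.array([0.0, 1.0, 0.0, -1.0])
print(sinc_interp(sig, [1.0]))  # at a sample position, recovers the sample exactly
```

Swapping the kernel (linear, cubic, sinc, ...) in a formulation like this is exactly the flexibility that motivates a custom implementation for 1D/2D problems.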

Let us know if you have any additional questions.

Best regards, Ahan