self.pts_linears = nn.ModuleList(
    [nn.Linear(input_ch, W)] +
    [nn.Linear(W, W) if i not in self.skips
     else nn.Linear(W + input_ch, W)
     for i in range(D - 1)])
Why range(D-1) instead of range(D)? The paper uses 8 fully-connected ReLU layers, so why does the loop build only 7?
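For what it's worth, the comprehension only accounts for D-1 of the layers; the explicit nn.Linear(input_ch, W) in front of it is the first layer, so the ModuleList ends up holding D layers in total. Here is a minimal sketch that counts them, assuming the repo's default hyperparameters (D=8, W=256, skips=[4]); input_ch=63 is an assumed placeholder (a 3D position after 10-frequency positional encoding):

import torch.nn as nn

# Assumed defaults: depth D=8, width W=256, skip connection at layer 4;
# input_ch=63 is an assumption (position after positional encoding).
D, W, input_ch, skips = 8, 256, 63, [4]

pts_linears = nn.ModuleList(
    [nn.Linear(input_ch, W)] +          # layer 0, built explicitly
    [nn.Linear(W, W) if i not in skips
     else nn.Linear(W + input_ch, W)    # skip layer re-concatenates the input
     for i in range(D - 1)])            # layers 1 .. D-1 from the loop

print(len(pts_linears))  # -> 8, i.e. 1 + (D - 1) = D layers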