NVlabs / tiny-cuda-nn

Lightning fast C++/CUDA neural network framework

[Question]: can tiny-cuda-nn build a network with layer's bias=0? #424

Closed zyc-bit closed 5 months ago

zyc-bit commented 6 months ago

Hello everyone, I have a question I'd like to ask. If you can reply, I would be very grateful.

In pytorch, I can do as follows:

self.mlp = torch.nn.Sequential(
    layer1,
    torch.nn.ReLU(inplace=True),
    layer2,
    torch.nn.ReLU(inplace=True),
    layer3,
)
if bias_enable:
    torch.nn.init.constant_(self.mlp[-1].bias, 0)

which sets the mlp[-1] layer's bias to 0 using torch.nn.init.constant_().
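For reference, here is a minimal self-contained version of the pattern above; the layer shapes (3 → 64 → 64 → 1) are placeholders, since the original snippet leaves layer1..layer3 undefined:

```python
import torch

# Placeholder layer sizes, just to make the snippet runnable.
mlp = torch.nn.Sequential(
    torch.nn.Linear(3, 64),
    torch.nn.ReLU(inplace=True),
    torch.nn.Linear(64, 64),
    torch.nn.ReLU(inplace=True),
    torch.nn.Linear(64, 1),
)

# Zero the last layer's bias, as in the question.
torch.nn.init.constant_(mlp[-1].bias, 0)
print(mlp[-1].bias.detach().tolist())  # → [0.0]
```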

Can tiny-cuda-nn do the same thing? How?

btw, I build network using tiny-cuda-nn as follows:

network_config = {
    "otype": "CutlassMLP",
    "activation": "ReLU",
    "output_activation": "Sigmoid",
    "n_neurons": layer_width,
    "n_hidden_layers": num_layers - 1,
}

self.tcnn_encoding = tcnn.Network(
    n_input_dims=in_dim,
    n_output_dims=out_dim,
    network_config=network_config,
)
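One point of context that may matter here: to my understanding, tiny-cuda-nn's MLPs (including CutlassMLP) store all parameters as a single flat vector of weight matrices and do not carry bias terms at all, so there is no per-layer `.bias` attribute to initialize. A rough weights-only sketch of the layout implied by the config above (this ignores any internal width padding tiny-cuda-nn may apply, and `mlp_param_count` is a hypothetical helper, not a tiny-cuda-nn API):

```python
# Sketch: weights-only parameter count of a bias-free MLP, mirroring the
# CutlassMLP config above. Assumption: one weight matrix per layer, no biases,
# no internal padding (tiny-cuda-nn's real layout may pad widths).

def mlp_param_count(in_dim, out_dim, layer_width, n_hidden_layers):
    """Sum of weight-matrix sizes: input layer, hidden layers, output layer."""
    counts = [in_dim * layer_width]                                # input -> first hidden
    counts += [layer_width * layer_width] * (n_hidden_layers - 1)  # hidden -> hidden
    counts += [layer_width * out_dim]                              # last hidden -> output
    return sum(counts)

# e.g. in_dim=3, out_dim=1, layer_width=64, 2 hidden layers:
# 3*64 + 64*64 + 64*1 = 4352
print(mlp_param_count(3, 1, 64, 2))  # → 4352
```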