NVlabs / tiny-cuda-nn

Lightning fast C++/CUDA neural network framework

How to compose multiple networks with a single optimizer? #186

Open rnithin1 opened 1 year ago

rnithin1 commented 1 year ago

Hi,

Apologies if I missed this in the code. I have a question about how to use TCNN to train multiple networks/encodings end-to-end with a single optimizer that uses different learning rates per component. In my architecture, I have two hash encodings that produce feature vectors which should be composed together somehow (by addition or multiplication), an MLP that maps the composed feature vector to an intermediate feature, and a second MLP that takes this feature together with extra inputs and outputs a 3D vector. In pseudocode, it would look like the following:

enc_x = ingp_encoding_1(x)  # hash encoding of x (x is an N-dim vector)
enc_y = ingp_encoding_2(y)  # second, independent hash encoding of y
feat_xy = mlp_1(enc_x * enc_y)  # compose the two feature vectors (elementwise product here)
rgb = mlp_2(composite_encoding(feat_xy, a, b, c, d))  # fuse with extra inputs a..d, output a 3D vector
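Concretely, here is a rough sketch of what I mean using the tinycudann PyTorch extension. The encoding/network configs are just illustrative placeholders, and I'm standing in for composite_encoding with plain concatenation:

import torch
import tinycudann as tcnn

# Illustrative configs only, not tuned values.
hash_cfg = {"otype": "HashGrid", "n_levels": 16, "n_features_per_level": 2,
            "log2_hashmap_size": 19, "base_resolution": 16, "per_level_scale": 2.0}
mlp_cfg = {"otype": "FullyFusedMLP", "activation": "ReLU",
           "output_activation": "None", "n_neurons": 64, "n_hidden_layers": 2}

ingp_encoding_1 = tcnn.Encoding(n_input_dims=3, encoding_config=hash_cfg)
ingp_encoding_2 = tcnn.Encoding(n_input_dims=3, encoding_config=hash_cfg)
mlp_1 = tcnn.Network(n_input_dims=ingp_encoding_1.n_output_dims, n_output_dims=16,
                     network_config=mlp_cfg)
mlp_2 = tcnn.Network(n_input_dims=16 + 4,  # feat_xy plus the extras a, b, c, d
                     n_output_dims=3, network_config=mlp_cfg)

def forward(x, y, extras):
    enc_x = ingp_encoding_1(x)
    enc_y = ingp_encoding_2(y)
    feat_xy = mlp_1(enc_x * enc_y)
    # composite_encoding stood in by concatenation, for illustration.
    return mlp_2(torch.cat([feat_xy, extras.to(feat_xy.dtype)], dim=-1))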

In PyTorch, one could train this end-to-end by passing several parameter groups to a single torch.optim.Adam, each with its own learning rate, as in the snippet below. Is it possible to do something similar in native TCNN by composing all these pieces into one DifferentiableObject and then creating a Trainer with a single optimizer?
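For completeness, this is the PyTorch-side pattern I mean; it works here because the tinycudann modules above expose .parameters() like any torch.nn.Module (the learning rates and the data source 'loader' are placeholders):

optimizer = torch.optim.Adam([
    {"params": ingp_encoding_1.parameters(), "lr": 1e-2},  # hash grids often use a larger lr
    {"params": ingp_encoding_2.parameters(), "lr": 1e-2},
    {"params": mlp_1.parameters(), "lr": 1e-3},
    {"params": mlp_2.parameters(), "lr": 1e-3},
])

for x, y, extras, target in loader:  # 'loader' is a placeholder data source
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(forward(x, y, extras).float(), target)
    loss.backward()
    optimizer.step()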

kzhang2 commented 1 year ago

bumping for visibility!

lvmingzhe commented 1 year ago

Hi, have you solved this problem, or found an alternative?