How do I train the forward rendering network, and is it necessary to retrain it on my own dataset?
The rendering network in model/rendnet.py has associated weights in the .npy files of the renderer directory.
The training code for that part is not in the repository; it was done separately, and the process is quite standard (a simple pix2pix-like image-to-image translation) using the images in our dataset.
If the intent was to couple the training of the main inverse modules with that of the renderer, then this would become trickier (although likely interesting).
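For reference, a pix2pix-like training loop of the kind described above might look roughly as follows. This is a minimal hypothetical sketch, not the repository's actual training code: the network architecture, channel counts, and data shapes are illustrative assumptions, and a real setup would typically add an adversarial discriminator term alongside the L1 loss.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the renderer in model/rendnet.py: maps per-pixel
# input maps (channel count assumed here) to a rendered RGB image.
class TinyRenderNet(nn.Module):
    def __init__(self, in_ch=9, out_ch=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, out_ch, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def train_step(model, opt, maps, target):
    # pix2pix-style reconstruction term (L1); a full pix2pix setup would
    # also include a conditional GAN loss from a discriminator.
    opt.zero_grad()
    loss = nn.functional.l1_loss(model(maps), target)
    loss.backward()
    opt.step()
    return loss.item()

model = TinyRenderNet()
opt = torch.optim.Adam(model.parameters(), lr=2e-4)
maps = torch.rand(4, 9, 64, 64)    # placeholder for your dataset's input maps
target = torch.rand(4, 3, 64, 64)  # placeholder for the corresponding renders
losses = [train_step(model, opt, maps, target) for _ in range(5)]
```

After training, the learned weights would be saved and converted to the `.npy` format that the `renderer` directory expects.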