NVlabs / nvdiffrec

Official code for the CVPR 2022 (oral) paper "Extracting Triangular 3D Models, Materials, and Lighting From Images".

MLP texture is not supported for DLMesh #20

Closed wangjksjtu closed 2 years ago

wangjksjtu commented 2 years ago

Thank you for the amazing work!!

I tried to enable the MLP texture for DLMesh by changing this line https://github.com/NVlabs/nvdiffrec/blob/main/train.py#L632 to

mat = initial_guess_material(geometry, True, FLAGS)

However, when I run the job, it produces the following error:

iter=    0, img_loss=0.020718, reg_loss=0.000000, lr=0.00010, time=606.2 ms, rem=10.10 m
Traceback (most recent call last):
  File "/home/wangjk/programs/nvdiffrec/train.py", line 778, in <module>
    geometry, mat = optimize_mesh(glctx, geometry, mat, lgt, dataset_train, dataset_validate, FLAGS, pass_idx=0, pass_name="mesh_pass", 
  File "/home/wangjk/programs/nvdiffrec/train.py", line 542, in optimize_mesh
    total_loss.backward()
  File "/home/wangjk/anaconda3/envs/torch-ngp/lib/python3.9/site-packages/torch/_tensor.py", line 307, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
  File "/home/wangjk/anaconda3/envs/torch-ngp/lib/python3.9/site-packages/torch/autograd/__init__.py", line 154, in backward
    Variable._execution_engine.run_backward(
RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

Do you have any ideas about this? Really appreciate it ;)
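For context, this PyTorch error typically means a graph built once is backpropagated through in more than one iteration. A minimal sketch (not the nvdiffrec code, just an illustration of the failure mode and the usual remedy of recomputing the forward pass each step):

```python
import torch

# A tensor with a graph built once, outside the training loop.
w = torch.ones(3, requires_grad=True)
y = (w * 2).sum()

y.backward()               # first backward frees the graph's saved buffers
try:
    y.backward()           # second backward through the same graph fails
except RuntimeError as e:
    print("RuntimeError:", e)

# Recomputing the forward pass each iteration gives a fresh graph and works:
for _ in range(2):
    y = (w * 2).sum()
    y.backward()
```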

jmunkberg commented 2 years ago

See below

jmunkberg commented 2 years ago

Fixed in CL 564cad1e67ad78761870fbc0ccf3f4f1f5f31f86

Thanks for reporting this @wangjksjtu !

wangjksjtu commented 2 years ago

Thanks a lot for the prompt reply and quick fixes! @jmunkberg Really appreciate your great help ;)