Hello, I evaluated the model on ps_female_3 using the pretrained model from here, but got this error:
Traceback (most recent call last):
File "render.py", line 194, in main
test(config)
File "render.py", line 84, in test
scene.load_checkpoint(load_ckpt)
File "/home/human/codes/liyiheng/codes/3dgs-avatar-release/scene/__init__.py", line 86, in load_checkpoint
self.converter.optimizer.load_state_dict(converter_opt_sd)
File "/home/human/anaconda3/envs/3dgs-avatar/lib/python3.7/site-packages/torch/optim/optimizer.py", line 166, in load_state_dict
raise ValueError("loaded state dict has a different number of "
ValueError: loaded state dict has a different number of parameter groups
I found that the optimizer defined here has an opt_params of length 6, but the converter_opt_sd['param_groups'] of the pretrained model has length 7. I don't know why there is a difference.
Is there any guidance on using the pretrained model?
Thanks for reporting this. I deleted some params when cleaning up the code, so there's some misalignment. Simply comment out the lines that load the optimizer/scheduler state; they are not used at test time.
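For anyone hitting the same error, here is a minimal, torch-free sketch of the parameter-group count check that torch.optim.Optimizer.load_state_dict performs internally, which is what raises the ValueError. The function and variable names below are illustrative, not the actual repo code:

```python
# Sketch of the check inside torch.optim.Optimizer.load_state_dict
# that produces "loaded state dict has a different number of
# parameter groups". Names here are illustrative only.

def load_state_dict(current_param_groups, checkpoint_state):
    """Mimic torch's param-group count check on checkpoint load."""
    saved_groups = checkpoint_state["param_groups"]
    if len(current_param_groups) != len(saved_groups):
        raise ValueError(
            "loaded state dict has a different number of parameter groups"
        )
    return True

# The optimizer in the release code defines 6 param groups,
# while the pretrained checkpoint was saved with 7:
current = [{"params": [i]} for i in range(6)]
checkpoint = {"param_groups": [{"params": [i]} for i in range(7)]}

try:
    load_state_dict(current, checkpoint)
except ValueError as e:
    print(e)  # the mismatch triggers the ValueError seen in the traceback
```

Since the optimizer state is only needed to resume training, skipping (or commenting out) the `self.converter.optimizer.load_state_dict(converter_opt_sd)` call in `load_checkpoint` avoids the mismatch entirely when rendering.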