@rebecajohn, related to this? https://github.com/peract/peract/issues/24
If I use `persistent=False`, I get the following error:

```
agents/peract_bc/qattention_peract_bc_agent.py", line 668, in load_weights
    b = merged_state_dict['_voxelizer._ones_max_coords'].shape[0]
KeyError: '_voxelizer._ones_max_coords'
```
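For context, that part is standard PyTorch behaviour rather than anything PerAct-specific: a buffer registered with `persistent=False` is excluded from `state_dict()`, so `load_weights` can no longer look it up in the merged checkpoint. A minimal sketch of the effect (the `Voxelizer` class here is a hypothetical stand-in, not the real peract voxelizer):

```python
import torch
import torch.nn as nn

class Voxelizer(nn.Module):
    """Hypothetical stand-in; only the buffer registration matters for this demo."""
    def __init__(self, persistent: bool):
        super().__init__()
        # With persistent=False the buffer is left out of state_dict(), so a
        # checkpoint saved that way has no '_ones_max_coords' entry at all.
        self.register_buffer("_ones_max_coords",
                             torch.ones(1, 3),
                             persistent=persistent)

print("_ones_max_coords" in Voxelizer(persistent=True).state_dict())   # True
print("_ones_max_coords" in Voxelizer(persistent=False).state_dict())  # False -> later lookup raises KeyError
```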
And with the other setting I instead get size mismatches when loading the checkpoint:

```
/agents/peract_bc/qattention_peract_bc_agent.py", line 674, in load_weights
    self._q.load_state_dict(merged_state_dict)
  File "/usr/local/lib/python3.8/dist-packages/torch/nn/modules/module.py", line 2189, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for QFunction:
    size mismatch for _qnet.input_preprocess.conv3d.weight: copying a param with shape torch.Size([64, 10, 1, 1, 1]) from checkpoint, the shape in current model is torch.Size([64, 128, 1, 1, 1]).
    size mismatch for _qnet.patchify.conv3d.weight: copying a param with shape torch.Size([64, 64, 5, 5, 5]) from checkpoint, the shape in current model is torch.Size([64, 128, 5, 5, 5]).
    size mismatch for _qnet.up0.conv_up.2.conv3d.weight: copying a param with shape torch.Size([64, 64, 5, 5, 5]) from checkpoint, the shape in current model is torch.Size([64, 128, 5, 5, 5]).
    size mismatch for _qnet.trans_decoder.conv3d.weight: copying a param with shape torch.Size([1, 64, 3, 3, 3]) from checkpoint, the shape in current model is torch.Size([64, 128, 3, 3, 3]).
    size mismatch for _qnet.trans_decoder.conv3d.bias: copying a param with shape torch.Size([1]) from checkpoint, the shape in current model is torch.Size([64]).
QMutex: destroying locked mutex
```
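Those size mismatches look like the checkpoint and the instantiated QFunction were built with different channel configurations (e.g. 10 vs. 128 input channels on `_qnet.input_preprocess`), which is independent of the `persistent` flag. A small, hedged diagnostic for listing exactly which keys and shapes disagree before calling `load_state_dict` (the checkpoint path and the model variable in the usage comment are assumptions, not the real file layout):

```python
import torch
import torch.nn as nn

def diff_state_dicts(checkpoint_path: str, model: nn.Module) -> None:
    """Report shape mismatches and missing keys between a saved state_dict and a model."""
    ckpt_state = torch.load(checkpoint_path, map_location="cpu")
    model_state = model.state_dict()

    # Parameters present on both sides but with different shapes.
    for name, ckpt_tensor in ckpt_state.items():
        if name in model_state and model_state[name].shape != ckpt_tensor.shape:
            print(f"{name}: checkpoint {tuple(ckpt_tensor.shape)} "
                  f"vs model {tuple(model_state[name].shape)}")

    # Keys on one side only (non-persistent buffers end up here).
    print("only in checkpoint:", sorted(ckpt_state.keys() - model_state.keys()))
    print("only in model:", sorted(model_state.keys() - ckpt_state.keys()))

# Usage (hypothetical path; `q_function` would be the QFunction instance the agent builds):
# diff_state_dicts("path/to/checkpoint.pt", q_function)
```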