Closed rishikanthc closed 1 year ago
Hi, thanks for opening this issue. Could you provide a minimal example to recreate the error?
I don’t have a lot of time this week but I will see if I can look into it.
I tested deepcopy and it works as expected. Could you please provide more information about your code to help resolve the issue?
import torch, torchhd, copy

class Model(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = torchhd.embeddings.Projection(3, 4)

model = Model()
print(model.emb.weight)
# Parameter containing:
# tensor([[ 0.5032,  0.0142, -0.8640],
#         [ 0.4262, -0.3184,  0.8467],
#         [ 0.7050, -0.3374,  0.6238],
#         [ 0.6886, -0.6895,  0.2245]])

model_copy = copy.deepcopy(model)
print(model_copy.emb.weight)
# Parameter containing:
# tensor([[ 0.5032,  0.0142, -0.8640],
#         [ 0.4262, -0.3184,  0.8467],
#         [ 0.7050, -0.3374,  0.6238],
#         [ 0.6886, -0.6895,  0.2245]])

model_copy.emb.weight[0, :] = 0
print(model_copy.emb.weight)
# Parameter containing:
# tensor([[ 0.0000,  0.0000,  0.0000],
#         [ 0.4262, -0.3184,  0.8467],
#         [ 0.7050, -0.3374,  0.6238],
#         [ 0.6886, -0.6895,  0.2245]])

print(model.emb.weight)
# Parameter containing:
# tensor([[ 0.5032,  0.0142, -0.8640],
#         [ 0.4262, -0.3184,  0.8467],
#         [ 0.7050, -0.3374,  0.6238],
#         [ 0.6886, -0.6895,  0.2245]])
I also tested saving and loading a model containing the random projection embedding; it works as expected as well:
In [21]: torch.save(model.state_dict(), "test.pk")
In [22]: state_dict = torch.load("test.pk")
In [23]: state_dict
Out[23]:
OrderedDict([('emb.weight',
tensor([[ 0.5032, 0.0142, -0.8640],
[ 0.4262, -0.3184, 0.8467],
[ 0.7050, -0.3374, 0.6238],
[ 0.6886, -0.6895, 0.2245]]))])
In [24]: model.load_state_dict(state_dict)
Out[24]: <All keys matched successfully>
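For completeness, the same round trip works without touching disk, e.g. through an in-memory buffer. A sketch, again with `nn.Linear` standing in for a torchhd-containing model:

```python
import io

import torch

model = torch.nn.Linear(3, 4)  # stand-in for a model with a torchhd embedding

# Serialize the state dict into an in-memory buffer instead of a file.
buffer = io.BytesIO()
torch.save(model.state_dict(), buffer)
buffer.seek(0)

# Load it back into a fresh module of the same shape.
restored = torch.nn.Linear(3, 4)
restored.load_state_dict(torch.load(buffer))
assert torch.equal(model.weight, restored.weight)
```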
Closed because of inactivity
Hi,
I have a piece of code where I'm saving the best model during training. The model includes the projection embedding from torchhd, and copying the model fails with:
RuntimeError: The default implementation of __deepcopy__() for non-wrapper subclasses only works for subclass types that implement new_empty() and for which that function returns another instance of the same subclass. You should either properly implement new_empty() for your subclass or override __deepcopy__() if it is intended behavior for new_empty() to return an instance of a different type.
This error also occurs with the other torchhd embeddings. I need this to work urgently and any help with fixing it would be appreciated.
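As a possible stopgap until the deepcopy issue itself is resolved: instead of `copy.deepcopy(model)`, snapshot only the state dict, which sidesteps `__deepcopy__` on the module's tensors entirely. A sketch, with `nn.Linear` as a stand-in for the torchhd model, and assuming `clone()` behaves normally for the embedding's tensor type:

```python
import torch

model = torch.nn.Linear(3, 4)  # stand-in for the model with a torchhd embedding

# Snapshot the best weights without deepcopying the module itself.
best_state = {k: v.detach().clone() for k, v in model.state_dict().items()}

# ... training continues and changes the weights ...
with torch.no_grad():
    model.weight.fill_(0)

# Restore the best snapshot later.
model.load_state_dict(best_state)
```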