Hi, I am using this library to define a trainable weight in my nn.Module class. However, I am unable to add it directly as an nn.Parameter. Thus, I'm trying to register the core tensors as parameters in my module like this:
import torch
import torch.nn as nn
import tntorch as tn

# note: in/out renamed to in_features/out_features, since `in` is a reserved word in Python
self.weight = tn.randn(self.n, self.in_features, self.out_features, ranks_tucker=self.rank, device='cuda', requires_grad=True)

# re-wrap each Tucker core as an nn.Parameter and register it on the module
cores = []
for c_i, core in enumerate(self.weight.cores):
    core = nn.Parameter(core)
    self.register_parameter('tucker_core_{}'.format(c_i), core)
    cores.append(core)
self.weight.cores = cores

# same for the Tucker factor matrices
Us = []
for u_i, u in enumerate(self.weight.Us):
    u = nn.Parameter(u)
    self.register_parameter('tucker_Us_{}'.format(u_i), u)
    Us.append(u)
self.weight.Us = Us

# keep them in a ParameterList as well, so they are easy to hand to an optimizer
self.model_params = nn.ParameterList(cores + Us)
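For what it's worth, the registration itself seems to work; after this, the cores and factors do show up among the module's parameters:

print([name for name, _ in self.named_parameters()])
# ['tucker_core_0', 'tucker_core_1', ..., 'tucker_Us_0', ...]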
Then, I'm using the Adam optimizer in the default way:
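Roughly like this, where model, loader, and criterion are placeholders for my module, data loader, and loss function:

optimizer = torch.optim.Adam(model.parameters())  # default hyperparameters

for x, y in loader:
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()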
However, the network doesn't seem to learn anything: the weights don't update, which probably means the optimization isn't working.
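A minimal way to see this (same placeholder names as above; I snapshot one core before a step and compare it afterwards):

before = model.tucker_core_0.detach().clone()
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()
print(torch.equal(before, model.tucker_core_0.detach()))  # prints True, i.e. the core never changed

Do you have any ideas how to fix this? Many thanks for this nice repo!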