Hi @mazzzystar, have you solved the problem? I think your solution is quite straightforward.
@csyanbin Hi, I solved the problem, but not based on his work. I created an `nn.Embedding` for `emb`, and my `z_q_x` comes from the nearest codebook vector to `z_e_x`, based on the code below:
```python
def find_nearest(zex, emb):
    """
    zex: flattened encoder output, shape (-1, self.z_dim)
    emb: codebook, shape (k, z_dim)
    """
    # index of the nearest codebook vector for each row of zex
    j = l2_dist(zex[:, None], emb[None, :]).sum(2).min(1)[1]
    # print("j_idx={}".format(j))
    return emb[j], j  # shapes: (1250, 64) and (1250,)
```
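`l2_dist` isn't shown above; assuming it just returns element-wise squared differences (which `.sum(2)` then reduces over the feature dimension), a minimal sketch of the helper and of how `find_nearest` might be called could look like this (the shapes are illustrative, not from the thread):

```python
import torch

# hypothetical helper assumed by find_nearest: element-wise squared differences,
# summed afterwards over the last (feature) dimension
def l2_dist(a, b):
    return (a - b) ** 2

# illustrative usage
z_e_x = torch.randn(1250, 64)        # flattened encoder output, (N, z_dim)
emb = torch.randn(512, 64)           # codebook, (k, z_dim)
z_q_x, j = find_nearest(z_e_x, emb)
print(z_q_x.shape, j.shape)          # torch.Size([1250, 64]) torch.Size([1250])
```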
Then I use a hook to pass the gradient from `z_q_x` back to the encoder.
Thanks, that seems good. Can I ask how to use the hook function to pass the gradient from `z_q_x` back to the encoder?
I wrote a tiny script with comments to show the general idea.
```python
# keep a reference to z_e_x so the hook can reach it later
org_h = z_e_x

# define the hook function: save the incoming gradient and the tensor it belongs to
def hook(grad):
    nonlocal org_h
    self.saved_grad = grad
    self.saved_h = org_h
    return grad

# register the hook on z_q_x
if z_q_x.requires_grad:
    z_q_x.register_hook(hook)

# define a method in model.py that backpropagates the saved gradient
# from z_q_x through z_e_x into the encoder
def bwd(self):
    self.saved_h.backward(self.saved_grad)

# backward the loss, then flush the saved gradient through the encoder
model.backward()
model.bwd()
```
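As a side note (not from the original code): the same gradient copy from `z_q_x` to `z_e_x` is often written without hooks, using the straight-through `detach()` trick; a minimal sketch with toy shapes:

```python
import torch

# toy stand-ins: z_e_x is the encoder output, z_q_x its nearest codebook vectors
z_e_x = torch.randn(8, 64, requires_grad=True)
z_q_x = torch.randn(8, 64)  # e.g. the output of find_nearest(), detached from z_e_x

# forward pass uses the values of z_q_x; backward pass sends the gradient
# straight to z_e_x, because the detached term contributes no gradient
z_q_x_st = z_e_x + (z_q_x - z_e_x).detach()

loss = z_q_x_st.pow(2).mean()   # stand-in for the reconstruction loss
loss.backward()
print(z_e_x.grad.shape)         # torch.Size([8, 64]) -- the gradient reached the encoder side
```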
Thanks so much! That is really helpful!
Hi, thanks for your implementation! I'm now trying to implement the audio experiments of VQ-VAE. But when I try to imitate your code, there is something I'm confused about: I omit the `nn.Embedding` for VQEmbedding. My code is:

So my `z_q_x` and `z_e_x` have the same dim, say `(1, 256, 16000)` (Batch, Dim, Length). But when I train the model by computing the `.grad`, this error happened:
It means my `z_q_x` does not have `grad`. Actually, because I do some quantization work, my `z_q_x` and `z_e_x` are `LongTensor`s. Is this the reason there is no grad?
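For reference, this is general PyTorch behaviour rather than anything specific to this repo: autograd only tracks floating-point tensors, so a `LongTensor` can never carry `.grad`. The integer indices stay non-differentiable; gradients can only flow through the float codebook vectors looked up with them, e.g.:

```python
import torch

idx = torch.tensor([3, 1, 4])        # quantization indices (LongTensor)
# idx.requires_grad_(True)           # would raise: only floating point Tensors can require gradients

emb = torch.randn(512, 64, requires_grad=True)
z_q_x = emb[idx]                     # float vectors; gradient can flow back into emb
z_q_x.sum().backward()
print(emb.grad is not None)          # True
```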