kelvinguu / neural-editor

Repository for "Generating Sentences by Editing Prototypes"

Weird output of the edit encoder #20

Open grll opened 6 years ago

grll commented 6 years ago

I spotted something strange in edit_model/edit_encoder.py, in the seq_batch_noise function at line 62:

new_values[:, 0, :] = phint * m_expand + prand * (1 - m_expand)

This basically returns a noisy version of only one vector (the first one), while all the other vectors are left at 0, instead of noising every vector as specified in the docstring. The zeros then propagate to the input of the attention decoder, so the attention layers over the insert and delete embeddings effectively only use the first insert or delete token's information.
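A minimal NumPy sketch of the indexing issue (shapes and variable names here are hypothetical stand-ins for the tensors in seq_batch_noise, not the repo's actual code):

```python
import numpy as np

# Hypothetical shapes: batch of 2 sequences, 3 time steps, 4-dim vectors.
batch, seq_len, dim = 2, 3, 4
rng = np.random.default_rng(0)

phint = rng.normal(size=(batch, seq_len, dim))  # original vectors
prand = rng.normal(size=(batch, seq_len, dim))  # random replacements
# Keep-mask: 1 -> keep original, 0 -> replace with noise.
m_expand = rng.integers(0, 2, size=(batch, seq_len, 1)).astype(float)

noised = phint * m_expand + prand * (1 - m_expand)

# As written on line 62: only time step 0 is assigned;
# steps 1..seq_len-1 remain zero.
new_values = np.zeros((batch, seq_len, dim))
new_values[:, 0, :] = noised[:, 0, :]
print(np.allclose(new_values[:, 1:, :], 0))  # True: later steps are all zero

# Presumed intent per the docstring: noise every position.
new_values_fixed = noised
```

With the single-index assignment, every position after the first is a zero vector, which is what then reaches the attention decoder.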

Is there a reason for this, or is it just a mistake?

wugh commented 5 years ago

@grll I also found some weird code, see my issue