Jason-kid opened this issue 5 years ago (status: Open)
+1
+1
Tip: you can trace where the positional embedding is added. For example, in transformer.py, the positional embedding is added in the functions transformer_prepare_encoder, transformer_prepare_decoder, and _fast_decode.
As above, I want to add a new embedding to the input embedding. The old input = word embedding + position embedding. The new input = word embedding + position embedding + my own embedding. Which parts of the code do I need to modify?
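For reference, here is a minimal sketch of the kind of change involved, assuming TF1-style code as used by tensor2tensor. The helper `add_my_embedding` and the feature name `"my_extra_ids"` are hypothetical, not part of the library; the actual edit would go inside transformer_prepare_encoder (and likewise transformer_prepare_decoder and _fast_decode) in transformer.py, after the positional (timing) signal has been added:

```python
import tensorflow.compat.v1 as tf

def add_my_embedding(encoder_input, my_ids, my_vocab_size, hidden_size):
  """Adds a learned embedding (looked up by my_ids) on top of an input
  that is already word embedding + positional embedding.

  Hypothetical helper for illustration; not part of tensor2tensor.
  """
  # Embedding table sized to the model's hidden dimension.
  my_emb_table = tf.get_variable(
      "my_embedding", [my_vocab_size, hidden_size],
      initializer=tf.random_normal_initializer(0.0, hidden_size ** -0.5))
  # tf.gather maps ids of shape [batch, length] to [batch, length, hidden],
  # so the sum keeps encoder_input's shape unchanged and the rest of the
  # Transformer needs no modification.
  return encoder_input + tf.gather(my_emb_table, my_ids)

# Inside transformer_prepare_encoder, after the timing signal is added,
# one could write (assuming `features` carries your extra ids):
#   encoder_input = add_my_embedding(
#       encoder_input, features["my_extra_ids"],
#       my_vocab_size=NUM_MY_IDS, hidden_size=hparams.hidden_size)
```

Adding the new term as a sum (rather than a concatenation) means the downstream attention layers are untouched; if you concatenate instead, you would also need a projection back to hparams.hidden_size.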