tensorflow / tensor2tensor

Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research.
Apache License 2.0

Add New Embedding #1671

Open Jason-kid opened 5 years ago

Jason-kid commented 5 years ago

As the title says, I want to add a new embedding to the input embedding. The old input = word embedding + position embedding. The new input = word embedding + position embedding + my own embedding. Which parts of the code do I need to modify?

Mr-wang2016 commented 5 years ago

+1

autobotasia commented 5 years ago

+1


dreamingo commented 5 years ago

Tip: you can trace the code to where the positional embedding is added. For example, in transformer.py, the positional embedding is added in the functions transformer_prepare_encoder_fast_decode and transformer_prepare_decoder.
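To make the "word + position + custom" idea concrete, here is a minimal NumPy sketch (not actual tensor2tensor code; `combine_embeddings` and `my_emb` are hypothetical names) showing the elementwise sum that the prepare-encoder/decoder functions perform for the positional term, extended with one extra embedding:

```python
# Hypothetical sketch, NOT tensor2tensor's implementation: it only illustrates
# that a new embedding can be added to the same [length, depth] sum where the
# positional embedding is added.
import numpy as np

def sinusoidal_position_embedding(length, depth):
    """Standard sinusoidal positional embedding (sin on even dims, cos on odd)."""
    positions = np.arange(length)[:, None].astype(np.float64)   # [length, 1]
    dims = np.arange(depth)[None, :]                            # [1, depth]
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / depth)
    angles = positions * angle_rates                            # [length, depth]
    emb = np.zeros((length, depth))
    emb[:, 0::2] = np.sin(angles[:, 0::2])
    emb[:, 1::2] = np.cos(angles[:, 1::2])
    return emb

def combine_embeddings(word_emb, my_emb):
    """Sum word, positional, and a custom embedding of identical shape.

    word_emb, my_emb: [length, depth] arrays; returns the same shape.
    """
    length, depth = word_emb.shape
    pos_emb = sinusoidal_position_embedding(length, depth)
    return word_emb + pos_emb + my_emb
```

In the actual library you would compute your extra embedding as a tensor with the same shape as the word embeddings and add it inside the prepare-encoder/decoder path, next to where the positional term is added, so every downstream layer sees the combined input unchanged.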