leviswind / pytorch-transformer

PyTorch implementation of "Attention Is All You Need"

A little mistake in modules.py #6

Closed ROBINADC closed 5 years ago

ROBINADC commented 5 years ago

In the `if __name__ == '__main__':` block, `outputs = position_encoding(num_units)(inputs)` should be `outputs = positional_encoding(num_units)(inputs)`.
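
For context, a minimal sketch of what the corrected debug block at the bottom of modules.py might look like after the rename; only the `positional_encoding(num_units)(inputs)` call pattern comes from this issue, while the dummy sizes and the print are assumptions for illustration.

```python
# Hypothetical sketch of the fixed debug block in modules.py.
# `positional_encoding` is assumed to be the nn.Module defined earlier in the
# same file; the dummy shapes below are made up for illustration only.
import torch

if __name__ == '__main__':
    num_units = 512                                # assumed model dimension
    inputs = torch.zeros(2, 10, dtype=torch.long)  # assumed (batch, seq_len) dummy batch
    # corrected name: positional_encoding, not position_encoding
    outputs = positional_encoding(num_units)(inputs)
    print(outputs.shape)
```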

banshee1 commented 5 years ago

That line is part of the debugging code the author left under `if __name__ == '__main__':`; it is never executed when you run train.py, so it doesn't affect training.
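
To illustrate the point, here is a small self-contained example with a hypothetical file name (not the repo's actual code): the guarded block only runs when the module itself is executed directly, not when a training script imports it.

```python
# modules_demo.py -- hypothetical stand-in for modules.py
def positional_encoding(num_units):
    """Placeholder for the real positional_encoding module."""
    return lambda inputs: inputs  # identity, just for this demo

if __name__ == '__main__':
    # Runs only via `python modules_demo.py`;
    # `import modules_demo` from a training script skips this block entirely.
    print(positional_encoding(8)(list(range(8))))
```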