tensorflow / text

Making text a first-class citizen in TensorFlow.
https://www.tensorflow.org/beta/tutorials/tensorflow_text/intro
Apache License 2.0

Fix vocab_size in PositionalEmbedding in Transformer notebook #1266

Open 8bitmp3 opened 2 months ago

8bitmp3 commented 2 months ago

@MarkDaoust @markmcd

Also addresses https://github.com/tensorflow/text/issues/1264


JuanVargas commented 2 months ago

I corrected the code as suggested by Mark Daoust, and that solved the problem. Thank you! The code is:

embed_pt = PositionalEmbedding(vocab_size=tokenizers.pt.get_vocab_size().numpy(), d_model=512)
embed_en = PositionalEmbedding(vocab_size=tokenizers.en.get_vocab_size().numpy(), d_model=512)

That said, why hasn't that correction been added to the notebook on the TensorFlow site?
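
For anyone hitting the same problem outside the notebook, here is a minimal, self-contained sketch of a PositionalEmbedding layer that takes vocab_size as a plain Python int. The layer and the positional_encoding helper follow the standard sinusoidal Transformer formulation and may differ in detail from the tutorial; the vocab sizes and the example input below are made-up placeholders standing in for tokenizers.pt/en.get_vocab_size().numpy().

```python
import numpy as np
import tensorflow as tf


def positional_encoding(length, depth):
    # Standard sinusoidal position encoding: returns a (length, depth) float32 matrix.
    half_depth = depth / 2
    positions = np.arange(length)[:, np.newaxis]                  # (length, 1)
    depths = np.arange(half_depth)[np.newaxis, :] / half_depth    # (1, depth/2)
    angle_rates = 1 / (10000 ** depths)
    angle_rads = positions * angle_rates                          # (length, depth/2)
    pos_encoding = np.concatenate([np.sin(angle_rads), np.cos(angle_rads)], axis=-1)
    return tf.cast(pos_encoding, dtype=tf.float32)


class PositionalEmbedding(tf.keras.layers.Layer):
    def __init__(self, vocab_size, d_model):
        super().__init__()
        self.d_model = d_model
        self.embedding = tf.keras.layers.Embedding(vocab_size, d_model, mask_zero=True)
        self.pos_encoding = positional_encoding(length=2048, depth=d_model)

    def call(self, x):
        length = tf.shape(x)[1]
        x = self.embedding(x)
        # Scale embeddings before adding the position encoding, as in the Transformer paper.
        x *= tf.math.sqrt(tf.cast(self.d_model, tf.float32))
        return x + self.pos_encoding[tf.newaxis, :length, :]


# Hypothetical vocab sizes in place of tokenizers.pt/en.get_vocab_size().numpy().
embed_pt = PositionalEmbedding(vocab_size=7765, d_model=512)
embed_en = PositionalEmbedding(vocab_size=7010, d_model=512)
print(embed_pt(tf.constant([[2, 15, 97, 3]])).shape)  # (1, 4, 512)
```

The key detail from the fix in this thread is that the notebook's get_vocab_size() returns a scalar tf.Tensor, so converting it with .numpy() (or int(...)) before constructing the layer gives tf.keras.layers.Embedding the plain integer it expects.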