microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI

1D position embedding in LayoutLM #350

Open rtanaka-lab opened 3 years ago

rtanaka-lab commented 3 years ago

Did LayoutLM learn a 1D position embedding during pre-training? The LayoutLM paper does not describe it, but the official code contains a 1D position embedding.

ruifcruz commented 3 years ago

As far as I know, this is already a feature of BERT, which is possibly why you haven't seen it highlighted.
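
For reference, here is a minimal sketch of how such an embedding layer typically combines BERT's learned 1D position embedding with LayoutLM's 2D layout embeddings. Class and parameter names are illustrative, not the official implementation; the hyperparameters assume the base configuration (hidden size 768, max sequence length 512, bounding-box coordinates normalized to 0..1023):

```python
import torch
import torch.nn as nn

class LayoutLMEmbeddingsSketch(nn.Module):
    """Illustrative sketch: BERT-style 1D position embeddings
    summed with LayoutLM's 2D layout embeddings."""

    def __init__(self, vocab_size=30522, hidden_size=768,
                 max_position=512, max_2d_position=1024):
        super().__init__()
        self.word_embeddings = nn.Embedding(vocab_size, hidden_size)
        # The 1D position embedding inherited from BERT: one learned
        # vector per sequence index, trained jointly with the model.
        self.position_embeddings = nn.Embedding(max_position, hidden_size)
        # LayoutLM's addition: learned embeddings for the bounding-box
        # coordinates (x0, y0, x1, y1) of each token.
        self.x_embeddings = nn.Embedding(max_2d_position, hidden_size)
        self.y_embeddings = nn.Embedding(max_2d_position, hidden_size)
        self.norm = nn.LayerNorm(hidden_size)

    def forward(self, input_ids, bbox):
        # input_ids: (batch, seq_len); bbox: (batch, seq_len, 4)
        # with integer coordinates in [0, 1023].
        seq_len = input_ids.size(1)
        positions = torch.arange(seq_len, device=input_ids.device)
        emb = (self.word_embeddings(input_ids)
               + self.position_embeddings(positions)  # 1D reading order
               + self.x_embeddings(bbox[..., 0])      # left
               + self.y_embeddings(bbox[..., 1])      # top
               + self.x_embeddings(bbox[..., 2])      # right
               + self.y_embeddings(bbox[..., 3]))     # bottom
        return self.norm(emb)
```

Because the 1D position embeddings are ordinary learned parameters, they would continue to be updated during any further pre-training unless explicitly frozen, rather than staying fixed at their BERT initialization.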

rtanaka-lab commented 3 years ago

Thank you for your reply. I understand that the LayoutLM model was initialized from BERT. My concern is that if LayoutLM did not use the 1D position embedding when pre-training on the IIT-CDIP dataset, the model might forget the 1D position information learned by BERT.