alibaba / AliceMind

ALIbaba's Collection of Encoder-decoders from MinD (Machine IntelligeNce of Damo) Lab
Apache License 2.0

StructuralLM copies related work from LayoutLMv2 #31

Closed: paulpaul91 closed this issue 2 years ago

paulpaul91 commented 2 years ago

[screenshot: LayoutLMv2 related-work section] [screenshot: StructuralLM related-work section]

paulpaul91 commented 2 years ago

Both papers were accepted at ACL 2021, so who copied whom?

lcl6679292 commented 2 years ago

Thank you for your interest in our work. StructuralLM and LayoutLM propose different solutions to the problem of visually-rich document modeling. The StructuralLM approach is original and distinct from LayoutLM in document types, model architectures, and research findings. Our StructuralLM paper properly cites the prior work of LayoutLM (v1) and refers to the baseline models used in the v1 paper, including RoBERTa; LayoutLMv2 later replaced the RoBERTa baseline with UniLMv2, which explains the discrepancy in model names. Because StructuralLM and LayoutLM address the same research problem, an emerging topic with few prior studies, their related-work sections naturally cite similar references. We sincerely appreciate LayoutLM's contribution to this research field.

paulpaul91 commented 2 years ago

> Thank you for your interest in our work. StructuralLM and LayoutLM propose different solutions to the problem of visually-rich document modeling. The StructuralLM approach is original and distinct from LayoutLM in document types, model architectures and research findings. Our StructuralLM paper has properly cited the previous work of LayoutLM(v1), and referred to the baseline models used in the v1 paper, including the name RoBERTa. LayoutLM(v2) later replaced the baseline from RoBERTa to UniLMv2, which led to the discrepancy in the model names. As StructuralLM and LayoutLM both address the same research problem, which is an emerging topic with limited studies in the past, that results in the similarity in citations of the related work sections. We sincerely appreciate the contribution of LayoutLM to this research field.

interesting