microsoft / unilm

Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
https://aka.ms/GeneralAI
MIT License

Layoutlmv3-chinese #997

Open wumouren3000 opened 1 year ago

wumouren3000 commented 1 year ago

Describe Model I am using (UniLM, MiniLM, LayoutLM ...): Hello! I like this model very much. I used it to fine-tune downstream tasks and easily achieved good results. I'm curious about the pre-training of this model. Could you tell me what F1 score you achieved when pre-training v3-chinese?

superxii commented 1 year ago

Hi,

Could you please share your training code and visualization code? I am having some problems with both training and visualization using Layoutlmv3-base-chinese. Thanks a lot.

tianchiguaixia commented 20 hours ago

It's very simple. Less than 50 lines of code covers both training and inference.
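For readers looking for a starting point, a short fine-tuning setup along these lines is possible with the Hugging Face `transformers` library. This is a minimal sketch, not the commenter's actual code: the checkpoint name `microsoft/layoutlmv3-base-chinese` and the label set below are assumptions for illustration, and the data-loading and training-loop details are elided.

```python
# Sketch of fine-tuning LayoutLMv3 for token classification with
# Hugging Face transformers. Assumptions (not from this thread):
# the checkpoint name and the illustrative label set below.
LABELS = ["O", "B-HEADER", "I-HEADER", "B-ANSWER", "I-ANSWER"]


def build_label_maps(labels):
    """Build the id<->label maps that the model config expects."""
    id2label = dict(enumerate(labels))
    label2id = {label: i for i, label in id2label.items()}
    return id2label, label2id


def main():
    # Requires network access to download the checkpoint; call explicitly.
    from transformers import AutoModelForTokenClassification, AutoProcessor

    id2label, label2id = build_label_maps(LABELS)
    # apply_ocr=False because we supply our own words and bounding boxes.
    processor = AutoProcessor.from_pretrained(
        "microsoft/layoutlmv3-base-chinese", apply_ocr=False
    )
    model = AutoModelForTokenClassification.from_pretrained(
        "microsoft/layoutlmv3-base-chinese",
        id2label=id2label,
        label2id=label2id,
    )
    # From here: encode (image, words, boxes, word_labels) with `processor`,
    # then train with transformers.Trainer or a plain PyTorch loop.
    return processor, model
```

With the encoding handled by the processor and the loop handled by `Trainer`, the whole script does plausibly fit in well under 50 lines, which matches the claim above.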