jpWang / LiLT

Official PyTorch implementation of LiLT: A Simple yet Effective Language-Independent Layout Transformer for Structured Document Understanding (ACL 2022)

how to train from scratch #40

Open uobinxiao opened 1 year ago

uobinxiao commented 1 year ago

Just wondering how to pretrain the model from scratch. Does this repo contain the pretraining code?
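
For context, a minimal sketch of what "from scratch" would start from, assuming the Hugging Face `transformers` port of LiLT (`LiltConfig` / `LiltModel`) rather than this repository's own code: building the model from a config gives randomly initialized weights, on top of which a pretraining objective would still have to be implemented.

```python
# Sketch only: random-weight LiLT via the Hugging Face port, not this repo's
# pretraining pipeline. Hyperparameters below are the library defaults.
import torch
from transformers import LiltConfig, LiltModel

config = LiltConfig()        # default RoBERTa-like sizes + layout stream settings
model = LiltModel(config)    # randomly initialized, no pretrained checkpoint loaded

# Dummy forward pass: token ids plus one 0-1000 normalized box (x0, y0, x1, y1)
# per token, as expected by the layout embedding stream.
input_ids = torch.randint(0, config.vocab_size, (1, 16))
bbox = torch.tensor([[[10, 10, 200, 60]] * 16])
attention_mask = torch.ones(1, 16, dtype=torch.long)

outputs = model(input_ids=input_ids, bbox=bbox, attention_mask=attention_mask)
print(outputs.last_hidden_state.shape)  # (1, 16, hidden_size)
```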