jpWang / LiLT

Official PyTorch implementation of LiLT: A Simple yet Effective Language-Independent Layout Transformer for Structured Document Understanding (ACL 2022)

HuggingFace version #2

Closed · ManuelFay closed this 1 year ago

ManuelFay commented 2 years ago

Hello, amazing job!

I love the paradigm of decoupling the LM and the layout model at first, before fine-tuning with joint training! I've managed to port my LayoutXLM code to your framework almost plug-and-play, and was wondering whether you are planning to publish an official model implementation in the HuggingFace transformers library? As is, not much is required beyond a few tweaks due to version changes, and perhaps a processor object wrapping the tokenizer. Do you have plans to do so? Cheers, and again, great work! Best, Manuel

jpWang commented 2 years ago

Hi Manuel, thanks for your attention and recognition of our work. So far, we don't have concrete plans to do so. However, if this work gets more attention, or once we have further improved this research, we will put it on the agenda.

bilelomrani1 commented 2 years ago

That would be awesome! Looking forward to hearing more about your research.

jpWang commented 2 years ago

@ManuelFay Hi Manuel, I recently noticed that you have successfully made our LiLT models available on HuggingFace (for example, https://huggingface.co/manu/lilt-infoxlm-base). Thanks for your kind help and efforts! 😃

vibeeshan025 commented 2 years ago

Any idea where to find simple inference code for using the fine-tuned model?

NielsRogge commented 1 year ago

Hi folks (cc @jpWang),

Happy to share that I've got a working HuggingFace implementation. I'm able to reproduce the results (F1 of 88% on FUNSD). The model is here: https://huggingface.co/nielsr/lilt-roberta-en-base-finetuned-funsd

I'll open a PR soon :)
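For anyone looking for the inference code asked about above, here is a minimal sketch against that checkpoint, assuming the transformers integration from the PR below is available. The toy words and boxes are hypothetical OCR output; boxes are assumed to be word-level coordinates normalized to the 0-1000 range, expanded to one box per sub-word token.

```python
from transformers import AutoTokenizer, AutoModelForTokenClassification
import torch

# Checkpoint from the comment above.
model_name = "nielsr/lilt-roberta-en-base-finetuned-funsd"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# Hypothetical OCR output: words plus word-level boxes,
# normalized to the 0-1000 range used by layout models.
words = ["Invoice", "Date:", "11/02/2019"]
boxes = [[74, 60, 160, 78], [80, 94, 136, 110], [140, 94, 220, 110]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to one box per sub-word token;
# special tokens get a dummy box.
bbox = [boxes[i] if i is not None else [0, 0, 0, 0]
        for i in encoding.word_ids()]
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    logits = model(**encoding).logits
predictions = logits.argmax(-1).squeeze().tolist()
print([model.config.id2label[p] for p in predictions])
```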

vishal-nayak1 commented 1 year ago

@NielsRogge (cc @jpWang) Hi, are you planning to add this model to the transformers library? That would be really great.

jpWang commented 1 year ago

> Hi folks (cc @jpWang),
>
> Happy to share that I've got a working HuggingFace implementation. I'm able to reproduce the results (F1 of 88% on FUNSD). The model is here: https://huggingface.co/nielsr/lilt-roberta-en-base-finetuned-funsd
>
> I'll open a PR soon :)

Hi, thanks for your great effort. Contact me if you encounter any problems :)

NielsRogge commented 1 year ago

Hi @jpWang,

I've opened a PR here to add LiLT to 🤗 Transformers: https://github.com/huggingface/transformers/pull/19450.

Would you like to create an organization for the company/research institute you're part of, so that we can transfer the weights there?

jpWang commented 1 year ago

> Hi @jpWang,
>
> I've opened a PR here to add LiLT to 🤗 Transformers: huggingface/transformers#19450.
>
> Would you like to create an organization for the company/research institute you're part of, so that we can transfer the weights there?

Hi @NielsRogge, I have joined our lab's GitHub organization: https://github.com/SCUT-DLVCLab. Is this ok? :)

NielsRogge commented 1 year ago

Hi,

What's the URL of the HuggingFace organization? I can't seem to find SCUT-DLVCLab on hf.co.

jpWang commented 1 year ago

Hi, I just created our organization: https://huggingface.co/SCUT-DLVCLab.

NielsRogge commented 1 year ago

Thanks a lot! The 2 checkpoints have now been transferred there.

Could you give me your email address, so that I can set up a Slack channel? I'd like to discuss some things related to LiLT.

jpWang commented 1 year ago

My email is scutjpwang@foxmail.com. Thanks for your effort :)

jpWang commented 1 year ago

Hi folks, happy to share that LiLT has been added to the HuggingFace transformers library: https://huggingface.co/docs/transformers/main/en/model_doc/lilt! Thanks to @NielsRogge for his great efforts!
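A quick note for anyone arriving here: with the integration merged, loading one of the official checkpoints needs nothing beyond the standard transformers API. A minimal sketch using one of the two checkpoints mentioned above:

```python
from transformers import AutoTokenizer, AutoModel

# One of the two official checkpoints hosted under the SCUT-DLVCLab organization.
checkpoint = "SCUT-DLVCLab/lilt-roberta-en-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)  # resolves to LiltModel
```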