Open louis030195 opened 2 years ago
Currently, MarkupLM is not supported by the Hugging Face transformers
package, so you can only use it by downloading our source code. We are working on making MarkupLM available in transformers
soon.
Hi,
I've added MarkupLM to Transformers here: https://github.com/NielsRogge/transformers/tree/modeling_markuplm/src/transformers/models/markuplm
However, I haven't opened a PR yet, as I'd first like to have a MarkupLMProcessor
(similar to LayoutLMv2Processor
) that prepares all the data for the model (rather than only tokenizing text).
Feel free to work further on my branch.
@NielsRogge Thanks for adding MarkupLM to the great transformers
library! We have added a processor for MarkupLM
like LayoutLMv2Processor
, as you suggested, and opened a PR against your branch. However, the implementation is not complete, as we are not familiar with all the APIs in transformers
. We would appreciate it very much if you could kindly help us improve and officially release it.
@NielsRogge Any updates for adding MarkupLM to Transformers?
@NielsRogge you are amazing. Thank you for this!
MarkupLM is now part of the Transformers library, feel free to close this issue :)
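Since MarkupLM is now in Transformers, a minimal sketch of loading the model and its processor looks like the following. This assumes the `microsoft/markuplm-base` checkpoint on the Hugging Face Hub and a recent `transformers` release (v4.23 or later, where MarkupLM was added); it also requires `torch` and `beautifulsoup4` to be installed.

```python
# Minimal sketch: load MarkupLM and run an HTML string through it.
# Assumes the "microsoft/markuplm-base" checkpoint; adjust as needed.
from transformers import MarkupLMProcessor, MarkupLMModel

processor = MarkupLMProcessor.from_pretrained("microsoft/markuplm-base")
model = MarkupLMModel.from_pretrained("microsoft/markuplm-base")

html_string = "<html><body><h1>Hello world</h1></body></html>"

# The processor extracts nodes and xpaths from the HTML and tokenizes them,
# so you do not have to prepare model inputs by hand.
encoding = processor(html_string, return_tensors="pt")
outputs = model(**encoding)
print(outputs.last_hidden_state.shape)
```

The processor is exactly the piece discussed above: it wraps the feature extractor (HTML parsing) and the tokenizer into one call, analogous to LayoutLMv2Processor.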
Describe the bug
Model: markuplm

To Reproduce
Steps to reproduce the behavior:

Expected behavior
The tokenizer and model are properly loaded.

Platform: Google Colab
Python version:
PyTorch version (GPU?):