Open piegu opened 1 year ago
Issue opened in the Optimum library: https://github.com/huggingface/optimum/issues/1024
Have you considered making a smaller model? What is your model size?
One thing you can try, especially if you're using a multilingual model like https://huggingface.co/nielsr/lilt-xlm-roberta-base, is removing the token embeddings of the languages you don't need.
See this blog post for more info: https://medium.com/@coding-otter/reduce-your-transformers-model-size-by-removing-unwanted-tokens-and-word-embeddings-eec08166d2f9
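A minimal sketch of that idea, using a NumPy array in place of the real embedding matrix (the token ids and corpus here are hypothetical; with an actual model you would apply the same slicing to `model.get_input_embeddings().weight` and remap the tokenizer's vocabulary accordingly):

```python
import numpy as np

# Stand-in for the model's input embedding matrix.
# XLM-R's vocabulary has ~250k rows; a tiny hidden size keeps the demo cheap.
vocab_size, hidden = 250_002, 8
embeddings = np.random.rand(vocab_size, hidden)

# Token ids observed in a (hypothetical) corpus of the languages you keep,
# plus the special tokens, which must always survive the trim.
special_ids = {0, 1, 2, 3}                 # e.g. <s>, <pad>, </s>, <unk>
corpus_ids = {101, 2047, 30_000}           # hypothetical ids from your corpus
kept_ids = sorted(special_ids | corpus_ids)

# Smaller embedding matrix, plus an old-id -> new-id map for the tokenizer.
new_embeddings = embeddings[kept_ids]
old_to_new = {old: new for new, old in enumerate(kept_ids)}

print(new_embeddings.shape)  # (7, 8)
```

The memory saving comes almost entirely from the embedding matrix, which dominates the parameter count of multilingual models with very large vocabularies.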
Hi,
I'm using Hugging Face libraries to run LiLT. How can I decrease inference time? Which code should I use? I've already tried BetterTransformer (Optimum) and ONNX, but neither of them accepts the LiLT model:
NotImplementedError: The model type lilt is not yet supported to be used with BetterTransformer.
KeyError: "lilt is not supported yet."
Thank you.
Note: I asked this question here, too: https://github.com/jpWang/LiLT/issues/42