wietsedv / bertje

BERTje is a Dutch pre-trained BERT model developed at the University of Groningen. (EMNLP Findings 2020) "What’s so special about BERT’s layers? A closer look at the NLP pipeline in monolingual and multilingual models"
https://aclanthology.org/2020.findings-emnlp.389/
Apache License 2.0

Question #23

Closed livein21st closed 2 years ago

livein21st commented 2 years ago

Hi,

My name is Nirav, and I am currently pursuing a Ph.D. at TU Delft on NL and NE extraction from Dutch historical text.

I found your model on Hugging Face and would really like to try it in my own experiments. However, I don't know if this model can be used with spaCy on an Apple M1 chip.

Do you have any instructions written specifically for using BERTje with spaCy on an Apple M1 machine? I was also wondering whether you have time to talk about the NLP pipeline in spaCy with the BERTje model.

Let me know.

Thank you.

wietsedv commented 2 years ago

You can use BERTje perfectly fine on an Apple M1 chip, since PyTorch has Apple Silicon support. I have no experience with transformers via spaCy, but if you can use any Hugging Face model hub model via spaCy, it should work exactly the same as with the original English BERT. I do not intend to invest time in researching spaCy compatibility.
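For reference, a minimal sketch of loading BERTje through plain Hugging Face `transformers` with PyTorch, selecting the Apple Silicon `mps` device when it is available and falling back to CPU otherwise (the example sentence is my own; the model ID is BERTje's hub ID):

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Use the Apple Silicon backend ("mps") when available, otherwise CPU.
device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

tokenizer = AutoTokenizer.from_pretrained("wietsedv/bert-base-dutch-cased")
model = AutoModel.from_pretrained("wietsedv/bert-base-dutch-cased").to(device)
model.eval()

# Encode an example Dutch sentence and run a forward pass.
inputs = tokenizer("Ik woon in Groningen.", return_tensors="pt").to(device)
with torch.no_grad():
    outputs = model(**inputs)

# Contextual embeddings: one 768-dimensional vector per (sub)token.
print(outputs.last_hidden_state.shape)
```

This is the same workflow as for English BERT; only the model ID differs.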

Good luck!

livein21st commented 2 years ago

Okay, thanks for the information. I will try to set up a pipeline and use this model for some NE extraction.
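One possible starting point for such an NE-extraction pipeline, sketched with `transformers` rather than spaCy: load BERTje with a token-classification head over BIO tags for the four CoNLL-2002 Dutch entity types. The label set here is an assumption based on CoNLL-2002; the head is randomly initialized and would still need fine-tuning on annotated data:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

# BIO tagging scheme over the four CoNLL-2002 entity types (assumed label set).
entity_types = ["PER", "ORG", "LOC", "MISC"]
labels = ["O"] + [f"{prefix}-{etype}" for etype in entity_types for prefix in ("B", "I")]
id2label = dict(enumerate(labels))
label2id = {label: i for i, label in id2label.items()}

tokenizer = AutoTokenizer.from_pretrained("wietsedv/bert-base-dutch-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "wietsedv/bert-base-dutch-cased",
    num_labels=len(labels),
    id2label=id2label,
    label2id=label2id,
)
# Note: the classification head is freshly initialized. Fine-tune on Dutch
# NER data (e.g. CoNLL-2002) before using the model for actual extraction.
```

After fine-tuning, `pipeline("token-classification", model=..., tokenizer=...)` can be used for inference on historical text.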