Question
Guys, I have a NER model trained with 5 classes. Now I'd like to add one more class by providing tagged data for that new class. Unfortunately, with this approach, even if I use the previously trained NER model as my language model, it only learns the new class and loses all the others.
So, how can I add more data to a NER model without having to train everything again?
Additional context
First of all, I have a language model (BERT LM) and a CONLL file used to train the NER. I have 5 classes in the CONLL file and I'm able to create the NER model. It works great.
But now I need to add one more class. How can I do that without retraining everything from scratch with the new class included? Should I use the previous NER model as the language model? Should I take the prediction head from the NER model and combine it with the new one?
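For illustration, here is a rough sketch of the kind of thing I mean (using Hugging Face transformers here only as an example; the model path and label count are placeholders): start a 6-label token-classification model from the old 5-class checkpoint and copy the old classifier weights across, so that only the new class starts from random weights.

```python
import torch
from transformers import AutoModelForTokenClassification

OLD_MODEL_DIR = "path/to/old-5-class-ner-model"  # placeholder path
NEW_NUM_LABELS = 6                               # placeholder: old classes + 1 new (with BIO tags this would be larger)

# Load the old 5-class model just to grab its classifier weights.
old_model = AutoModelForTokenClassification.from_pretrained(OLD_MODEL_DIR)

# Re-load the same checkpoint with a bigger label set; the classifier layer
# changes shape, so it gets re-initialized (hence ignore_mismatched_sizes).
new_model = AutoModelForTokenClassification.from_pretrained(
    OLD_MODEL_DIR,
    num_labels=NEW_NUM_LABELS,
    ignore_mismatched_sizes=True,
)

# Copy the old class weights into the first rows of the new classifier, so
# only the new class starts from scratch. This assumes the first labels of
# the new label map are in the same order as the old ones.
with torch.no_grad():
    n_old = old_model.classifier.out_features
    new_model.classifier.weight[:n_old] = old_model.classifier.weight
    new_model.classifier.bias[:n_old] = old_model.classifier.bias

# new_model can now be fine-tuned, but if I train it only on the new-class
# CONLL data it still forgets the old classes, so presumably the training
# data needs to include at least some examples of the old classes too.
```

Is something along these lines the right direction, or is there a better way to extend the prediction head without full retraining?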
Thanks in advance.