tarrade / proj_multilingual_text_classification

Explore multilingual text classification using embeddings, BERT, and deep learning architectures
Apache License 2.0

How are the pretrained models structured and how can we access the structure? #5

Closed vluechinger closed 4 years ago

vluechinger commented 4 years ago

What happens exactly in the BERT layer?

How can we unfold the structure using keras?

What happens exactly in the TFBertForSequenceClassification version?

tarrade commented 4 years ago

One of the best visualizations:

[image: BERT architecture visualization]

I haven't yet found a way to see the full structure with Keras, but it works with PyTorch: [image] [image]
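
For reference, a minimal sketch of how the structure can be unfolded with PyTorch, assuming the Hugging Face transformers library and the bert-base-multilingual-cased checkpoint (both are assumptions; any BERT checkpoint behaves the same). In Keras, model.summary() collapses the whole encoder into a single TFBertMainLayer row, whereas printing the PyTorch model expands every submodule:

```python
from transformers import BertForSequenceClassification

# Assumed checkpoint; any PyTorch model from transformers can be inspected this way.
model = BertForSequenceClassification.from_pretrained("bert-base-multilingual-cased")

# print(model) unfolds the full module hierarchy: the embeddings, each encoder
# layer with its attention and feed-forward blocks, the pooler, and the
# classification head on top.
print(model)

# named_parameters() lists every weight tensor with its shape.
for name, param in model.named_parameters():
    print(name, tuple(param.shape))
```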

TFBertForSequenceClassification returns the logits (you can pass the number of classes you have in the config).
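
A minimal sketch of that, assuming a recent transformers version and the same bert-base-multilingual-cased checkpoint (assumptions, not the repo's exact setup). The num_labels field in the config sets the size of the classification head, and the model's first output is the raw logits:

```python
import tensorflow as tf
from transformers import BertConfig, BertTokenizer, TFBertForSequenceClassification

model_name = "bert-base-multilingual-cased"  # assumed checkpoint

# num_labels controls the output dimension of the classification head.
config = BertConfig.from_pretrained(model_name, num_labels=3)
tokenizer = BertTokenizer.from_pretrained(model_name)
model = TFBertForSequenceClassification.from_pretrained(model_name, config=config)

inputs = tokenizer("A short example sentence.", return_tensors="tf")
outputs = model(inputs)

# The first output is the raw logits with shape (batch_size, num_labels);
# apply a softmax yourself if you need class probabilities.
probs = tf.nn.softmax(outputs[0], axis=-1)
print(probs)
```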