First attempts at this were made in notebook 10_Embeddings.
However, we concluded that embedding single words only makes sense for "traditional" methods such as word2vec and other vector representations that do not depend on the whole sentence. BERT produces contextual token vectors, so for BERT a sentence representation would make more sense and is what we should implement.
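As a starting point, here is a minimal sketch of such a sentence representation: mean pooling over BERT's contextual token vectors. It assumes the Hugging Face `transformers` library; the model name and the pooling strategy are illustrative assumptions, not a settled choice for this project.

```python
# Sketch: one fixed-size vector per sentence via mean pooling over
# BERT token embeddings. Model name is an assumption for illustration.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def sentence_embedding(sentence: str) -> torch.Tensor:
    """Average the contextual token vectors into a single sentence vector."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        outputs = model(**inputs)
    # Mask out padding before averaging (trivial here for a single sentence).
    mask = inputs["attention_mask"].unsqueeze(-1)        # (1, seq_len, 1)
    summed = (outputs.last_hidden_state * mask).sum(dim=1)
    return (summed / mask.sum(dim=1)).squeeze(0)         # (hidden_size,)

vec = sentence_embedding("BERT embeddings depend on the whole sentence.")
print(vec.shape)  # torch.Size([768])
```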
More information: https://www.quora.com/What-are-the-main-differences-between-the-word-embeddings-of-ELMo-BERT-Word2vec-and-GloVe
See also the links on embeddings in Confluence.
To Do: sentence representation in different languages
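For the multilingual to-do item, one possible route is a pretrained multilingual sentence-embedding model. The sketch below assumes the `sentence-transformers` library; the model name is one assumption among several multilingual options.

```python
# Sketch: multilingual sentence representations with sentence-transformers.
# Model choice is an assumption, not a project decision.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")
sentences = [
    "Embedding single words ignores context.",             # English
    "Einzelne Wörter einzubetten ignoriert den Kontext.",  # German
]
embeddings = model.encode(sentences)  # one vector per sentence
print(embeddings.shape)               # (2, 384) for this model
```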