An experiment in re-implementing supervised learning models based on shallow neural network approaches (e.g. fastText), with some additional exclusive features and a nice API. Written in Python and fully compatible with Scikit-learn.
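To make "fully compatible with Scikit-learn" concrete, here is a minimal sketch of the kind of estimator interface intended: a shallow bag-of-words classifier exposing the usual `fit`/`predict` methods so it plugs into standard scikit-learn tooling. The `ShallowBowClassifier` class below is an illustrative stand-in, not this library's actual implementation.

```python
# Illustrative stand-in for a scikit-learn-compatible shallow text classifier.
# NOTE: ShallowBowClassifier is a hypothetical example, not this library's API.
from sklearn.base import BaseEstimator, ClassifierMixin
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


class ShallowBowClassifier(BaseEstimator, ClassifierMixin):
    """Bag-of-words features followed by a single linear layer (logistic regression)."""

    def __init__(self, C=1.0):
        self.C = C

    def fit(self, X, y):
        self.vectorizer_ = CountVectorizer()
        bow = self.vectorizer_.fit_transform(X)              # sparse bag-of-words features
        self.model_ = LogisticRegression(C=self.C, max_iter=1000).fit(bow, y)
        self.classes_ = self.model_.classes_
        return self

    def predict(self, X):
        return self.model_.predict(self.vectorizer_.transform(X))


docs = ["great movie", "awful film", "loved it", "hated it", "great acting", "awful plot"]
labels = ["pos", "neg", "pos", "neg", "pos", "neg"]

# Because the estimator follows the scikit-learn contract, standard utilities just work:
print(cross_val_score(ShallowBowClassifier(), docs, labels, cv=3))
```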
The fastText supervised model does not take document and word representations into account; it just learns a bag of words and labels.
Embeddings are computed only on the word->label relation. It would be interesting to jointly learn the semantic relation label<->document<->word<->context.
For now it is only possible to pre-train word embeddings and then use them as initial vectors for the classification algorithm.
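A minimal sketch of that pre-training workflow, assuming gensim is used to train the word vectors; the final step, where the matrix is handed to the classifier, is shown only as a commented-out hypothetical call since the actual parameter names depend on the library.

```python
# Sketch: pre-train word embeddings, then collect them into an initial
# embedding matrix for the supervised model. gensim's Word2Vec is just one
# possible backend; the classifier call at the end is hypothetical.
import numpy as np
from gensim.models import Word2Vec

corpus = [
    ["the", "movie", "was", "great"],
    ["terrible", "plot", "and", "bad", "acting"],
    ["an", "excellent", "moving", "film"],
]

# 1. Pre-train word embeddings on the (unlabeled) corpus.
w2v = Word2Vec(sentences=corpus, vector_size=50, window=3, min_count=1, epochs=20)

# 2. Build the initial embedding matrix, one row per vocabulary word.
vocab = sorted(w2v.wv.key_to_index)                 # fixed word order
embeddings = np.vstack([w2v.wv[word] for word in vocab])
print(embeddings.shape)                             # (vocab_size, 50)

# 3. Pass the matrix to the supervised model as its initial word vectors,
#    e.g. (hypothetical API, parameter names assumed for illustration):
# clf = SomeShallowClassifier(pretrained_vectors=embeddings, vocab=vocab)
# clf.fit(train_docs, train_labels)
```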