samsucik / knowledge-distil-bert

Master's thesis project, in collaboration with Rasa, focusing on knowledge distillation from BERT into various very small student networks and on analysing the students' NLP capabilities.
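
As a rough illustration of the distillation setup described above, the sketch below shows the commonly used soft-label distillation loss (temperature-scaled KL divergence against the teacher plus cross-entropy against the gold labels). The function name, temperature, and mixing weight are illustrative assumptions, not taken from this repository.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Mix soft-target KL loss (student vs. teacher) with hard-label CE.

    Hypothetical helper for illustration; not this project's actual API.
    """
    # Soften both distributions with the temperature before comparing them.
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    # Scale by T^2 to keep gradient magnitudes comparable across temperatures.
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction="batchmean") * temperature ** 2
    # Standard cross-entropy against the gold labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```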