Contains notebooks for various transformer-based models applied to different NLP tasks
Getting a 38% Hamming score using this code for multi-label classification of my 24,000-sample dataset into 26 classes. Please suggest something so that it can go up to 90% #8
I am using this code for multi-label classification of text into 26 labels, but I am only getting a 38% Hamming score and a 28% flat score.
Please suggest some changes to my code so that I can achieve above 90%.
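For reference, a minimal sketch of how the "Hamming score" metric is commonly computed for multi-label problems, here as label-wise accuracy (1 minus the Hamming loss). The function name and the tiny indicator matrices below are illustrative assumptions, not taken from the notebook, which may define the metric differently (some notebooks use the sample-wise Jaccard score under the same name).

```python
import numpy as np

def hamming_score(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    """Fraction of label slots predicted correctly across all samples.

    Equivalent to 1 - Hamming loss for binary indicator matrices.
    Hypothetical helper for illustration; not from the notebook.
    """
    return float((y_true == y_pred).mean())

# Made-up example: 3 samples x 4 labels as binary indicator matrices
y_true = np.array([[1, 0, 1, 0],
                   [0, 1, 0, 0],
                   [1, 1, 0, 1]])
y_pred = np.array([[1, 0, 0, 0],
                   [0, 1, 0, 0],
                   [1, 0, 0, 1]])

print(hamming_score(y_true, y_pred))  # 10 of 12 label slots match -> ~0.833
```

Note that with 26 labels that are mostly 0 for each sample, this metric can look deceptively high or low depending on label sparsity, so it is worth checking which definition the notebook actually uses.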
The loss is not decreasing no matter how much I increase the number of epochs. I also tried increasing my batch size to 64 with 2-GPU support, but I am still not getting very good accuracy.
0it [00:00, ?it/s]   Epoch: 0, Loss: 0.607304036617279
102it [01:51, 1.04s/it]  Epoch: 0, Loss: 0.42913737893104553
202it [03:40, 1.04s/it]  Epoch: 0, Loss: 0.31689316034317017
278it [05:05, 1.10s/it]
0it [00:00, ?it/s]   Epoch: 1, Loss: 0.25217771530151367
102it [01:52, 1.04s/it]  Epoch: 1, Loss: 0.2359163910150528
202it [03:41, 1.04s/it]  Epoch: 1, Loss: 0.20691940188407898
278it [05:06, 1.10s/it]
0it [00:00, ?it/s]   Epoch: 2, Loss: 0.18726885318756104
102it [01:52, 1.05s/it]  Epoch: 2, Loss: 0.18049341440200806
202it [03:41, 1.05s/it]  Epoch: 2, Loss: 0.1675453633069992
278it [05:06, 1.10s/it]
0it [00:00, ?it/s]   Epoch: 3, Loss: 0.1804620772600174
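The log above actually shows the loss falling steadily (0.61 down to ~0.17 over three epochs), so the model is learning; a common culprit when the loss drops but the multi-label score stays low is the decision threshold applied to the sigmoid outputs. A hedged sketch, assuming the model emits one logit per label; the logits, threshold values, and variable names below are made up for illustration and are not from the notebook.

```python
import numpy as np

def sigmoid(x: np.ndarray) -> np.ndarray:
    return 1.0 / (1.0 + np.exp(-x))

# Made-up logits: 2 samples x 3 labels
logits = np.array([[2.1, -0.3, 0.4],
                   [-1.5, 1.2, -0.2]])
probs = sigmoid(logits)

# The usual default: one global 0.5 threshold on every label
default_preds = (probs >= 0.5).astype(int)

# Hypothetical per-label thresholds, tuned on held-out validation data;
# with imbalanced labels this often changes the predictions noticeably
per_label_thresholds = np.array([0.5, 0.6, 0.4])
tuned_preds = (probs >= per_label_thresholds).astype(int)

print(default_preds)  # [[1 0 1], [0 1 0]]
print(tuned_preds)    # [[1 0 1], [0 1 1]] -- third label flips on sample 2
```

Sweeping thresholds per label against the validation Hamming score is cheap to try before touching the architecture or batch size.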