ch3cook-fdu closed this issue 2 years ago.

I've noticed that the Transformer Captioner does not use GloVe embeddings as pretrained word embeddings. Would this hurt the captioner's performance?
Hi @ch3cook-fdu! Thanks for your interest! I used the GloVe embedding at some point during the experiments, but it did not improve the performance very much, so I did not use it in the later development of the model. You are welcome to add it and see whether it brings an improvement.
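If you want to try it, here is a rough sketch of how GloVe vectors could be loaded into the captioner's word embedding layer. This is only a minimal sketch assuming PyTorch; `glove_path` and `vocab` are hypothetical placeholders for the GloVe text file and the captioner's vocabulary, not names from this repo.

```python
import numpy as np
import torch
import torch.nn as nn

def build_glove_embedding(glove_path, vocab, embed_dim=300):
    """Build an nn.Embedding initialized from a standard GloVe .txt file.

    Tokens missing from GloVe fall back to a small random vector.
    """
    glove = {}
    with open(glove_path, "r", encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            glove[parts[0]] = np.asarray(parts[1:], dtype=np.float32)

    weights = np.random.normal(scale=0.1, size=(len(vocab), embed_dim)).astype(np.float32)
    for idx, token in enumerate(vocab):
        if token in glove:
            weights[idx] = glove[token]

    # freeze=False lets the captioner fine-tune the embeddings during training
    return nn.Embedding.from_pretrained(torch.from_numpy(weights), freeze=False)
```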
Thanks a lot! BTW, could you upload the training log? I found that the CIDEr score converges slowly during training, and I don't know whether this is normal.
@ch3cook-fdu Yes, it did take some time to converge (the best CIDEr usually appears at epoch 30+). Please see this log.txt for reference. Hope it helps!
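Since the validation CIDEr can be quite noisy from epoch to epoch, one simple option is to checkpoint on the best validation CIDEr rather than the last epoch. A minimal sketch; `train_one_epoch` and `evaluate_cider` are hypothetical helpers (one training epoch, and validation CIDEr as a float), not the functions in this repo:

```python
import torch

def train_with_best_cider(model, optimizer, train_one_epoch, evaluate_cider,
                          num_epochs=50, ckpt_path="checkpoint_best.pth"):
    """Train and keep the checkpoint with the highest validation CIDEr."""
    best_cider = float("-inf")
    for epoch in range(num_epochs):
        train_one_epoch(model, optimizer, epoch)
        cider = evaluate_cider(model)
        if cider > best_cider:
            best_cider = cider
            torch.save({"epoch": epoch, "cider": cider,
                        "model": model.state_dict()}, ckpt_path)
        print(f"epoch {epoch}: CIDEr {cider:.4f} (best {best_cider:.4f})")
```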
The CIDEr score actually fluctuates a lot during training 🤔. Thanks! That helps a lot!