Open lukemao opened 3 years ago
Sadly this does not work well. You first need to run TSDAE, then do training with your labeled data.
thanks Nils!
Hey @nreimers!
When fine-tuning on top of a TSDAE-pretrained model, is it recommended to use the same pooling method as was used during TSDAE, both for fine-tuning and at inference time? E.g., if TSDAE was performed with `pooling_model = models.Pooling(word_embedding_model.get_word_embedding_dimension(), 'cls')`, should `'cls'` pooling be used throughout? Thanks!
@bdferris642 Yes, it makes sense to use the same pooling.
Thank you!
Hi,
Thanks for publishing and sharing the TSDAE approach. I am reading through the paper. I have one question.
Section 7.4 of the paper recommends using TSDAE as a pre-training approach before in-domain fine-tuning.
Can we instead use TSDAE as the fine-tuning method on top of an out-of-the-box supervised pre-trained model like USE-large or SBERT-base-NLI?
e.g.:
Kind regards, Lu.