Closed PhilipMay closed 3 years ago
Nice to see you have done something!
Why exactly do you do this?
I am tying parameters here.
Do I still need that in my case?
I recommend keeping it.
does it "destroy" my pretrained generator and discriminator?
No
In more detail:
Feel free to tag me if you still have questions.
Ah, I see - tying means you do something like in a Siamese network, right?
Yeah, both share parameters.
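For illustration, tying in plain PyTorch just means making both modules point at the same `Parameter` object, so a gradient step through either model updates the one shared tensor. This is only a minimal sketch with hypothetical stand-in modules, not the actual code in electra_pytorch:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Two stand-alone embedding tables, hypothetical stand-ins for the
# generator's and discriminator's token embeddings.
gen_emb = nn.Embedding(10, 4)
disc_emb = nn.Embedding(10, 4)

# Tie them: reassign the discriminator's weight to the generator's
# Parameter. Both modules now share one tensor, Siamese-style.
disc_emb.weight = gen_emb.weight

# An update made through one module is visible through the other.
with torch.no_grad():
    gen_emb.weight[0, 0] = 42.0
print(disc_emb.weight[0, 0].item())  # 42.0
```

Because the tie is just a reference, nothing is copied or overwritten at tying time; pretrained values in the tensor that is kept are preserved.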
Hi, I have a pretrained ELECTRA generator and discriminator stored on disk. Both were trained on a large corpus. Now I want to train them further on a domain-specific corpus.
To do that I am loading them from disk by adding
.from_pretrained()
here: https://github.com/richarddwang/electra_pytorch/blob/ab29d03e69c6fb37df238e653c8d1a81240e3dd6/pretrain.py#L364-L365
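A minimal sketch of that loading step with HuggingFace `transformers`, assuming the checkpoints were saved with `save_pretrained()`. Tiny configs and a temporary directory stand in here for the real pretrained checkpoints on disk:

```python
import tempfile

from transformers import ElectraConfig, ElectraForMaskedLM, ElectraForPreTraining

# Tiny configs so the sketch runs quickly; real checkpoints would use
# full-size configs saved alongside the weights.
gen_cfg = ElectraConfig(vocab_size=100, embedding_size=16, hidden_size=16,
                        num_hidden_layers=1, num_attention_heads=1,
                        intermediate_size=32)
disc_cfg = ElectraConfig(vocab_size=100, embedding_size=16, hidden_size=32,
                         num_hidden_layers=1, num_attention_heads=1,
                         intermediate_size=64)

with tempfile.TemporaryDirectory() as d:
    # Simulate the pretrained generator/discriminator already on disk ...
    ElectraForMaskedLM(gen_cfg).save_pretrained(d + "/gen")
    ElectraForPreTraining(disc_cfg).save_pretrained(d + "/disc")

    # ... then load both back with .from_pretrained(), as described above.
    generator = ElectraForMaskedLM.from_pretrained(d + "/gen")
    discriminator = ElectraForPreTraining.from_pretrained(d + "/disc")

print(type(generator).__name__, type(discriminator).__name__)
```

After loading, the weight-tying line in pretrain.py would still be applied so both models share their embedding parameters.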
My question is: why exactly do you do this:
https://github.com/richarddwang/electra_pytorch/blob/ab29d03e69c6fb37df238e653c8d1a81240e3dd6/pretrain.py#L366-L367
and do I still need that in my case, or does it "destroy" my pretrained generator and discriminator?
Many thanks, Philip