if config.embedding_pretrained is not None:
    self.embedding = nn.Embedding.from_pretrained(config.embedding_pretrained, freeze=False)
I don't quite understand the freeze=False here. Does it mean the word vectors need to have their parameters updated during training? In torch 2.0.1 the default is True, which the docs describe as not updating along with the network parameters:
Args:
embeddings (Tensor): FloatTensor containing weights for the Embedding.
First dimension is being passed to Embedding as num_embeddings, second as embedding_dim.
freeze (bool, optional): If True, the tensor does not get updated in the learning process.
Equivalent to embedding.weight.requires_grad = False. Default: True
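To answer the question: yes, freeze=False makes the pretrained embedding table a trainable parameter, so the word vectors are fine-tuned together with the rest of the model during training; freeze=True (the default) keeps them fixed, exactly as the docstring says. A minimal sketch showing the difference (the 5x3 weight tensor here is random dummy data standing in for real pretrained vectors):

```python
import torch
import torch.nn as nn

# Dummy "pretrained" weights: 5 words, 3-dimensional vectors.
pretrained = torch.randn(5, 3)

# freeze=True (default): weights are fixed, excluded from gradient updates.
frozen = nn.Embedding.from_pretrained(pretrained, freeze=True)

# freeze=False: weights become trainable and are fine-tuned with the model.
trainable = nn.Embedding.from_pretrained(pretrained, freeze=False)

print(frozen.weight.requires_grad)     # False
print(trainable.weight.requires_grad)  # True
```

Whether to fine-tune is a modeling choice: freezing preserves the pretrained word vectors and reduces overfitting on small datasets, while freeze=False lets the embeddings adapt to the task's vocabulary usage, which often helps when the training set is large enough.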