Closed · Black-Rhen closed this issue 1 year ago
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Closing the issue, since no further updates were observed. Feel free to re-open if you need any further assistance.
When trying to extract word embeddings from a specific layer, I used the following code:
```python
# Get the word embeddings from layers 9 to 12
layer_start = 9   # starting layer (inclusive)
layer_end = 13    # ending layer (exclusive)
embeddings1 = []
embeddings2 = []

for layer in range(layer_start, layer_end):
    embeddings1_layer = module.get_embedding(tokens1, use_specified_layer=True, layer_num=layer)
    embeddings2_layer = module.get_embedding(tokens2, use_specified_layer=True, layer_num=layer)
```
This raised the following error:

```
Traceback (most recent call last):
  embeddings1_layer = module.get_embedding(tokens1, use_specified_layer=True, layer_num=layer)
TypeError: TransformerModule.get_embedding() got an unexpected keyword argument 'use_specified_layer'
```
It seems this argument does not exist. How can I extract word embeddings from a specific layer? Calling `get_embedding` directly appears to return only static (non-contextual) word embeddings.
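Since `get_embedding()` does not accept a layer argument, one common workaround (a minimal sketch, not this repo's actual API) is to run the encoder layers manually and collect each layer's hidden state yourself, then index the layers you want. The toy model below stands in for whatever `TransformerModule` wraps internally; all names here (`layers`, `hidden_states`, `selected`) are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the model inside TransformerModule:
# a stack of 12 encoder layers over 64-dim token embeddings.
torch.manual_seed(0)
d_model, n_layers, seq_len = 64, 12, 10
layers = nn.ModuleList(
    nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
    for _ in range(n_layers)
)

x = torch.randn(1, seq_len, d_model)  # stand-in for the static token embeddings

# Run the stack layer by layer, keeping every intermediate hidden state.
hidden_states = []  # hidden_states[i] = output of layer i+1
with torch.no_grad():
    h = x
    for layer in layers:
        h = layer(h)
        hidden_states.append(h)

# Contextual embeddings from layers 9-12 (1-indexed), as in the question.
layer_start, layer_end = 9, 13
selected = [hidden_states[i - 1] for i in range(layer_start, layer_end)]
print(len(selected), selected[0].shape)  # → 4 torch.Size([1, 10, 64])
```

If the underlying model is a Hugging Face Transformers model, the equivalent idiom is to call it with `output_hidden_states=True` and index `outputs.hidden_states[layer]`; whether that applies here depends on what `TransformerModule` wraps.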