Hi, I'm trying to use WKPooling on your fine-tuned roberta base models, but I'm getting:
File ".../lib/python3.7/site-packages/sentence_transformers/models/WKPooling.py", line 47, in forward
token_embedding = self.unify_token(token_feature)
File ".../lib/python3.7/site-packages/sentence_transformers/models/WKPooling.py", line 75, in unify_token
Q, R = torch.qr(window_matrix.T)
AttributeError: 'Tensor' object has no attribute 'T'
As far as I can tell, `Tensor.T` was only added in PyTorch 1.2, so this fails on older versions. Should it be `torch.transpose` (or `.t()`) instead, for backward compatibility?
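For what it's worth, here is a minimal sketch of the version-safe workaround, assuming `window_matrix` is a 2-D tensor as in `WKPooling.unify_token` (the random shape below is just illustrative):

```python
import torch

# Stand-in for the per-window matrix built inside unify_token.
window_matrix = torch.randn(5, 3)

# On PyTorch < 1.2, Tensor.T does not exist, but torch.transpose
# (or Tensor.t() for 2-D tensors) works on all versions and is
# equivalent to .T for a 2-D tensor.
transposed = torch.transpose(window_matrix, 0, 1)  # same as window_matrix.t()

# QR decomposition of the (3, 5) transposed matrix:
# Q has shape (3, 3), R has shape (3, 5).
Q, R = torch.qr(transposed)
```

(`torch.qr` is what the traceback shows; newer PyTorch releases deprecate it in favor of `torch.linalg.qr`, but both accept the transposed matrix the same way.)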