Open HosseinZaredar opened 2 years ago
Hi,
There was a small problem with the mask returned from TextTokenizer's forward function: the next function that consumes this mask expects a 2D tensor, so the mask should not be unsqueezed before being returned.
The problem is fixed in this pull request.
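A minimal sketch of the shape mismatch described above. The tensor names and shapes here are illustrative assumptions, not the project's actual TextTokenizer code: a padding mask built from token ids is naturally 2D (batch, seq_len), and an extra `unsqueeze` turns it into a 3D tensor that a consumer expecting 2D cannot use.

```python
import torch

# Hypothetical token batch: (batch, seq_len), with 0 as the padding id.
tokens = torch.tensor([[5, 9, 2, 0, 0]])

# The mask the downstream function expects: 2D, shape (batch, seq_len).
mask = tokens != 0

# The bug: unsqueezing before returning yields a 3D tensor
# of shape (batch, 1, seq_len) instead of the expected 2D mask.
bad_mask = mask.unsqueeze(1)

print(mask.dim())      # 2
print(bad_mask.dim())  # 3
```

The fix is simply to return `mask` as-is, without the extra `unsqueeze` call.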