Open irena123333 opened 1 year ago
Hi, can you tell me the maximum text length (in tokens) that the CLAP model can encode, so that I can limit my input text length and avoid errors? Thank you!

Hi, based on our hyperparameters, the maximum input length is 77 tokens, but increasing this length does not seem to influence performance much, since the RoBERTa text encoder is already pretrained. So, as far as we can tell, limiting the text to fewer than 70 words should be safe.
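For anyone who wants to enforce the "fewer than 70 words" guideline mentioned above, here is a minimal sketch. Note this counts whitespace-separated words, not RoBERTa tokens; `limit_words` is a hypothetical helper, and a token-exact limit would require running the actual tokenizer.

```python
def limit_words(text: str, max_words: int = 70) -> str:
    """Keep at most max_words whitespace-separated words from text."""
    words = text.split()
    return " ".join(words[:max_words])

# Texts already under the limit pass through unchanged;
# longer texts are truncated to the first max_words words.
short = limit_words("a recording of loud music")
long = limit_words("very " * 200 + "loud music")
print(len(long.split()))  # 70
```

Since one word can map to several RoBERTa tokens, keeping a margin below the 77-token limit (as the 70-word suggestion does) is safer than truncating exactly at 77 words.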