xhc19930714 closed this issue 3 years ago.
@xhc19930714 Sorry for the late response. The error means that your input sequence needs to be truncated to 512 tokens (the maximum length reported for the model), or you may switch to another model with a longer context.
I suggest truncating your data to avoid overly long sequences; see the sketch below.
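In case it helps, a minimal sketch of truncation using the Hugging Face `transformers` tokenizer (the `gpt2` checkpoint name, the 512 limit, and the call style are assumptions, since I don't know your exact preprocessing code):

```python
# Minimal sketch: truncate inputs so they never exceed the model's max length.
# Assumes the `transformers` library; adjust the checkpoint and max_length to your setup.
from transformers import GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # hypothetical checkpoint name

text = "..."  # your (possibly very long) input text

encoded = tokenizer(
    text,
    truncation=True,   # drop tokens beyond max_length instead of emitting the warning
    max_length=512,    # the limit reported in your error message
)
print(len(encoded["input_ids"]))  # will be <= 512
```

If your training script tokenizes raw text itself, the same `truncation=True, max_length=512` arguments can be applied there.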
Closed since there is no more activity.
It indicates that the token indices sequence is longer than the specified maximum sequence length for GPT (512). Where should I fix this problem? My GPUs are 2x RTX 3090. Probably large enough to run your model?