VinAIResearch / PhoBERT

PhoBERT: Pre-trained language models for Vietnamese (EMNLP-2020 Findings)
MIT License

What is the max sequence length of the encoder and decoder? #28

Closed ithieund closed 3 years ago

ithieund commented 3 years ago

Hi @datquocnguyen, can you specify the max sequence length of the model's encoder and decoder?

I want to use this model for a text generation task. Is that possible? Thank you very much!

datquocnguyen commented 3 years ago

The max sequence length is 256. Btw, PhoBERT is a BERT-based model, i.e. it uses a Transformer encoder only (there is no decoder). It is still possible to use PhoBERT for text generation: https://ai.stackexchange.com/questions/9141/can-bert-be-used-for-sentence-generating-tasks
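A practical consequence of the 256-token limit is that longer inputs must be truncated before they reach the encoder. A minimal sketch of that truncation, assuming RoBERTa-style `<s>`/`</s>` special tokens (the helper name and the token ids `0` and `2` are illustrative assumptions, not part of PhoBERT's API):

```python
def truncate_for_phobert(token_ids, max_len=256, bos_id=0, eos_id=2):
    """Fit a token-id sequence into the model's max length.

    Reserves two slots for the BOS/EOS special tokens, then clips
    the body so the final sequence never exceeds max_len.
    """
    body = token_ids[: max_len - 2]  # leave room for <s> and </s>
    return [bos_id] + body + [eos_id]


# A 1000-token input is clipped down to exactly 256 positions.
clipped = truncate_for_phobert(list(range(100, 1100)))
print(len(clipped))  # → 256
```

In practice a Hugging Face tokenizer handles this automatically when called with `truncation=True, max_length=256`; the sketch above just makes the length constraint explicit.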

ithieund commented 3 years ago

Thank you.