Open · yhy-2000 opened this issue 2 years ago

I am trying to train an LTP model to handle long documents, but where can I get a pretrained model with a max-seq-length over 512? As far as I know, the pretrained models provided by Hugging Face are all limited to length 512.
I have the same concern when using the LTP model on long documents where the token length exceeds 512. Have you figured out how to resolve this? Thanks!
Sorry, I haven't found a solution either.
Please find the comment at #8, thank you.
OK, thank you for your reply!!
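For reference, one common workaround (not discussed in this thread) for pushing a BERT-style checkpoint past 512 tokens is to resize its learned position embeddings and then fine-tune on long inputs. A minimal sketch with Hugging Face `transformers`, assuming `bert-base-uncased` and an illustrative target length of 1024:

```python
import torch
from transformers import BertModel

NEW_MAX_LEN = 1024  # illustrative target length, anything > 512

model = BertModel.from_pretrained("bert-base-uncased")

old = model.embeddings.position_embeddings.weight.data  # shape (512, hidden)
n_old, hidden = old.shape

# Build a larger table, initializing positions >= 512 by tiling the
# 512 learned embeddings (interpolation is another common choice).
new_emb = torch.nn.Embedding(NEW_MAX_LEN, hidden)
with torch.no_grad():
    new_emb.weight[:] = old.repeat(NEW_MAX_LEN // n_old + 1, 1)[:NEW_MAX_LEN]

model.embeddings.position_embeddings = new_emb
# In transformers versions that keep a (1, max_len) position_ids buffer
# on BertEmbeddings, extend it to match the new table.
model.embeddings.position_ids = torch.arange(NEW_MAX_LEN).unsqueeze(0)
model.config.max_position_embeddings = NEW_MAX_LEN
```

Positions beyond 512 start from recycled weights, so the resized model still needs further pretraining or fine-tuning on long sequences before the extra length is actually useful.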