kssteven418 / LTP

[KDD'22] Learned Token Pruning for Transformers
https://arxiv.org/abs/2107.00910
Apache License 2.0
93 stars 17 forks

Where to get the pretrained model with max-seq-length over 512? #7

Open yhy-2000 opened 2 years ago

yhy-2000 commented 2 years ago

I am trying to train an LTP model to handle long documents, but where can I get a pretrained model with a max-seq-length over 512? As far as I know, the pretrained models provided by Hugging Face are all limited to a length of 512.
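One common workaround (not part of the LTP repo itself) is to initialize a longer position-embedding matrix by tiling copies of the pretrained 512 rows, which is how Longformer was initialized from RoBERTa. The sketch below is a minimal, hypothetical illustration with NumPy; the function name `extend_position_embeddings` and the toy shapes are assumptions, not part of LTP:

```python
import numpy as np

def extend_position_embeddings(old_emb: np.ndarray, new_max_len: int) -> np.ndarray:
    """Extend a (old_max_len, hidden) position-embedding matrix to
    (new_max_len, hidden) by tiling copies of the pretrained rows.
    This mirrors the copy-initialization used when building Longformer
    from a 512-length RoBERTa checkpoint."""
    old_max_len, hidden = old_emb.shape
    new_emb = np.empty((new_max_len, hidden), dtype=old_emb.dtype)
    # Fill the new matrix block by block with copies of the old rows.
    for start in range(0, new_max_len, old_max_len):
        end = min(start + old_max_len, new_max_len)
        new_emb[start:end] = old_emb[: end - start]
    return new_emb

# Toy example: extend a 512-row matrix (hidden size 4) to 1024 rows.
old = np.arange(512 * 4, dtype=np.float32).reshape(512, 4)
new = extend_position_embeddings(old, 1024)
assert new.shape == (1024, 4)
assert np.array_equal(new[512:1024], old)  # second block is a copy
```

After extending the matrix this way, the model would still need fine-tuning so the attention layers adapt to the longer positions; copied embeddings alone do not guarantee good long-document performance.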

XueqiYang commented 2 years ago

I have the same concern when using the LTP model on long documents, where the token length is greater than 512. Have you figured out how to resolve this? Thanks!

yhy-2000 commented 2 years ago

Sorry, I haven't found a solution either.


kssteven418 commented 2 years ago

Please find the comment at #8, thank you.

yhy-2000 commented 2 years ago

OK, thank you for your reply!!
