Closed: ShivanshuPurohit closed this issue 4 years ago
Hello Shivanshu, I'm not entirely sure what you mean. BERT-based (and similar) keyword extractors work in a supervised manner and are a different branch of methods. You could theoretically generate keywords with RaKUn and then learn them with BERT (a rough sketch of that idea follows below the quoted message); however, that sounds a bit like fully neural approaches such as TNT-KID: https://gitlab.com/matej.martinc/tnt_kid/-/tree/master
On Wed, Oct 21, 2020 at 8:36 AM Shivanshu Purohit notifications@github.com wrote:
Can I use a specific BERT model with this code? Say I want to use bert-base-uncased to extract keywords. Will it work here?
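To make the distinction above concrete, here is a minimal sketch (not from the original thread) of what "generate keywords with RaKUn and then learn them with BERT" could look like. The RaKUn hyperparameter names follow the project README from around the time of this issue and may differ in newer versions; the transformers part only illustrates that bert-base-uncased enters as a separate, supervised token-classification model, and the actual fine-tuning on the RaKUn-produced labels is omitted.

```python
# Sketch only: RaKUn (unsupervised, graph-based) produces keywords, which could
# then serve as weak labels for a supervised BERT token classifier.
# Assumes the mrakun API from around 2020; hyperparameter names are illustrative.
from mrakun import RakunDetector
from nltk.corpus import stopwords  # requires: nltk.download("stopwords")
from transformers import AutoTokenizer, AutoModelForTokenClassification

# 1) Unsupervised keyword extraction with RaKUn.
hyperparameters = {
    "distance_threshold": 2,
    "distance_method": "editdistance",
    "num_keywords": 10,
    "pair_diff_length": 2,
    "stopwords": stopwords.words("english"),
    "bigram_count_threshold": 2,
    "num_tokens": [1, 2],
    "max_similar": 3,
    "max_occurrence": 3,
}
detector = RakunDetector(hyperparameters)
document = "Keyword extraction identifies the most informative terms in a document."
keywords = detector.find_keywords(document, input_type="text")  # (keyword, score) pairs
print(keywords)

# 2) bert-base-uncased is a separate, supervised component: the RaKUn output
#    could act as (noisy) targets for keyword/not-keyword token classification.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained("bert-base-uncased", num_labels=2)
encoded = tokenizer(document, return_tensors="pt")
outputs = model(**encoded)  # untrained head; fine-tuning on the RaKUn labels not shown
```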
Hope the related approach suited your needs; I'm closing this issue until this is made more concrete.