yourh / AttentionXML

Implementation for "AttentionXML: Label Tree-based Attention-Aware Deep Model for High-Performance Extreme Multi-Label Text Classification"

What's the difference between AttentionXML and fastAttentionXML? #8

Closed · MaYB2333 closed this issue 4 years ago

MaYB2333 commented 4 years ago

Hi yourh!

I found two models in this repo, AttentionXML and FastAttentionXML, but the latter does not seem to appear in your paper. I also can't find where the PLT module is used in AttentionXML. Could you tell me the difference between these two models, and where the PLT module is used in AttentionXML?

Thank you very much!

yourh commented 4 years ago

FastAttentionXML is AttentionXML with a PLT (probabilistic label tree).

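For readers unfamiliar with the idea: a PLT groups labels into a shallow tree and, at prediction time, expands only the most promising branches, so the model never has to score every label for every document. The snippet below is a minimal, illustrative sketch of that mechanism only, not the repo's implementation; the clustering method, tree depth, beam width, and scoring function here are simplified placeholders.

```python
# Illustrative PLT sketch (not FastAttentionXML's code): recursively cluster
# labels into a tree, then beam-search the tree so only a small subset of
# labels is ever considered for a given document.
import numpy as np
from sklearn.cluster import KMeans


def build_label_tree(label_features, labels, max_leaf=8):
    """Recursively split labels into 2 clusters until each leaf holds <= max_leaf labels."""
    if len(labels) <= max_leaf:
        return {"labels": labels}                        # leaf: a small group of labels
    km = KMeans(n_clusters=2, n_init=10).fit(label_features)
    split = km.labels_ == 0
    if split.all() or not split.any():                   # degenerate split: fall back to halving
        split = np.arange(len(labels)) < len(labels) // 2
    return {"children": [
        build_label_tree(label_features[split], labels[split], max_leaf),
        build_label_tree(label_features[~split], labels[~split], max_leaf),
    ]}


def beam_search(tree, score_fn, beam=2):
    """Expand only the top-`beam` nodes per level; score_fn rates a node for one document."""
    frontier = [tree]
    while any("children" in node for node in frontier):
        candidates = []
        for node in frontier:
            candidates.extend(node.get("children", [node]))
        candidates.sort(key=score_fn, reverse=True)
        frontier = candidates[:beam]
    # Candidate labels that the full attention model would then rank.
    return np.concatenate([node["labels"] for node in frontier])


# Toy usage: 100 labels with random 16-d "features" and a random scoring function.
rng = np.random.default_rng(0)
features, label_ids = rng.normal(size=(100, 16)), np.arange(100)
tree = build_label_tree(features, label_ids)
print(beam_search(tree, score_fn=lambda node: rng.random()))
```
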
MaYB2333 commented 4 years ago

Thank you very much! So how can I run FastAttentionXML on the Wiki10-31K dataset? It seems there is no ['train']['sparse'] file.

yourh commented 4 years ago

You can download it from the XMLRepository.

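If the download is not an option, a sparse feature file can also be generated from the raw training texts. The sketch below is only a hedged illustration: it assumes (without confirmation from the repo) that the config's ['train']['sparse'] entry points to a SciPy sparse TF-IDF matrix saved with scipy.sparse.save_npz, and the file paths are hypothetical. Check the repo's preprocessing scripts for the exact expected format before relying on this.

```python
# Hedged sketch: build sparse TF-IDF training features yourself.
# Assumptions (not confirmed by the repo): the pipeline accepts a SciPy sparse
# matrix saved as .npz; paths and the vocabulary cap are hypothetical.
from pathlib import Path

import scipy.sparse as sp
from sklearn.feature_extraction.text import TfidfVectorizer


def build_sparse_features(train_texts_path: str, out_path: str) -> None:
    """Fit a TF-IDF vectorizer on raw training texts (one document per line) and save the matrix."""
    texts = Path(train_texts_path).read_text(encoding="utf-8").splitlines()
    vectorizer = TfidfVectorizer(max_features=500_000)   # hypothetical vocabulary cap
    features = vectorizer.fit_transform(texts)            # scipy.sparse CSR matrix
    sp.save_npz(out_path, features)


# Hypothetical file names; they depend on how you prepared Wiki10-31K.
# build_sparse_features("data/Wiki10-31K/train_texts.txt",
#                       "data/Wiki10-31K/train.sparse.npz")
```
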
MaYB2333 commented 4 years ago

I see, thank you very much!