OctoberChang / X-Transformer

X-Transformer: Taming Pretrained Transformers for eXtreme Multi-label Text Classification

modeling.py: difference between RobertaForXMLC (etc.) class and transformers.RobertaForSequenceClassification? #8

Closed: simonlevine closed this issue 3 years ago

simonlevine commented 3 years ago

Hi, it's my understanding that the transformers.RobertaForSequenceClassification variant is identical in function to the RobertaForXMLC class defined in xbert/modeling.py. Is this correct? Thanks!
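For context, here is a minimal sketch of a multi-label (XMLC-style) head on top of RoBERTa. It assumes the usual distinction that a multi-label head scores every label independently with BCEWithLogitsLoss, whereas the single-label ForSequenceClassification head uses CrossEntropyLoss over one gold class; this is an assumption about the typical setup, not a claim about what xbert/modeling.py actually implements. The class name RobertaMultiLabelHead is hypothetical.

```python
import torch
import torch.nn as nn
from transformers import RobertaModel


class RobertaMultiLabelHead(nn.Module):
    """Hypothetical multi-label classification head on top of RoBERTa.

    Assumed difference from a single-label head: the loss is
    BCEWithLogitsLoss over all labels (each label scored independently)
    rather than CrossEntropyLoss over a single gold class.
    """

    def __init__(self, model_name="roberta-base", num_labels=32):
        super().__init__()
        self.roberta = RobertaModel.from_pretrained(model_name)
        hidden = self.roberta.config.hidden_size
        self.dropout = nn.Dropout(0.1)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask=None, labels=None):
        outputs = self.roberta(input_ids, attention_mask=attention_mask)
        # Use the representation of the first (<s>) token as the pooled output.
        pooled = outputs.last_hidden_state[:, 0]
        logits = self.classifier(self.dropout(pooled))
        loss = None
        if labels is not None:
            # labels: float tensor of shape (batch, num_labels) with 0/1 entries.
            loss = nn.BCEWithLogitsLoss()(logits, labels.float())
        return loss, logits
```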

OctoberChang commented 3 years ago

This repo may be out of date. Feel free to try out the latest X-Transformer implementation in the PECOS repo (https://github.com/amzn/pecos), thanks.