Closed Morizeyao closed 4 years ago
T5 supports any task which can be cast as a text-to-text task, and I would argue NER can be cast as a text-to-text task something like so: Input: "Jim bought 300 shares of Acme Corp. in 2006." Target: "Jim [Person] Acme Corp. [Organization] 2006 [Time]" or something. We haven't tried T5 on any NER tasks but I would be surprised if this didn't work since other span-prediction tasks (like SQuAD) work well.
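To make the suggested format concrete, here is a minimal sketch (names and format are illustrative assumptions, not an official T5 preprocessing routine) of turning a sentence with labeled entity spans into a text-to-text input/target pair:

```python
# Hypothetical helper: cast an NER example as a text-to-text pair.
# `entities` is a list of (surface_text, label) tuples in order of appearance.
def ner_to_text2text(sentence, entities):
    target = " ".join(f"{span} [{label}]" for span, label in entities)
    return sentence, target

inp, tgt = ner_to_text2text(
    "Jim bought 300 shares of Acme Corp. in 2006.",
    [("Jim", "Person"), ("Acme Corp.", "Organization"), ("2006", "Time")],
)
# tgt -> "Jim [Person] Acme Corp. [Organization] 2006 [Time]"
```

Pairs like `(inp, tgt)` can then be fed to any T5 fine-tuning pipeline like any other text-to-text task.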
@Morizeyao did you try T5 with NER? Did you get good results?
We are trying to fine-tune T5 on a downstream NER task, and the results are not very good.
@craffel I have the same question as @Guillem96, and that issue does not seem to be resolved yet. Would you mind reopening this ticket?
We really can't provide support of the kind "T5 didn't work well for me on task X". Whether it worked or not will depend on a lot of factors (the dataset itself, the size of the dataset, the amount of fine-tuning, the chosen task format, etc). I can tell you at least that we have had great success applying T5 to NER tasks.
@craffel , would you have links to some papers / examples where T5 was applied to BIO-labeling type tasks or token-level binary classification tasks, which could be used for inspiration? Thanks in advance!
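One practical starting point, if your data is already in BIO format: convert the BIO tags into entity spans first, then serialize those spans into a text target as discussed above. A rough sketch (function name and span format are my own assumptions, not from any paper):

```python
# Hypothetical helper: collapse BIO-tagged tokens into (span_text, label) pairs,
# which can then be serialized into a T5 text target.
def bio_to_spans(tokens, tags):
    spans, cur, label = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if cur:
                spans.append((" ".join(cur), label))
            cur, label = [tok], tag[2:]
        elif tag.startswith("I-") and cur:
            cur.append(tok)
        else:  # "O" tag or stray "I-" with no open span
            if cur:
                spans.append((" ".join(cur), label))
            cur, label = [], None
    if cur:
        spans.append((" ".join(cur), label))
    return spans

spans = bio_to_spans(
    ["Jim", "bought", "Acme", "Corp."],
    ["B-Person", "O", "B-Org", "I-Org"],
)
# spans -> [("Jim", "Person"), ("Acme Corp.", "Org")]
```

For evaluation you would run the conversion in reverse: parse the generated target string back into spans and score against the gold BIO labels.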
Hi there, just as the title says. I wonder, does T5 support sequence labeling (like NER) tasks?
I don't see a report of such tasks in the paper (correct me if I am wrong).
Thank you very much!