google-research / adapter-bert


missing processors #5

Open jacobdeasy opened 4 years ago

jacobdeasy commented 4 years ago

Congratulations on the great paper!

One question: do you have additional processor classes? At the moment, the code reads:

```python
processors = {
    "cola": ColaProcessor,
    "mnli": MnliProcessor,
    "mrpc": MrpcProcessor,
}
```

and later:

```python
if task_name not in processors:
    raise ValueError("Task not found: %s" % (task_name))
```

meaning that only 3 datasets can be used for training.

It would be useful for replication purposes to have all processors available. Let me know if this is possible or if I have misunderstood!

neilhoulsby commented 4 years ago

run_classifier.py just serves as an example and should be relatively easy to extend to other tasks. See, for example, the original BERT codebase (https://github.com/google-research/bert/), which includes an XNLI processor as well as a separate main script for SQuAD question answering.
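For anyone extending the script, below is a minimal sketch of what a new task processor might look like. It assumes the `DataProcessor`/`InputExample` interface from the original BERT `run_classifier.py` and a hypothetical two-column TSV layout (sentence, label); the class name, task key, and file names are illustrative, not part of this repository.

```python
import os

# Sketch to be added inside run_classifier.py alongside the existing
# processors, so DataProcessor and InputExample are already in scope.

class SentenceClassificationProcessor(DataProcessor):
  """Hypothetical processor for a TSV dataset with `sentence<TAB>label` rows."""

  def get_train_examples(self, data_dir):
    return self._create_examples(
        self._read_tsv(os.path.join(data_dir, "train.tsv")), "train")

  def get_dev_examples(self, data_dir):
    return self._create_examples(
        self._read_tsv(os.path.join(data_dir, "dev.tsv")), "dev")

  # If --do_predict is used, a get_test_examples method may be needed as well.

  def get_labels(self):
    return ["0", "1"]  # adjust to the task's label set

  def _create_examples(self, lines, set_type):
    examples = []
    for (i, line) in enumerate(lines):
      if i == 0:  # skip the header row, if the TSV has one
        continue
      guid = "%s-%s" % (set_type, i)
      examples.append(InputExample(
          guid=guid, text_a=line[0], text_b=None, label=line[1]))
    return examples


# Register the new task so the task_name lookup above finds it.
processors = {
    "cola": ColaProcessor,
    "mnli": MnliProcessor,
    "mrpc": MrpcProcessor,
    "mytask": SentenceClassificationProcessor,  # hypothetical entry
}
```

With an entry like that in place, running the script with `--task_name=mytask` should dispatch to the new processor without other changes.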