ybracke / transnormer

A lexical normalizer for historical spelling variants using a transformer architecture.
GNU General Public License v3.0

Improve configuration of training arguments #89

Open ybracke opened 7 months ago

ybracke commented 7 months ago

Concerns file: train_model.py

There are some hard-coded training arguments:

```python
predict_with_generate=False,
group_by_length=True,
```

In addition, many possible training arguments are not configurable through the current code at all. It would probably make sense to make these arguments specifiable in the config file and then pass them to `Seq2SeqTrainingArguments` as a whole via `**kwargs`. A minimal sketch of this idea follows.
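
Rough sketch of the proposed approach, assuming a JSON config file with a `training_args` section; the file name, key names, and default values shown here are illustrative, not taken from the current code:

```python
# Sketch only: the config layout ("training_args" section, "output_dir" key)
# is a hypothetical example of how the options could be exposed.
import json

from transformers import Seq2SeqTrainingArguments

with open("training_config.json") as f:
    config = json.load(f)

# Values the script currently hard-codes become defaults that the config can override.
training_kwargs = {
    "predict_with_generate": False,
    "group_by_length": True,
}
training_kwargs.update(config.get("training_args", {}))

training_args = Seq2SeqTrainingArguments(
    output_dir=config["output_dir"],
    **training_kwargs,
)
```

This way any argument accepted by `Seq2SeqTrainingArguments` can be set from the config file without further changes to `train_model.py`.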