kathrinse / be_great

A novel approach for synthesizing tabular data using pretrained large language models
MIT License

Changing default behaviour of weight saving? #2

Closed · 903124 closed 1 year ago

903124 commented 1 year ago

Currently the model weights are checkpointed every 500 steps, which quickly fills up hard disk space. Is it possible to change this so the weights are simply updated during training instead of being saved so often?

unnir commented 1 year ago

Sure!

Please do this:

from be_great import GReaT

model = GReaT(llm='distilgpt2', epochs=50, save_steps=400000)

You can pass any training arguments from the Hugging Face framework: https://huggingface.co/docs/transformers/main_classes/trainer#transformers.TrainingArguments
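For example, here is a minimal sketch that also caps how many checkpoints stay on disk. It assumes GReaT forwards extra keyword arguments to TrainingArguments, as the call above suggests; save_steps and save_total_limit are standard TrainingArguments parameters, and the tiny DataFrame is just placeholder data.

import pandas as pd
from be_great import GReaT

# Placeholder data; in practice you would load your own table.
data = pd.DataFrame({'age': [23, 45, 31], 'income': [40000, 85000, 52000]})

model = GReaT(
    llm='distilgpt2',
    epochs=50,
    save_steps=2000,       # checkpoint less often than the 500-step default
    save_total_limit=1,    # keep only the most recent checkpoint on disk
)
model.fit(data)

With save_total_limit set, the Hugging Face Trainer deletes older checkpoints as training proceeds, which directly addresses the disk-space issue.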

Hope it helps!