pytorch / torchtune

PyTorch native finetuning library
https://pytorch.org/torchtune/main/
BSD 3-Clause "New" or "Revised" License

[Refactor] Default all configs to `epochs: 1` #1980

Closed · joecummings closed 1 week ago

joecummings commented 1 week ago

The purpose of our configs is to provide a starting point for finetuning a given model. Therefore, 1 epoch is plenty to verify that the model works OOTB. The risk of defaulting to more than 1 epoch is that users will wait longer to get feedback on their finetuning runs.

Should be as simple as a find-and-replace across the configs.
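
For illustration, a minimal sketch of what the updated default could look like in a recipe config; the surrounding field names and values are assumptions for context, not copied from any specific torchtune config:

```yaml
# Hypothetical fragment of a finetuning recipe config.
# Only the `epochs` default is the subject of this issue; the other fields
# are illustrative assumptions, not taken verbatim from torchtune.
batch_size: 2
epochs: 1            # proposed default: a single epoch is enough for a quick OOTB run
gradient_accumulation_steps: 1
```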