bghira / SimpleTuner

A general fine-tuning kit geared toward diffusion models.
GNU Affero General Public License v3.0
1.81k stars · 172 forks

Config Preservation Feature Request #1021

Closed rafstahelin closed 1 month ago

rafstahelin commented 1 month ago

Overview

We propose enhancing SimpleTuner's training script to automatically save the config.json training file to the model's output path. This feature aims to preserve crucial training parameters alongside the model checkpoints.

Rationale

Many users download models from cloud services and subsequently delete the training environment. Storing the config.json file with the checkpoints ensures that training parameters remain accessible, providing a complete record of the training process.

Current Situation

The training script already saves several important files in the output path:

Adding the config.json file would complete this set of essential training artifacts.

Proposed Implementation

Modify the training script to copy the config.json file to the model's output directory.

Implement this action at the beginning of the training process to ensure it's saved even if training is interrupted.

Consider suffixing the copied file (e.g., training_config.json) to avoid potential conflicts with other configuration files, perhaps including a date-time stamp and the instance-prompt name, e.g. config-comixflux-202412312400.json.
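The copy-at-startup idea above could be sketched roughly as follows. This is a minimal illustration, not SimpleTuner's actual code; the function and argument names (preserve_training_config, config_path, output_dir, instance_name) are hypothetical.

```python
import shutil
from datetime import datetime
from pathlib import Path

def preserve_training_config(config_path: str, output_dir: str, instance_name: str) -> Path:
    """Copy the training config into the model output directory.

    Hypothetical helper: run once at the start of training so the config
    survives even if the run is interrupted.
    """
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    # Suffix with the instance-prompt name and a timestamp to avoid
    # clobbering configs from earlier runs, e.g. config-comixflux-202412312400.json
    stamp = datetime.now().strftime("%Y%m%d%H%M")
    dest = out / f"config-{instance_name}-{stamp}.json"
    shutil.copy2(config_path, dest)  # copy2 preserves file metadata
    return dest
```

Calling this before the training loop starts would satisfy the "saved even if training is interrupted" requirement.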

Benefits

Complete training record: ensures all critical training information is preserved with the model.

Improved reproducibility: makes it easier to replicate or adjust training parameters in future sessions.

Enhanced portability: facilitates easier sharing and understanding of models among users.

Simplified workflow: eliminates the need for manual config preservation.

Technical Considerations

Ensure the copying process doesn't interfere with the existing training workflow.

Consider implementing a versioning system if multiple training runs are performed on the same model.
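The versioning consideration could look something like the sketch below, assuming a simple numeric-suffix scheme (this naming convention is an assumption, not SimpleTuner's behaviour):

```python
from pathlib import Path

def versioned_config_name(output_dir: str, base: str = "training_config") -> Path:
    """Return a non-clashing config path in output_dir.

    Hypothetical versioning scheme: training_config.json for the first run,
    then training_config.v2.json, training_config.v3.json, and so on.
    """
    out = Path(output_dir)
    candidate = out / f"{base}.json"
    version = 2
    while candidate.exists():
        # Bump the version suffix until we find an unused name
        candidate = out / f"{base}.v{version}.json"
        version += 1
    return candidate
```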

We believe this feature will significantly improve the user experience and the overall utility of models trained with SimpleTuner. Your feedback and suggestions on this proposal are welcome.

bghira commented 1 month ago

there is a simpler solution: don't delete the training environment without saving the config.

it already allows you to use subfolders for organising configs.

putting it into the output dir means it will be published to the hub automatically etc., which isn't what a lot of users want. it may contain secrets or internal information that users do not want made public.

the output dir contains output from the trainer, but the config is an input. manage inputs separately.

rafstahelin commented 1 month ago

Cool, makes sense. I'll think about it differently then.