Open ArgentVASIMR opened 7 months ago
Thank you for the suggestion. `--fp8_base` is an experimental feature, because I have not tested the scripts in this repo thoroughly with PyTorch 2.1. However, the community has reported no big issues with 2.1, so I think we can move on to 2.1. Prodigy and D-Adaptation are in a similar state, but it will be no problem to include them in requirements.txt now. Also, bitsandbytes now releases an official Windows package. I will update requirements.txt and the instructions soon.
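A requirements.txt fragment reflecting this reply might look like the following. This is a sketch, not the maintainer's actual change: the PyPI package names `prodigyopt` and `dadaptation`, and the loose `>=2.1.0` pin, are assumptions.

```text
# Sketch only -- exact pins chosen by the maintainer are not shown in this thread
torch>=2.1.0
bitsandbytes
prodigyopt
dadaptation
```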
--fp8_base
is an implemented commandline arg in sd-scripts for (SDXL_)train_network.py, and yet the install instructions in README.md still have the outdated and incompatible Torch v2.0.1.I propose the following install commands:
This incorporates a compatible version of torch into sd-scripts, while automatically installing compatible versions of torchvision and xformers without needing to pin their exact versions.
Many people use the LR-free optimisers Prodigy and D-Adaptation (with its variants), and I do not see how their ambient presence in an sd-scripts install could harm said install. If someone explicitly does not wish to have the optimisers installed, they will likely have the experience to simply not run the command that installs them.
Additionally, I do not see a need for bitsandbytes to be optional, so it should also be returned to the main list of install instructions. Repeating what I said regarding the LR-free optimisers, if someone explicitly does not wish to have bitsandbytes installed, then they will likely have the experience to simply not run the command that installs it.
This probably should have been a PR, but I believed the change was slight enough not to necessitate one, and I lack the necessary experience with GitHub to write a proper PR.