Validated (locally) the Fine-Tuning Scheduler (FTS) tutorial for FTS/Lightning/PyTorch 2.4.0 (as of the recent 2.4.0 Lightning commit).
(this PR is currently using 2.3.3 until Lightning and FTS 2.4.x are released)
The only minor changes in this PR are to explicitly set the `datasets` `trust_remote_code` argument (which will become mandatory with Datasets 3.x) and to remove a reference to the now-unsupported PT 2.0.x.
Thank you for all your work and your consistently valuable contributions to the open-source ML community!
Before submitting
[X] Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
[X] Did you make sure to update the docs?
[X] Did you write any new necessary tests?
What does this PR do?
Updates the Fine-Tuning Scheduler tutorial to prepare for 2.4.0 tutorial publishing.
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.
Did you have fun?
Make sure you had fun coding 🙃