Description of the change
Update test_launch_script.py to handle exceptions that accelerate.commands.launch.launch_command(args) raises when the accelerate option num_processes is greater than 1.
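The exception handling described above could look roughly like the sketch below. The wrapper name safe_launch and the re-raised message are illustrative assumptions, not the PR's actual code; the only point is that the test wraps the launcher call so a multi-process failure surfaces as a single clear error.

```python
# Hedged sketch: wrap the accelerate launcher call so tests fail with a
# clear message instead of a raw traceback when num_processes > 1.
# `safe_launch` is a hypothetical helper name, not part of accelerate.
def safe_launch(launch_fn, args):
    try:
        return launch_fn(args)
    except Exception as exc:  # launch_command can raise on multi-process setups
        raise RuntimeError(f"accelerate launch failed: {exc}") from exc
```

A test can then assert on the single RuntimeError rather than on whichever exception type the launcher happened to raise.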
Update tests in test_launch_script.py to try to use a free port. This is useful if you have a default accelerate_config.yml file (e.g. under ~/.cache/transformers) with a num_processes > 1 field.
Updated tests that set the training argument --eval_strategy epoch to fall back to --evaluation_strategy for backwards compatibility with older versions of transformers (e.g. 4.39.3).
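The fallback can be sketched as a small version check. The exact transformers version at which --evaluation_strategy was renamed to --eval_strategy is an assumption here (4.41 is used below), and the function name is illustrative:

```python
def eval_strategy_flag(transformers_version: str) -> str:
    # Hypothetical helper: pick the training-argument name based on the
    # installed transformers version. The 4.41 cutover is an assumption.
    major, minor = (int(part) for part in transformers_version.split(".")[:2])
    if (major, minor) >= (4, 41):
        return "--eval_strategy"
    return "--evaluation_strategy"
```

With transformers pinned to 4.39.3 (see below), the tests would select --evaluation_strategy.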
Because fms_acceleration has a constraint to transformers<4.40.0, transformers gets set to 4.39.3 when you use poetry to install fms-hf-tuning (even if you do not add the CLI arg --with fms-accel).
This PR introduces a get_train_args() method in test_sft_trainer.py for building the configs.TrainingArguments object.
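A helper like this typically merges shared defaults with per-test overrides. The sketch below is an assumption about its shape: it returns a plain dict for illustration, whereas the real helper builds the fms-hf-tuning configs.TrainingArguments dataclass, and the default values shown are placeholders.

```python
def get_train_args(output_dir, **overrides):
    # Hedged sketch of a test helper: shared defaults for every test,
    # with keyword overrides per test case. Values are illustrative only.
    defaults = {
        "output_dir": output_dir,
        "num_train_epochs": 1,
        "per_device_train_batch_size": 4,
    }
    defaults.update(overrides)
    return defaults
```

Centralizing argument construction this way keeps individual tests short and makes version-dependent tweaks (such as the eval-strategy fallback) a one-place change.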
Related issue number
Contributes to #60
How to verify the PR
Use poetry install to install fms-hf-tuning from source. You will end up installing the Python dependencies listed in poetry.lock.
Was the PR tested
[ ] I have added >=1 unit test(s) for every new method I have added.