Jing-L97 opened 4 months ago
There seems to be an issue with the `if` statements on lines 74, 398, and 424 of `trlx/trainer/accelerate_ppo_trainer.py`. The check for `self.model.peft_type` should be guarded with `hasattr`, like this:

```python
if ... and hasattr(self.model, "peft_type")
```
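A minimal sketch of why the guard matters, assuming `self.model` may be either a plain model (no `peft_type` attribute) or a PEFT-wrapped one; the stand-in classes below are hypothetical, not trlx code:

```python
class PlainModel:
    """Stand-in for a model without PEFT; it has no `peft_type` attribute."""


class PeftWrappedModel:
    """Stand-in for a PEFT-wrapped model that exposes `peft_type`."""
    peft_type = "LORA"


def uses_peft(model) -> bool:
    # Check hasattr first: accessing model.peft_type directly on a plain
    # model would raise AttributeError, which is the reported bug pattern.
    return hasattr(model, "peft_type") and model.peft_type is not None


print(uses_peft(PlainModel()))        # no attribute -> safely False
print(uses_peft(PeftWrappedModel()))  # attribute present -> True
```

With the `hasattr` guard in front, the second operand is only evaluated when the attribute actually exists, so the check degrades gracefully for non-PEFT models.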
🐛 Describe the bug
Hi, we encountered the `DistributedDataParallel` issue when running the example code with Ray optimization, in which we set `Distributed Type: no`:

```shell
ray start --head --port=6379
python -m trlx.sweep --config configs/sweeps/ppo_sweep.yml --accelerate_config configs/accelerate/ddp.yaml --num_gpus 4 examples/ppo_sentiments.py
```
Here is the traceback we encountered:

The same error occurred when we changed the config file to the yml settings below.
Thank you very much!
Which trlX version are you using?
https://github.com/CarperAI/trlx/tree/3340c2f3a56d1d14fdd5f13ad575121fa26b6d92
Additional system and package information
transformers==4.32.0, python==3.9