Lightning-Universe / lightning-transformers

Flexible components pairing 🤗 Transformers with ⚡ PyTorch Lightning
https://lightning-transformers.readthedocs.io
Apache License 2.0

PL has changed the parameters of Trainer #238

Closed ElderWanng closed 2 years ago

ElderWanng commented 2 years ago

🐛 Bug

In the latest pytorch-lightning (1.6.x), the argument for the number of GPUs has changed to `devices`, but this project's requirement is `pytorch-lightning>=1.4.0`, so pip automatically installs 1.6.x, which conflicts with the config files.
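For reference, a minimal sketch of the rename (assuming pytorch-lightning>=1.6; as far as I can tell `gpus` is still accepted there, but `accelerator`/`devices` is the new spelling):

```python
# Minimal sketch of the Trainer argument change, assuming pytorch-lightning>=1.6.
import pytorch_lightning as pl

# Older style: the number of GPUs is passed via `gpus`.
trainer_old = pl.Trainer(gpus=1)

# 1.6+ style: pick the accelerator type and the number of devices separately.
trainer_new = pl.Trainer(accelerator="gpu", devices=1)
```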

Borda commented 2 years ago

@ElderWanng mind sending a PR? 🐰

tanmoyio commented 2 years ago

@ElderWanng if you are not working on it, I want to make a PR. cc @Borda

Borda commented 2 years ago

> if you are not working on it, I want to make a PR

that would be great!

tanmoyio commented 2 years ago

@Borda I am preparing the PR, but I have a few points of confusion:

  1. pl.Trainer still has the gpus argument, so should I keep it alongside the devices argument? Also, I have seen some benchmark config files that contain gpus; should I change those?

  2. Also, several commands in the main README.md could be updated, e.g. from `python train.py task=nlp/language_modeling dataset=nlp/language_modeling/wikitext trainer.gpus=1 training.batch_size=8` to `python train.py task=nlp/language_modeling dataset=nlp/language_modeling/wikitext trainer.accelerator=gpu training.batch_size=8`. We don't need to manually set devices if we set the accelerator, correct me if I am wrong (see the sketch below).

Let me know your opinion.
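As a rough illustration of how I understand the mapping (a sketch only, assuming pytorch-lightning>=1.6; whether an unset `devices` falls back to a sensible default is an assumption on my part):

```python
# Sketch of how the Hydra overrides map onto pl.Trainer keyword arguments
# (assumes pytorch-lightning>=1.6; the default device count when `devices`
# is left unset is an assumption, so being explicit seems safer).
import pytorch_lightning as pl

# trainer.gpus=1           ->  pl.Trainer(gpus=1)              # old style
# trainer.accelerator=gpu  ->  pl.Trainer(accelerator="gpu")   # new style, device count left to PL
# trainer.devices=1        ->  pl.Trainer(devices=1)

# Explicit about both, to avoid relying on a default:
trainer = pl.Trainer(accelerator="gpu", devices=1)
```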