Closed — Feobi1999 closed this issue 2 years ago
Hi, @Feobi1999,
You don't need to modify any other parameters, and you don't need to modify the batch size either. We also highly recommend taking a look at the PyTorch Lightning and DPT instructions.
Hope this helps!
If I use eight GPUs, in addition to setting the batch size to 8, do I need to modify other parameters such as --num_nodes and lr?
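To illustrate the maintainer's answer, here is a minimal sketch of a single-node multi-GPU setup with the PyTorch Lightning `Trainer`. This is a hypothetical configuration, not this repo's actual launch code, and the argument names follow recent Lightning versions (older releases use `gpus=8` instead of `accelerator`/`devices`):

```python
# Hypothetical sketch (assumption: standard PyTorch Lightning Trainer API).
# With 8 GPUs on one machine, Lightning distributes work across devices
# itself: --num_nodes stays at 1 (it counts machines, not GPUs), and the
# per-device batch size and lr can be left as in the single-GPU config.
import pytorch_lightning as pl

trainer = pl.Trainer(
    accelerator="gpu",
    devices=8,       # number of GPUs on this node
    num_nodes=1,     # number of machines; only >1 on multi-node clusters
    strategy="ddp",  # one process per GPU, each using the per-device batch size
)
```

The key point is that `--num_nodes` refers to machines, not GPUs, so it stays at its default for a single 8-GPU server.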