Hello, Jialian.
I am currently training the model on 4 3090 GPUs and I find that the batch size is small.
I have changed SOLVER: IMS_PER_BATCH from 64 to 128 in config/base.yaml, but the memory consumption doesn't seem to increase.
Could you please tell me how I can increase it?
Thanks a lot.
Thanks for your interest in GRiT.
The batch size is defined here: https://github.com/JialianW/GRiT/blob/39b33dbc0900e4be0458af14597fcb1a82d933bb/configs/GRiT_B_ObjectDet.yaml#L19
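For reference, here is a minimal sketch of what the override might look like, assuming the run uses configs/GRiT_B_ObjectDet.yaml and that it inherits from the base config via a `_BASE_` key (the exact file names and structure are assumptions based on the linked line; check the repository's configs for the actual layout):

```yaml
# configs/GRiT_B_ObjectDet.yaml (sketch; only the relevant keys shown)
_BASE_: "Base.yaml"
SOLVER:
  # Total images per training iteration across all GPUs; with 4 GPUs this
  # works out to IMS_PER_BATCH / 4 images per GPU, so raising it should
  # visibly increase per-GPU memory consumption.
  IMS_PER_BATCH: 128
```

If the dataset-specific config sets SOLVER.IMS_PER_BATCH at the linked line, it would override whatever is written in the base config, which could explain why editing config/base.yaml alone did not change memory usage.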