zihangJiang / TokenLabeling

Pytorch implementation of "All Tokens Matter: Token Labeling for Training Better Vision Transformers"
Apache License 2.0
426 stars 36 forks

BatchSize Specified #26

Open HiIcy opened 2 years ago

HiIcy commented 2 years ago

If I want to train on a single GPU (1p), what batch size should I allocate? Or is there a formula to compute it? Please advise.

zihangJiang commented 2 years ago

Hi, we use batch_size=1024 for most of our experiments.
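If 1024 does not fit in memory, a common workaround (not specific to this repository) is to train with a smaller batch size and scale the learning rate linearly with the actual batch size, following the linear scaling rule from Goyal et al. (2017). A minimal sketch, where `base_lr` and `per_gpu_batch_size` are illustrative values rather than settings taken from our scripts:

```python
# Minimal sketch of the linear learning-rate scaling rule.
# Assumption: `base_lr` was tuned for the reference batch size of 1024;
# the concrete numbers below are illustrative, not from this repo.

reference_batch_size = 1024     # batch size used in most of our experiments
base_lr = 1.6e-3                # illustrative base learning rate for batch size 1024

per_gpu_batch_size = 128        # whatever fits on your single GPU
num_gpus = 1
effective_batch_size = per_gpu_batch_size * num_gpus

# Scale the learning rate in proportion to the effective batch size.
scaled_lr = base_lr * effective_batch_size / reference_batch_size
print(f"effective batch size: {effective_batch_size}, scaled lr: {scaled_lr:.2e}")
```

Alternatively, gradient accumulation can keep the effective batch size at 1024 by accumulating gradients over several smaller forward/backward passes before each optimizer step.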

HiIcy commented 2 years ago

oh, 1024 is too large...