amazon-science / omni-detr

PyTorch implementation of Omni-DETR for omni-supervised object detection: https://arxiv.org/abs/2203.16089

Some questions about batch size #15

Closed nhw649 closed 9 months ago

nhw649 commented 11 months ago

The batch size is defined here:

```python
parser.add_argument('--batch_size', default=1, type=int)
```

But why is the batch size of `batch_sampler_train_burnin` fixed at 2?

```python
batch_sampler_train_label = torch.utils.data.BatchSampler(sampler_train_label, args.batch_size, drop_last=True)
batch_sampler_train_unlabel = torch.utils.data.BatchSampler(sampler_train_unlabel, args.batch_size, drop_last=True)
batch_sampler_train_burnin = torch.utils.data.BatchSampler(sampler_train_burnin, 2, drop_last=True)
```
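If the intent is to control the burn-in batch size from the command line as well, a minimal sketch would pass `args.batch_size` in place of the literal 2 (here `sampler_train_burnin` and `args` are assumed to come from the repo's `main.py`; this is an illustration, not the repo's own code):

```python
import torch

# Sketch: drive the burn-in sampler from the CLI flag instead of the
# hard-coded 2. sampler_train_burnin and args are assumed from main.py.
batch_sampler_train_burnin = torch.utils.data.BatchSampler(
    sampler_train_burnin, args.batch_size, drop_last=True)
```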

In the paper, the batch size is set to 16, so which one do I need to modify?

All models of Deformable DETR are trained with a total batch size of 16.
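For context, under distributed training the total batch size is the per-process `--batch_size` multiplied by the number of processes, so a per-GPU value of 2 on 8 GPUs already yields the paper's 16. A minimal sketch of that arithmetic (`args` is assumed from the repo's parser, and the 8-GPU setup is only an example):

```python
import torch.distributed as dist

# Sketch: under DistributedDataParallel the effective (total) batch size
# is the per-process --batch_size times the world size. For example,
# --batch_size 2 on 8 GPUs gives 2 * 8 = 16, the value used in the paper.
world_size = dist.get_world_size() if dist.is_available() and dist.is_initialized() else 1
total_batch_size = args.batch_size * world_size  # args assumed from main.py
print(f"effective batch size: {total_batch_size}")
```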