Closed nhw649 closed 9 months ago
The batch size is defined here:

```python
parser.add_argument('--batch_size', default=1, type=int)
```
But why is the `batch_sampler_train_burnin` batch size fixed at 2?

```python
batch_sampler_train_label = torch.utils.data.BatchSampler(sampler_train_label, args.batch_size, drop_last=True)
batch_sampler_train_unlabel = torch.utils.data.BatchSampler(sampler_train_unlabel, args.batch_size, drop_last=True)
batch_sampler_train_burnin = torch.utils.data.BatchSampler(sampler_train_burnin, 2, drop_last=True)
```
In the paper, the batch size is set to 16, so which one do I need to modify?
All models of Deformable DETR are trained with total batch size of 16.
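A "total batch size of 16" in DETR-style training reports usually means per-GPU batch size times the number of GPUs, so `--batch_size` here is per process, not the total. Below is a minimal sketch of that arithmetic, with the batching logic of `torch.utils.data.BatchSampler` reimplemented in plain Python; the 8-GPU world size is an assumption for illustration, not something stated in this thread.

```python
def batch_indices(n_samples, batch_size, drop_last=True):
    """Mimics torch.utils.data.BatchSampler over range(n_samples):
    groups sequential indices into fixed-size batches."""
    batches, batch = [], []
    for i in range(n_samples):
        batch.append(i)
        if len(batch) == batch_size:
            batches.append(batch)
            batch = []
    if batch and not drop_last:
        batches.append(batch)  # keep the short final batch
    return batches

total_batch = 16
num_gpus = 8                          # assumed world size for illustration
per_gpu_batch = total_batch // num_gpus
print(per_gpu_batch)                  # 2 samples per GPU per step

# 32 samples with per-GPU batch 2 -> 16 batches on one process
print(len(batch_indices(32, per_gpu_batch)))  # 16
```

So with 8 GPUs, passing `--batch_size 2` already gives an effective total of 16; on fewer GPUs you would raise `--batch_size` (or accumulate gradients) to match the paper's total.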