Hi, Dude. I have noticed that the AP changes slightly with different --batch-size values; is this reasonable?
Note that this isn't related to CUDA randomness. The following two evaluations have a 0.3 AP gap on the coco128 dataset. When evaluating on the full COCO val set, the gap is smaller but still there.
@imyhxy rectangular batch shapes are computed as a function of the --batch-size argument for minimum padding, so yes, as your --batch-size varies the padding will vary, causing results to vary.
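To make the mechanism concrete, here is a minimal sketch of how minimum-padding rectangular batch shapes can be computed, modeled on YOLOv5's rect-loading logic (sort images by aspect ratio, take the extreme ratio per batch, round the padded shape up to a stride multiple). The aspect ratios, `img_size`, and function name below are hypothetical, not taken from the repo:

```python
import numpy as np

def batch_shapes(aspect_ratios, batch_size, img_size=640, stride=32, pad=0.5):
    # Sort by aspect ratio (h/w) so images in the same batch are similarly shaped.
    ar = np.sort(np.array(aspect_ratios, dtype=float))
    n = len(ar)
    batch_index = np.arange(n) // batch_size  # which batch each image falls into
    nb = int(batch_index[-1]) + 1             # number of batches
    shapes = []
    for i in range(nb):
        ari = ar[batch_index == i]
        mini, maxi = ari.min(), ari.max()
        shape = [1.0, 1.0]                    # default: square batch
        if maxi < 1:
            shape = [maxi, 1.0]               # all wide images: shrink height
        elif mini > 1:
            shape = [1.0, 1.0 / mini]         # all tall images: shrink width
        # Round the padded shape up to a multiple of the model stride.
        shapes.append(np.ceil(np.array(shape) * img_size / stride + pad).astype(int) * stride)
    return shapes

# Hypothetical dataset of 8 aspect ratios: regrouping by batch size changes
# the per-batch extreme ratio, hence the padded shape each image is letterboxed to.
ars = [0.5, 0.6, 0.75, 0.8, 1.0, 1.2, 1.4, 1.5]
print(batch_shapes(ars, batch_size=4))  # 2 batches
print(batch_shapes(ars, batch_size=2))  # 4 batches, different padded shapes
```

Because the same image can be letterboxed to a different resolution depending on which batch it lands in, the network sees slightly different inputs, and small AP differences like the 0.3 gap above are expected.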