Open PkuRainBow opened 6 years ago
"The test batch size is set to the number of GPUs" — is that necessary?
That is not necessary, but it is a lazy solution.
@zhanghang1989 I think you are talking about this part of MultiEvalModule? Would it be better to add an assertion and a notification to make sure that the test batch size is no greater than the number of GPUs?
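The suggested assertion could look like the minimal sketch below. `check_test_batch_size` is a hypothetical helper name, not something from the repository; it only encodes the constraint discussed here (one image scattered per GPU, so batch size must not exceed the GPU count).

```python
def check_test_batch_size(batch_size, num_gpus):
    """Raise a clear error when the test batch size exceeds the GPU count.

    Hypothetical guard for MultiEvalModule-style evaluation, where each
    GPU handles at most one image from the batch.
    """
    if batch_size > num_gpus:
        raise ValueError(
            "test batch size (%d) must be no greater than the number of GPUs (%d)"
            % (batch_size, num_gpus)
        )
    return batch_size
```

In a real setup the GPU count could come from `torch.cuda.device_count()`.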
Well, it has been solved, which I just found. So there is no problem now.
@zhanghang1989 It seems necessary to change the collate_fn of the dataloader when images in a batch have different sizes. Otherwise, the default torch.stack(batch, 0, out=out) will raise an error.
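A common workaround is a collate function that keeps the samples in a plain list instead of stacking them, so tensors of different shapes can share a batch. The sketch below is a generic illustration of that idea, not code from this repository; `list_collate` is a hypothetical name, and it is written in pure Python so the same logic applies whether the samples are tensors or (image, target) pairs.

```python
def list_collate(batch):
    """Collate a batch as a list of samples rather than one stacked tensor.

    Hypothetical replacement for the default collate_fn: by skipping the
    stack, samples (e.g. images) of different sizes can coexist in a batch.
    """
    return list(batch)
```

It would be passed to the dataloader as `DataLoader(dataset, collate_fn=list_collate, ...)`.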
Well, I just found that the author has already handled this case carefully. Please ignore my earlier comment.
Previously, I noticed that you chose base_size=520 and crop_size=480, and now you have changed them to base_size=576 and crop_size=608.
However, I still have a concern about it: the 608 should be larger than the largest height or width in your dataset, so we would need to increase this parameter when dealing with larger datasets.
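The concern above can be stated as a small check: the crop size covers the dataset only if it is at least the largest height or width of any image. The sketch below is a hypothetical illustration of that condition (`crop_size_large_enough` is not a function from the repository), assuming image sizes are given as (height, width) pairs.

```python
def crop_size_large_enough(image_sizes, crop_size):
    """Return True if crop_size covers every image in the dataset.

    Hypothetical check for the concern that crop_size (e.g. 608) must be
    at least the largest height or width among the images; otherwise the
    parameter should be increased for the larger dataset.
    """
    largest_side = max(max(h, w) for h, w in image_sizes)
    return largest_side <= crop_size
```

For example, a dataset containing a 640-pixel-wide image would fail the check with crop_size=608.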
Besides, I noticed that you mentioned that your MultiEvalModule only supports single-image evaluation, yet it also accepts a batch size parameter, which seems contradictory. I guess we can only set the batch size to 1 during the testing phase.