longcw / RoIAlign.pytorch

RoIAlign & crop_and_resize for PyTorch

Hi, does your RoI align layer support multi-GPU? #8

Open mks0601 opened 6 years ago

mks0601 commented 6 years ago

Hi, does your RoI align layer support multiple GPUs? I get a weird result when I use the RoI align layer from your repo. For example, if I use 2 GPUs, the batch dimension of the output tensor is half the expected size.

Can you check?

longcw commented 6 years ago

This may be caused by a wrong box_ind. When you use 2 GPUs, your input boxes are split into two tensors, but the box_ind values in the second tensor are not remapped automatically.
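
To make the failure mode concrete, here is a minimal sketch (illustrative values, not code from this repo) of what happens to `box_ind` under `nn.DataParallel`:

```python
import torch

# box_ind maps each box to the index of its image in the batch.
features = torch.randn(4, 256, 64, 64)              # batch of 4 feature maps
boxes = torch.rand(8, 4)                            # 8 boxes, normalized coords
box_ind = torch.tensor([0, 0, 1, 1, 2, 2, 3, 3], dtype=torch.int)

# nn.DataParallel splits every tensor argument along dim 0:
#   replica 0 gets features for images 0-1 and box_ind [0, 0, 1, 1]  -> fine
#   replica 1 gets features for images 2-3 and box_ind [2, 2, 3, 3]  -> out of
#   range, because its local feature batch only has indices 0 and 1.
```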

mks0601 commented 6 years ago

Then how can I make box_ind correct?

longcw commented 6 years ago

You can make the values in box_ind correspond to each GPU's local batch when you generate the RoIs.
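
One way to do that is to build box_ind inside a module's forward(), so each replica constructs indices for the batch slice it actually received. Below is a sketch under the assumption of a fixed number of boxes per image (so DataParallel's dim-0 split of the boxes stays aligned with the split of the features); `RoIAlignWrapper` is a hypothetical helper, not part of this repo:

```python
import torch
import torch.nn as nn

class RoIAlignWrapper(nn.Module):
    """Builds box_ind from the local batch inside forward(), so each
    DataParallel replica indexes only the images it received."""

    def __init__(self, roi_align, boxes_per_image):
        super().__init__()
        self.roi_align = roi_align          # the RoIAlign layer from this repo
        self.boxes_per_image = boxes_per_image

    def forward(self, features, boxes):
        # features: (B, C, H, W); boxes: (B * boxes_per_image, 4).
        # With a fixed box count per image, DataParallel's split of `boxes`
        # along dim 0 stays aligned with its split of `features`.
        b = features.size(0)
        box_ind = torch.arange(b, dtype=torch.int, device=features.device)
        box_ind = box_ind.repeat_interleave(self.boxes_per_image)
        return self.roi_align(features, boxes, box_ind)
```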

paucarre commented 6 years ago

@longcw it didn't work for me:

boxes_index = boxes_index.cuda(input.data.get_device())

It still fails, saying the tensors are on different GPUs. It only works when using the GPU with index 0, so it seems the GPU index is hardcoded somewhere, though I don't understand why. I've been digging through the code but unfortunately I couldn't find it...

stomachacheGE commented 5 years ago

@paucarre Hi, did you make it work with multiple GPUs?

frostinassiky commented 4 years ago

@stomachacheGE @paucarre I found this line works for me: `roi = roi.to(input.device)`. The idea is that both the feature map and the RoIs should be on the same device.
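
In context, the workaround looks roughly like this (a sketch; the function and argument names are illustrative):

```python
def apply_roi_align(roi_align, features, boxes, box_ind):
    # Move the boxes and box indices to whatever device the feature map
    # lives on, rather than hardcoding a GPU index with .cuda(0).
    boxes = boxes.to(features.device)
    box_ind = box_ind.to(features.device)
    return roi_align(features, boxes, box_ind)
```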