kx-Z opened this issue 3 years ago
We do not support this feature at the moment.
Actually, you can use the pure PyTorch implementation branch for the FP16 feature.
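For reference, here is a minimal sketch of what that fallback looks like, assuming mmcv's `multi_scale_deformable_attn_pytorch` is importable and a CUDA GPU is available; the tensor shapes are illustrative only:

```python
import torch
from mmcv.ops.multi_scale_deform_attn import multi_scale_deformable_attn_pytorch

bs, num_queries, num_heads, head_dims = 2, 100, 8, 32
spatial_shapes = torch.as_tensor([[32, 32], [16, 16]], dtype=torch.long)  # (num_levels, 2)
num_levels, num_points = spatial_shapes.size(0), 4
num_value = int((spatial_shapes[:, 0] * spatial_shapes[:, 1]).sum())

value = torch.rand(bs, num_value, num_heads, head_dims,
                   dtype=torch.half, device='cuda')
sampling_locations = torch.rand(
    bs, num_queries, num_heads, num_levels, num_points, 2,
    dtype=torch.half, device='cuda')
attention_weights = torch.rand(
    bs, num_queries, num_heads, num_levels, num_points,
    dtype=torch.half, device='cuda')

# The pure-PyTorch path only uses standard ops (grid_sample etc.),
# so it runs in fp16 without the "not implemented for 'Half'" error.
output = multi_scale_deformable_attn_pytorch(
    value, spatial_shapes, sampling_locations, attention_weights)
print(output.shape)  # torch.Size([2, 100, 256])
```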
Any update on this?
Can I make the 'ms_deform_attn_forward_cuda' part alone run in fp32? And how would I do this?
You can force the forward of Deformable DETR to be fp32 by adding @force_fp32() on top of the forward definition.
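A minimal sketch of how that decorator is typically used, assuming mmcv 1.x's `mmcv.runner.force_fp32`; the module and its layers below are made up for illustration, not actual mmdet code:

```python
import torch.nn as nn
from mmcv.runner import force_fp32


class MyHead(nn.Module):
    """Toy head used only to illustrate the decorator."""

    def __init__(self):
        super().__init__()
        # force_fp32 only takes effect when fp16_enabled is True on the module.
        self.fp16_enabled = True
        self.proj = nn.Linear(256, 256)

    @force_fp32(apply_to=('x', ))
    def forward(self, x):
        # Under fp16 training, 'x' arrives as half but is cast back to fp32
        # here, so ops that lack a Half kernel can run safely.
        return self.proj(x)
```

Note, though, that the decorator only converts the arguments of the decorated method itself; if the half tensor is produced deeper inside a submodule, decorating an outer forward may not be enough, which appears to be what a later comment runs into.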
Hi, I've met the same problem and I'm a noob in mmdet. Can you tell me which file you changed? I tried to change mask2former_head.py in models/densehead but it didn't work. Thank you!
Hi, I find that it doesn't work. I suggest using the PyTorch implementation instead. '@force_fp32' may only force the input parameters of the decorated function to fp32. Does someone else have a better solution?
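If switching the whole attention module to the PyTorch implementation is not an option, another workaround is to cast only the inputs of the CUDA kernel up to fp32 and cast its output back down. This is a sketch, assuming mmcv's `MultiScaleDeformableAttnFunction` and its argument order; the wrapper name and call site are hypothetical:

```python
import torch
from mmcv.ops.multi_scale_deform_attn import MultiScaleDeformableAttnFunction


def ms_deform_attn_fp32(value, spatial_shapes, level_start_index,
                        sampling_locations, attention_weights, im2col_step):
    """Hypothetical wrapper: run the fp32-only CUDA kernel inside an fp16 model."""
    output = MultiScaleDeformableAttnFunction.apply(
        value.float(), spatial_shapes, level_start_index,
        sampling_locations.float(), attention_weights.float(), im2col_step)
    # Cast back so the surrounding fp16 layers keep receiving half tensors.
    return output.to(value.dtype)
```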
RuntimeError: "ms_deform_attn_forward_cuda" not implemented for 'Half'
Does Deformable DETR not support fp16?