Closed: Rsweater closed this issue 1 year ago
@Rsweater I think it was mentioned in the paper appendix. The flip fusion can push all positive outputs to one side (e.g., the left half of the 1D feature vector). Intuitively, we would like the positive predictions to come from meaningful spatial positions; basically, they should not be too near each other. Hence the local maxima. Anyway, this does not contribute to the final results.
It was not meant to replace NMS; NMS is not required here by design.
Sorry, I may be a bit slow, but let me try to understand: because of the flip, some extra lines appear near a certain location, so `max_pool` is used as a preliminary screening to find `local_maxima`? If so, I have a question: why is `cost_label` multiplied by it?
@Rsweater
The main goal was to **only** let local maxima match GT.
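A minimal sketch of this idea (not the repository's exact code; the function name `local_maxima_mask` and the window size are my own illustration): a location is kept only if its confidence equals the max over a small 1D window, and the resulting 0/1 mask multiplies the classification cost so non-peak locations cannot be matched to a ground-truth lane.

```python
import torch
import torch.nn.functional as F


def local_maxima_mask(conf, window=3):
    """Return a 0/1 mask marking local maxima of a 1D confidence vector.

    conf: (B, N) per-location existence confidences.
    A location is a local maximum if it equals the max of its window.
    """
    padding = (window - 1) // 2
    # max_pool1d expects (B, C, N); pool with stride 1 to keep the length.
    pooled = F.max_pool1d(conf.unsqueeze(1), kernel_size=window,
                          stride=1, padding=padding).squeeze(1)
    return (conf == pooled).float()


# Example: only the two peaks survive; elsewhere the weighted
# classification cost becomes zero, so those locations cannot match GT.
conf = torch.tensor([[0.1, 0.9, 0.8, 0.2, 0.7, 0.95, 0.3]])
mask = local_maxima_mask(conf, window=3)
cost_label = conf * mask  # only local maxima keep a positive cost
```

The comparison `conf == pooled` is exact because max pooling returns one of the input values unchanged, so no floating-point tolerance is needed.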
I see, Thx!
Hello, Feng! What is the use of `local_maxima` in `hungarian_bezier_loss.py`? I see that `local_maxima` is used as a weight for the classification cost. Is it used to distinguish instances? I have seen elsewhere that NMS distinguishes instances by adding an offset at the extreme values, but here the instances are not all valued at 1? Is that not its purpose? Thx! https://github.com/voldemortX/pytorch-auto-drive/blob/bf0827a5061301e582493ac19b12d109458bf072/utils/losses/hungarian_bezier_loss.py#L58C5-L58C5