MCG-NJU / DDM

[CVPR 2022] Progressive Attention on Multi-Level Dense Difference Maps for Generic Event Boundary Detection

Maybe I found a bug in ./DDM-Net/modeling/position_embedding.py, line 34 #3

Closed: renjie-liang closed this issue 2 years ago

renjie-liang commented 2 years ago

I think the code here should be written as follows. I would appreciate it if you could proofread it.

    def forward(self, locations):
        result = (
            # original code: self.position_table[: locations.shape[1]]
            # proposed fix below, which assumes a 3-D table of shape (1, max_len, d)
            self.position_table[:, :locations.shape[1], :]
            .clone()
            .detach()
            .repeat(locations.shape[0], 1, 1)
        )
        return result
JackyTown commented 2 years ago

This is NOT a bug. You can verify it with a simple case: use an input of torch.rand(32, 10, 512) and print the shape of result.

First, the position table only has 2 dims. Second, .repeat(bs, 1, 1) is used to generate a positional embedding for each sample in the batch.
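
For reference, a minimal sketch of that shape arithmetic (the table contents and max_len here are placeholder values; only the shapes matter):

    import torch

    bs, seq_len, d_model, max_len = 32, 10, 512, 100

    position_table = torch.rand(max_len, d_model)  # 2-D: (max_len, d_model)
    locations = torch.rand(bs, seq_len, d_model)   # e.g. torch.rand(32, 10, 512)

    # A single-index slice on a 2-D tensor trims the position dim.
    sliced = position_table[: locations.shape[1]]  # (10, 512)

    # repeat() with more repeat counts than tensor dims prepends new dims,
    # so the 2-D (10, 512) table becomes (32, 10, 512): one copy per sample.
    result = sliced.clone().detach().repeat(locations.shape[0], 1, 1)
    print(result.shape)  # torch.Size([32, 10, 512])

    # The proposed slice would fail here, since the table has only 2 dims:
    # position_table[:, :locations.shape[1], :]  # IndexError: too many indices

So with a 2-D table, the original single-index slice already selects the right rows, and the extra dims in the proposed slice are unnecessary.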