hkchengrex / XMem

[ECCV 2022] XMem: Long-Term Video Object Segmentation with an Atkinson-Shiffrin Memory Model
https://hkchengrex.com/XMem/
MIT License
1.76k stars · 192 forks

Why should similarity scores be normalized? What if they are not normalized? #130

Closed xinbaiw closed 11 months ago

xinbaiw commented 11 months ago

Why should similarity scores be normalized? What happens if they are not normalized?

(image attachment)

hkchengrex commented 11 months ago

Because we want a probability distribution, not an unnormalized sum.
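
For illustration only, a minimal sketch (not the repository's actual code) of what normalizing over the memory dimension buys: each query ends up with a probability distribution over memory elements, so the readout is a convex combination of memory values whose scale does not grow with the number of elements in memory.

```python
import torch

def readout(similarity, values):
    # similarity: (num_memory_elements, num_queries) raw dot-product scores
    # values:     (value_dim, num_memory_elements)
    # Normalize over the memory dimension so each query gets a probability
    # distribution over memory elements (the weights sum to 1).
    affinity = torch.softmax(similarity, dim=0)
    # Weighted average of memory values; the output scale is independent of
    # how many elements are stored in memory.
    return values @ affinity  # (value_dim, num_queries)
```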

xinbaiw commented 11 months ago

Thank you for your reply. Can you explain in more detail? I didn't quite understand. The similarity score is multiplied by the memory value, so does skipping the normalization have any impact?

hkchengrex commented 11 months ago

This is not the same normalization that you asked about in https://github.com/hkchengrex/STCN/issues/151#issuecomment-1791821712. This one is used to compute the usage of each memory element. Without normalization, the algorithm would disproportionately favor old elements.
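
To illustrate the usage normalization mentioned above, here is a hypothetical sketch (not XMem's actual implementation): each memory element accumulates affinity mass over time, and dividing by how long the element has been in memory turns that total into a usage rate. Without this normalization, older elements would have had more opportunities to accumulate raw usage and would therefore be favored during eviction simply for being old.

```python
import torch

class MemoryUsage:
    """Tracks a per-element usage rate (hypothetical sketch, assumed names)."""

    def __init__(self, num_elements):
        self.accumulated = torch.zeros(num_elements)  # total affinity received
        self.lifetime = torch.zeros(num_elements)     # frames spent in memory

    def update(self, affinity):
        # affinity: (num_elements, num_queries), already softmax-normalized,
        # so each query contributes exactly one unit of mass in total.
        self.accumulated += affinity.sum(dim=1)
        self.lifetime += 1

    def usage_rate(self):
        # Normalize by lifetime: a usage *rate* that is comparable between old
        # and new elements. The raw accumulated sum would favor old elements
        # only because they have been probed more often.
        return self.accumulated / self.lifetime.clamp(min=1)
```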

xinbaiw commented 11 months ago

Ok, thank you very much!