Yang-Bob / PMMs

Prototype Mixture Models for Few-shot Semantic Segmentation
163 stars 27 forks

About some details in PMMs #24

Open hkkevinhf opened 3 years ago

hkkevinhf commented 3 years ago

Hi, I have read this great work; however, I have some questions, especially about the PMMs module code. First, what is the purpose of the L2 normalization at line 61 of PMMs.py? Would removing it affect the result? Second, the input to the EM algorithm is the masked feature representation, but the zero-valued positions are not removed after the mask operation, so the masked feature contains many zero elements. Does this affect the result of the EM algorithm?

Yang-Bob commented 3 years ago

L2 denotes L2 normalization. According to the mixture-model equations it should not be removed. As for the zeros: a batch contains more than one image, and the number of mask==1 positions differs from image to image, so removing the zero values would break the uniform tensor shape needed for parallel processing. To keep the computation parallel and fast, the zero values are kept, and they barely affect the result.
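To see why the kept zeros are harmless, here is a minimal numpy sketch of one EM step for a cosine-similarity (vMF-style) mixture, which is the general form PMMs follows; the function names, the `kappa` value, and the toy data are illustrative, not the repo's actual code. A masked-out position is an all-zero feature vector: its dot product with every prototype is zero, so its responsibilities are uniform, and in the M-step it is multiplied by its own zero feature, contributing nothing to the prototype update.

```python
import numpy as np

def em_step(x, mu, kappa=20.0):
    """One EM iteration for a vMF-style mixture (illustrative sketch).

    x:  (N, C) feature vectors (assumed L2-normalized, or all-zero if masked out)
    mu: (K, C) unit-norm prototypes
    """
    # E-step: responsibilities via softmax over scaled cosine similarities.
    logits = kappa * x @ mu.T                      # (N, K)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    r = np.exp(logits)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted sum of features, re-normalized to unit length
    # (this re-normalization is the role of the L2 normalization in the module).
    mu_new = r.T @ x                               # (K, C)
    mu_new /= np.linalg.norm(mu_new, axis=1, keepdims=True) + 1e-8
    return mu_new

rng = np.random.default_rng(0)
feat = rng.normal(size=(6, 4))
feat /= np.linalg.norm(feat, axis=1, keepdims=True)  # L2-normalize foreground features
mu0 = feat[:2].copy()                                # init 2 prototypes

# Masked-out positions stay in the tensor as all-zero rows.
masked = np.vstack([feat, np.zeros((4, 4))])

mu_a = em_step(feat, mu0)     # zeros removed
mu_b = em_step(masked, mu0)   # zeros kept
assert np.allclose(mu_a, mu_b)  # zero rows contribute nothing to the prototypes
```

The zero rows do receive (uniform) responsibilities in the E-step, but since the M-step numerator weights each feature vector by itself, a zero vector adds exactly zero, which is why keeping them for batch parallelism is safe in this formulation.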