Closed: JiankunW closed this issue 4 years ago
Hi Jiankun,
Yes, sorry if the code is a bit messy. This just creates the operations (modules) in charge of applying the alpha coefficient to the feature x coming from the first modality, and (1 - alpha) to the feature y coming from the second modality. This is applied prior to the linear-layer fusion operation, and only used if the argument `alphas` is enabled (set to True). If I remember correctly, this comes from one of the papers in the previous work; we did not use it for the models in our paper.
This is the code for the module that is created when this option is used:
```python
import numpy as np
import torch
import torch.nn as nn

class AlphaScalarMultiplication(nn.Module):
    def __init__(self, size_alpha_x, size_alpha_y):
        super(AlphaScalarMultiplication, self).__init__()
        self.size_alpha_x = size_alpha_x
        self.size_alpha_y = size_alpha_y
        # a single learnable scalar, shared by both modalities
        self.alpha_x = nn.Parameter(torch.from_numpy(np.zeros((1), np.float32)))

    def forward(self, x, y):
        bsz = x.size()[0]
        # x is scaled by sigmoid(alpha), y by 1 - sigmoid(alpha)
        factorx = torch.sigmoid(self.alpha_x.expand(bsz, self.size_alpha_x))
        factory = 1.0 - torch.sigmoid(self.alpha_x.expand(bsz, self.size_alpha_y))
        x = x * factorx
        y = y * factory
        return x, y
```
One of these modules is created for every fusion point across the fusion network.
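Concretely, since both scaling factors are derived from the same learnable scalar, they sum to 1 elementwise, and at initialization (`alpha_x = 0`) each modality is weighted 0.5. A quick standalone check of that property, assuming equal feature sizes for both modalities (the sizes here are made up):

```python
import torch

alpha_x = torch.zeros(1)  # the learnable scalar at its zero initialization
bsz, size = 2, 4          # hypothetical batch size and feature size
factorx = torch.sigmoid(alpha_x.expand(bsz, size))
factory = 1.0 - torch.sigmoid(alpha_x.expand(bsz, size))

# the two weights always sum to 1 elementwise
assert torch.allclose(factorx + factory, torch.ones(bsz, size))
print(factorx[0, 0].item())  # 0.5
```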
Thank you, juanmanpr! 😃 I now understand what the alpha coefficient does from your quick explanation.
Which paper is this from? I'd still like to know what effect this alpha has on fusion model performance. Could you explain further?
Yes, no worries. Basically, it adds an explicit re-weighting mechanism for features across modalities. The paper is:
CentralNet: a multilayer approach for multimodal fusion
By my co-author, Valentin Vielzeuf
Cheers
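Since the thread says this re-weighting is applied prior to the linear-layer fusion, here is a minimal sketch of how the two scaled features could feed a fusion layer. The concatenation and the layer sizes are assumptions for illustration, not the repo's exact code:

```python
import torch
import torch.nn as nn

feat_x = torch.randn(8, 16)  # modality 1 features (sizes are made up)
feat_y = torch.randn(8, 16)  # modality 2 features

alpha = nn.Parameter(torch.zeros(1))  # learnable re-weighting scalar
w = torch.sigmoid(alpha)              # weight for modality 1; modality 2 gets 1 - w

# re-weight each modality, then fuse with a linear layer (assumed design)
fused_in = torch.cat([feat_x * w, feat_y * (1.0 - w)], dim=1)
fusion = nn.Linear(32, 10)            # hypothetical fusion layer shape
out = fusion(fused_in)
print(out.shape)  # torch.Size([8, 10])
```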
Thank you so much!
Hi, thank you for your great work! 👍
I am a little confused about the meaning of `self.alphas = self._create_alphas()` in the `Searchable_xxx_Net` class. What is the `_create_alphas()` function used for?