Open PaulForInvent opened 3 years ago
Hey, I'm trying to understand where this comes from. It is only mentioned here: link.
But it doesn't seem to be used anywhere else, even though this feature is used in losses like OnlineContrastive. I don't think it comes from the huggingface model?
Which `forward` is this referring to here?
I also wonder what this `_modules` is, as used here.
Why is this not in the `__init__`?
Thanks. :-)
The `'sentence_embedding'` entry is added by the pooling layer.
The `_modules` attribute comes from `torch.nn.Module`: it is created in `nn.Module.__init__`, and `torch.nn.Sequential` uses it to store all the modules of the sequential network architecture. That is why you don't see it defined in the subclass's `__init__`.
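A minimal sketch of what that means: in sentence-transformers, each layer's `forward` receives and returns a `features` dict, and the pooling layer inserts the `'sentence_embedding'` key that the losses later read. The mean-pooling below is a simplified illustration using plain Python lists, not the library's actual code:

```python
# Simplified sketch: a pooling step that adds a 'sentence_embedding'
# entry to the features dict, as the Pooling layer does.
# Plain Python lists stand in for torch tensors to keep it self-contained.

def mean_pooling(features):
    """Average the token embeddings and store the result in the dict."""
    tokens = features["token_embeddings"]  # list of per-token vectors
    dim = len(tokens[0])
    pooled = [sum(vec[i] for vec in tokens) / len(tokens) for i in range(dim)]
    features["sentence_embedding"] = pooled  # <- the entry losses consume
    return features

features = {"token_embeddings": [[1.0, 2.0], [3.0, 4.0]]}
out = mean_pooling(features)
print(out["sentence_embedding"])  # [2.0, 3.0]
```

Losses such as OnlineContrastive then look up `features["sentence_embedding"]` rather than the raw token embeddings, which is why the key only appears after pooling.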
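To illustrate where `_modules` lives: every `torch.nn.Module` keeps its submodules in an `OrderedDict` named `_modules`, and `nn.Sequential` fills it in its constructor. Here is a stdlib-only sketch of that mechanism (a simplified stand-in, not the actual torch source):

```python
from collections import OrderedDict

class Module:
    """Minimal stand-in for torch.nn.Module: submodules live in _modules."""
    def __init__(self):
        self._modules = OrderedDict()

    def add_module(self, name, module):
        self._modules[name] = module

class Sequential(Module):
    """Minimal stand-in for torch.nn.Sequential."""
    def __init__(self, *modules):
        super().__init__()
        # Sequential registers each submodule under a string index.
        # The _modules dict itself was created by the base class, which
        # is why it never appears in a subclass's own __init__.
        for idx, module in enumerate(modules):
            self.add_module(str(idx), module)

class Transformer(Module): pass
class Pooling(Module): pass

model = Sequential(Transformer(), Pooling())
print(list(model._modules.keys()))  # ['0', '1']
```

In the real library, the SentenceTransformer model is built the same way: the transformer and pooling modules end up as entries of `_modules`, inherited from `nn.Sequential`.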