Closed waallf closed 5 years ago
@waallf It's just an embedding with vocab size 2, where each element marks whether a token belongs to sentence A or sentence B. So I inherit from nn.Embedding and initialize it with vocab_size 2.
More concretely, he sets this embedding to be of shape (2 + 1, embed_size), where:

- index 0 is reserved for padding (`padding_idx=0`), and indices 1 and 2 mark sentence A and sentence B;
- we keep the same embedding dimension as the other embeddings so that we can add them together later.
```python
import torch.nn as nn

class SegmentEmbedding(nn.Embedding):
    def __init__(self, embed_size=512):
        # 3 rows: 0 = padding, 1 = sentence A, 2 = sentence B
        super().__init__(3, embed_size, padding_idx=0)
```
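As a sketch of how the segment labels might be built and fed through this layer (the exact tensor construction is an assumption here, following the `padding_idx=0` convention above):

```python
import torch
import torch.nn as nn

class SegmentEmbedding(nn.Embedding):
    def __init__(self, embed_size=512):
        # 3 rows: 0 = padding, 1 = sentence A, 2 = sentence B
        super().__init__(3, embed_size, padding_idx=0)

# Hypothetical sequence: 4 tokens from sentence A, 3 from sentence B, 2 pads.
# Each position holds the segment index of the token at that position.
segment_label = torch.tensor([[1, 1, 1, 1, 2, 2, 2, 0, 0]])

emb = SegmentEmbedding(embed_size=512)
out = emb(segment_label)
print(out.shape)  # torch.Size([1, 9, 512])
```

Because the output has the same last dimension as the token and position embeddings, the three can simply be summed element-wise to form the final input representation; padding positions map to the all-zero row.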
@codertimo this can probably be closed.
Hi, thank you for the code, it's really helpful. But I have a question: what is segment_info actually? I mean, what should we pass in there?
Thanks for your code, which let me learn more details of this paper. But I can't understand
segment.py
. You haven't written how to embed the segment label.