jnakor opened this issue 4 years ago
```python
def forward(self, X):
    embedded_chars = self.W[X] # [batch_size, sequence_length, sequence_length]
```
I think the shape is `[batch_size, sequence_length, embedding_size]`.
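A quick way to check (a minimal sketch, assuming `self.W` is a `[vocab_size, embedding_size]` embedding matrix; all dimension values below are made up, not the repo's): indexing a 2-D weight matrix with a `[batch_size, sequence_length]` tensor of token ids yields `[batch_size, sequence_length, embedding_size]`.

```python
import torch

# Hypothetical dimensions, chosen just to illustrate the shape claim.
vocab_size, embedding_size = 16, 2
batch_size, sequence_length = 6, 3

W = torch.randn(vocab_size, embedding_size)                       # embedding matrix
X = torch.randint(0, vocab_size, (batch_size, sequence_length))   # token ids

embedded_chars = W[X]  # fancy indexing == embedding lookup
print(embedded_chars.shape)
# torch.Size([6, 3, 2]) -> [batch_size, sequence_length, embedding_size]
```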
Yes, I think so.
Filed PR #49
Can somebody tell me why three conv layers are needed to convolve the word embedding matrix? I don't understand.
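Not sure about this repo's exact settings, but in a Kim (2014)-style TextCNN the parallel conv layers use different kernel heights, so each branch detects n-grams of a different length (e.g. bigrams, trigrams, 4-grams) over the embedding matrix, and their max-pooled features are concatenated before the classifier. Below is a minimal sketch under that assumption; the class name, kernel sizes `(2, 3, 4)`, and all dimensions are illustrative, not the repo's actual values.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MiniTextCNN(nn.Module):
    # Minimal Kim-style TextCNN; names and sizes are illustrative only.
    def __init__(self, vocab_size=16, embedding_size=8, num_filters=3,
                 filter_sizes=(2, 3, 4), num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embedding_size)
        # One conv layer per window size: a kernel of height k slides over k
        # consecutive word vectors, i.e. it detects k-gram patterns.
        self.convs = nn.ModuleList(
            nn.Conv2d(1, num_filters, (k, embedding_size)) for k in filter_sizes
        )
        self.fc = nn.Linear(num_filters * len(filter_sizes), num_classes)

    def forward(self, X):                               # X: [batch, seq_len]
        x = self.embed(X).unsqueeze(1)                  # [batch, 1, seq_len, embed]
        pooled = []
        for conv in self.convs:
            h = F.relu(conv(x)).squeeze(3)              # [batch, num_filters, seq_len - k + 1]
            h = F.max_pool1d(h, h.size(2)).squeeze(2)   # max over time -> [batch, num_filters]
            pooled.append(h)
        return self.fc(torch.cat(pooled, dim=1))        # concat all branches, then classify

model = MiniTextCNN()
logits = model(torch.randint(0, 16, (4, 7)))  # batch of 4, seq_len 7
print(logits.shape)                           # torch.Size([4, 2])
```

So the multiple conv layers are not stacked; they run in parallel, each capturing features at a different n-gram width.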