nlp-wh opened this issue 5 years ago
Hi. As the classifier you can only choose between Softmax and CRF. The classifier is only the last layer of the network; an LSTM network is always applied before the classifier.
Ok, I understand, thank you. I saw a part of the code that seemed to allow choosing LSTM as a classifier, so I thought it might work like seq2seq.
```python
elif isinstance(classifier, (list, tuple)) and classifier[0] == 'LSTM':
    size = classifier[1]
    if isinstance(self.params['dropout'], (list, tuple)):
        output = Bidirectional(LSTM(size, return_sequences=True, dropout=self.params['dropout'][0], recurrent_dropout=self.params['dropout'][1]), name=modelName+'_varLSTM_'+str(cnt))(output)
    else:
        # Naive dropout
        output = Bidirectional(LSTM(size, return_sequences=True), name=modelName+'_LSTM_'+str(cnt))(output)
        if self.params['dropout'] > 0.0:
            output = TimeDistributed(Dropout(self.params['dropout']), name=modelName+'_dropout_'+str(self.params['dropout'])+"_"+str(cnt))(output)
else:
    assert(False)  # Wrong classifier
```
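To make the branching concrete, here is a minimal stand-alone sketch of the dispatch above, with the Keras layers replaced by descriptive strings so it runs without Keras. The entry format `('LSTM', size)` comes straight from the snippet; the `Softmax`/`CRF` branches and the helper name `build_classifier_stage` are illustrative assumptions, not the repo's API.

```python
def build_classifier_stage(classifier, dropout):
    """Describe the layer(s) that one entry of the classifier option would add.

    Illustrative only: the real code builds Keras layers; here they are
    replaced by strings so the dispatch logic can run stand-alone.
    """
    if classifier == 'Softmax':
        return 'TimeDistributed(Dense(n_labels, activation=softmax))'
    elif classifier == 'CRF':
        return 'ChainCRF()'
    elif isinstance(classifier, (list, tuple)) and classifier[0] == 'LSTM':
        size = classifier[1]
        if isinstance(dropout, (list, tuple)):
            # Variational dropout: applied inside the LSTM cell
            return 'Bidirectional(LSTM(%d, dropout=%.2f, recurrent_dropout=%.2f))' % (
                size, dropout[0], dropout[1])
        # Naive dropout: applied to the LSTM output afterwards
        return 'Bidirectional(LSTM(%d)) -> TimeDistributed(Dropout(%.2f))' % (size, dropout)
    else:
        raise ValueError('Wrong classifier: %r' % (classifier,))
```

So an option such as `[('LSTM', 100), 'CRF']` would add a task-specific BiLSTM stage followed by a CRF output layer, while a bare string `'LSTM'` falls through to the final branch.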
For multi-task learning you can have task-specific LSTM layers by passing a list as the argument to the classifier option. But even there, the last layer must be Softmax or CRF.
Hello, I have already understood the multi-task learning part of your code. Is this code also suitable for joint learning, for example named entity recognition + relation classification? Since that involves different input formats, how would the input-format code need to be modified?
Best regards,
Hi, I think changing the code to support joint learning would require substantial work. It was not really designed for that, and multi-task learning and joint learning of two tasks on the same data are quite different (=> requires a different setup).
Thank you for your answer. I plan to work on joint learning soon, but I don't have any ideas yet.
Hello, when I choose LSTM as the classifier, I always get an error: `AssertionError` at `assert(False)  # Wrong classifier`. How should I select LSTM as the classifier?
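The assertion fires because a bare string `'LSTM'` does not satisfy the `isinstance(classifier, (list, tuple))` check in the snippet above; an LSTM stage must be passed as a list/tuple such as `('LSTM', size)`, and the final entry must still be `'Softmax'` or `'CRF'`. A minimal check mirroring that branching (the helper name is hypothetical):

```python
def is_valid_classifier_entry(entry):
    # Mirrors the branching shown earlier in the thread (illustrative only).
    if entry in ('Softmax', 'CRF'):
        return True
    # An intermediate LSTM stage must be a list/tuple: ('LSTM', hidden_size)
    if isinstance(entry, (list, tuple)) and len(entry) == 2 and entry[0] == 'LSTM':
        return True
    return False

is_valid_classifier_entry(('LSTM', 100))  # accepted
is_valid_classifier_entry('LSTM')         # rejected -> hits assert(False) in the real code
```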