marekrei / sequence-labeler

Neural network sequence labeling model

A problem with the backward mask in the language model cost function #9

Open · wugh opened this issue 6 years ago

wugh commented 6 years ago

Hi,

In labeler.py, line 243:

                lmcost_bw_mask = tf.sequence_mask(sentence_lengths, maxlen=tf.shape(target_ids)[1])[:,:-1]

This mask has an issue. For example:

    origin_seq:                1 2 3 4 0 0 0
    origin_mask:               1 1 1 1 0 0 0
    lmcost_bw_mask (actual):   1 1 1 1 0 0
    lmcost_bw_mask (expected): 1 1 1 0 0 0

Slicing with `[:, :-1]` only drops the last padding column, so the mask still contains 4 ones for a sentence of length 4. The backward mask should contain only 3 ones, since a sentence of length L has only L - 1 valid previous-token targets.
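For reference, here is a minimal sketch that reproduces both masks (run in TF2 eager mode for illustration; the variable names and the `[:, 1:]` fix are my assumption of what the expected behavior implies, not a confirmed patch from the maintainer):

    import tensorflow as tf

    sentence_lengths = tf.constant([4])            # one sentence with 4 real tokens
    target_ids = tf.zeros([1, 7], dtype=tf.int32)  # padded to max length 7

    full_mask = tf.sequence_mask(sentence_lengths, maxlen=tf.shape(target_ids)[1])
    # full_mask                -> [[1 1 1 1 0 0 0]]

    current_bw_mask = full_mask[:, :-1]
    # current code (buggy)     -> [[1 1 1 1 0 0]]  : still 4 ones

    fixed_bw_mask = full_mask[:, 1:]
    # dropping the FIRST column instead
    # -> [[1 1 1 0 0 0]]       : 3 ones, matching the expected mask

An equivalent alternative would be `tf.sequence_mask(sentence_lengths - 1, maxlen=tf.shape(target_ids)[1] - 1)`, which also yields 1 1 1 0 0 0 in this example.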