Wang-Yu-Qing opened this issue 4 years ago
When padding the description sequences, the code uses the default value 0 to pad:
```python
train_embed = keras.preprocessing.sequence.pad_sequences(
    train_embed, maxlen=max_seq_length, padding="post")
test_embed = keras.preprocessing.sequence.pad_sequences(
    test_embed, maxlen=max_seq_length, padding="post")
```
But in the description encoding process, 0 refers to a specific word. Maybe `max(encode_value) + 1` should be used as the padding value instead?
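For illustration, a minimal sketch of the suggested change, assuming `train_embed` and `test_embed` are lists of integer-encoded descriptions in which id 0 is already taken by a real word (the small example sequences and the `pad_value` name here are hypothetical; `pad_sequences` accepts the pad value via its `value` argument):

```python
from tensorflow import keras

# Hypothetical integer-encoded descriptions; in the original code these come
# from the description encoding step, where id 0 already maps to a word.
train_embed = [[0, 3, 7], [2, 5]]
test_embed = [[1, 4]]

# Pick an id that no real word uses as the padding value,
# i.e. max(encode_value) + 1 as suggested above.
pad_value = max(max(seq) for seq in train_embed + test_embed) + 1

max_seq_length = 5
train_embed = keras.preprocessing.sequence.pad_sequences(
    train_embed, maxlen=max_seq_length, padding="post", value=pad_value)
test_embed = keras.preprocessing.sequence.pad_sequences(
    test_embed, maxlen=max_seq_length, padding="post", value=pad_value)
```

An alternative is to shift all word ids up by 1 during encoding so that 0 is never assigned to a word; this keeps 0 as the pad value and is compatible with `keras.layers.Embedding(..., mask_zero=True)`, which treats 0 as padding.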