spiglerg / RNN_Text_Generation_Tensorflow

DEPRECATED CODE : Text generation using RNN (LSTM) implemented using Tensorflow
Apache License 2.0

About the sentence result #18

Open lydemo opened 6 years ago

lydemo commented 6 years ago

I notice that the longer I train (the more batches), the better the generated sentences become, but I find that some generated sentences are exactly the same as sentences in my training corpus. Is that possible? I just wonder whether the model genuinely generates sentences like that or simply 'copies' them.

spiglerg commented 6 years ago

That sounds like overfitting. You may try adding a regularizer to the network's weights or decreasing the number of parameters (or increasing the size of the training set).
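For readers hitting the same issue, here is a minimal sketch of what "adding a regularizer" could look like in the TensorFlow 1.x style this (deprecated) repo targets. The shapes and variable names (`inputs`, `base_loss`, vocab size 65, etc.) are illustrative assumptions, not the repo's actual identifiers:

```python
import tensorflow as tf  # TF 1.x, the version this (deprecated) repo targets

# Hypothetical shapes/names for illustration; they are not the repo's identifiers.
inputs = tf.placeholder(tf.float32, [None, 100, 65])      # [batch, time_steps, vocab_size]
keep_prob = tf.placeholder_with_default(1.0, shape=())    # feed e.g. 0.5 during training

# Dropout on the LSTM cell outputs reduces co-adaptation and helps against overfitting.
cell = tf.nn.rnn_cell.LSTMCell(256)
cell = tf.nn.rnn_cell.DropoutWrapper(cell, output_keep_prob=keep_prob)
outputs, _ = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)

# L2 penalty over the weight matrices (biases excluded), added to the existing loss.
base_loss = tf.reduce_mean(tf.square(outputs))            # stand-in for the real sequence loss
l2_lambda = 1e-4
l2_penalty = tf.add_n([tf.nn.l2_loss(v) for v in tf.trainable_variables()
                       if 'bias' not in v.name.lower()])
total_loss = base_loss + l2_lambda * l2_penalty
```

At generation time you would leave `keep_prob` at its default of 1.0 so dropout is only active during training.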

niranjan8129 commented 5 years ago

@spiglerg I am facing an overfitting issue. When you say "adding a regularizer to the network's weights or decreasing the number of parameters (or increasing the size of the training set)", are you referring to the parameters below? If so, which should I increase or decrease? Please give exact values if possible.

lstm_size = 256   # 128
num_layers = 2
batch_size = 128  # 128
time_steps = 100  # 50
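There is no single correct set of values; as a hedged illustration of the advice above (treating these purely as starting points that depend on corpus size, not values from the maintainer), one might try something like:

```python
# Illustrative starting points only, not prescribed values.
lstm_size = 128           # down from 256: fewer parameters per layer
num_layers = 2            # or 1 for a very small corpus
batch_size = 128
time_steps = 100
output_keep_prob = 0.5    # dropout keep probability, used as in the sketch above
```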