tensorflow / models

Models and examples built with TensorFlow

Use of buckets in textsum #1462

Closed. ghost closed this issue 4 years ago.

ghost commented 7 years ago

I'd like to raise the question of bucket usage in models/textsum. There is a batch_reader that supports bucketing, but in seq2seq_attention the sequence length is fixed by hps.enc_timesteps and hps.dec_timesteps. Why is it set up this way, and why not use dynamic_rnn instead? Have you considered making the sequence length more flexible?
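For context, bucketing groups sequences by length so each batch is only padded up to its bucket's size, whereas a fixed hps.enc_timesteps / hps.dec_timesteps pads every example to the same maximum length. The sketch below illustrates the general idea in plain Python; the bucket boundaries, PAD value, and helper names are illustrative assumptions, not the actual code in textsum's batch_reader.

```python
# Minimal sketch of length bucketing (illustrative only; not textsum's
# actual batch_reader). PAD and the bucket boundaries are made-up values.

PAD = 0

def assign_bucket(enc_len, dec_len, buckets):
    """Return the index of the smallest bucket that fits both lengths,
    or None if the sequence is too long for every bucket."""
    for i, (enc_max, dec_max) in enumerate(buckets):
        if enc_len <= enc_max and dec_len <= dec_max:
            return i
    return None  # caller may truncate or drop the example

def pad_to(seq, target_len):
    """Right-pad a token list with PAD up to target_len."""
    return seq + [PAD] * (target_len - len(seq))

# Example: three hypothetical (encoder_len, decoder_len) buckets.
buckets = [(10, 5), (20, 10), (40, 20)]
enc, dec = [1, 2, 3, 4], [5, 6]

i = assign_bucket(len(enc), len(dec), buckets)
enc_max, dec_max = buckets[i]
padded_enc = pad_to(enc, enc_max)   # padded to 10, not to the global max 40
padded_dec = pad_to(dec, dec_max)   # padded to 5, not to the global max 20
```

With fixed timesteps, the short example above would be padded to the global maximum (40/20) regardless of its true length; bucketing limits the wasted padding to the nearest bucket boundary. dynamic_rnn with a sequence_length argument goes further and skips computation past each example's true length entirely.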

concretevitamin commented 7 years ago

@panyx0718 could you take a look?

Xiyor commented 7 years ago

Yes, I had the same confusion as @rylanchiu when reading the textsum code. So, what is the conclusion on its bucket mechanism?

tensorflowbutler commented 4 years ago

Hi there, we are checking to see if you still need help on this, as this seems to be a considerably old issue. Please update this issue with the latest information, a code snippet to reproduce your issue, and the error you are seeing. If we don't hear from you in the next 7 days, this issue will be closed automatically. If you don't need help on this issue any more, please consider closing it.