abisee / pointer-generator

Code for the ACL 2017 paper "Get To The Point: Summarization with Pointer-Generator Networks"

Input queue is empty when calling next_batch #114

Open chmille3 opened 6 years ago

chmille3 commented 6 years ago

I am running the following command:

python run_summarization.py --mode=decode --data_path=./finishedfiles/chunked/test* --vocab_path=./finished_files/vocab --log_root=./pretrained_model --max_enc_steps=400 --max_dec_steps=120 --coverage=1

But I am getting this warning and nothing happens: WARNING:tensorflow:Bucket input queue is empty when calling next_batch. Bucket queue size: 0, Input queue size: 400

Any help is appreciated, thanks!
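
One thing to check first: the command passes --data_path=./finishedfiles/chunked/test* but --vocab_path=./finished_files/vocab (finishedfiles vs finished_files), so make sure the glob really points at the chunked test files. That said, the warning itself reports 400 examples in the input queue and 0 in the bucket queue, which suggests examples are being read but the batching threads are not producing batches (this can happen when a batcher thread dies silently; look for a traceback earlier in the log). Below is a quick sketch to confirm the test data is readable at all, with the on-disk format assumed from the repo's data.example_generator (an 8-byte length prefix followed by a serialized tf.Example):

```python
import glob
import struct

import tensorflow as tf

# Read the first example from the first chunked test file and print the
# start of its article, to rule out path or file-format problems.
files = sorted(glob.glob("./finished_files/chunked/test*"))
if not files:
    raise FileNotFoundError("data_path glob matched no files")
with open(files[0], "rb") as reader:
    len_bytes = reader.read(8)                   # 8-byte length prefix
    str_len = struct.unpack("q", len_bytes)[0]   # native-endian int64
    example_str = struct.unpack("%ds" % str_len, reader.read(str_len))[0]
    example = tf.train.Example.FromString(example_str)
    # 'article' is the feature key used by this repo's data pipeline.
    print(example.features.feature["article"].bytes_list.value[0][:200])
```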

chmille3 commented 6 years ago

If I take out --max_enc_steps=400, I get this error instead:

INFO:tensorflow:Failed to load checkpoint from ./pretrained_model/train. Sleeping for 10 secs...
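
That message means the checkpoint loader found nothing under ./pretrained_model/train, the directory named in the log line (the script appends train/ to --log_root when loading). Here is a minimal check of what TensorFlow can actually see there; make sure the extracted pretrained-model files (the checkpoint index plus the model.ckpt-* files) really sit in that train/ subdirectory:

```python
import tensorflow as tf

# Ask TF what checkpoint, if any, is visible in the directory from the log.
ckpt_state = tf.train.get_checkpoint_state("./pretrained_model/train")
if ckpt_state is None:
    print("No checkpoint found in ./pretrained_model/train")
else:
    print("Latest checkpoint:", ckpt_state.model_checkpoint_path)
```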

liyingjiao02 commented 5 years ago

Did you find the cause of this problem? I am hitting it too.

chmille3 commented 5 years ago

It has been a while since I worked on this, but I believe the problem was a mismatch between the TensorFlow version I was using and the pretrained model. I used the pretrained model for TensorFlow 1.2.1 and got mine to work by installing that version of TensorFlow. I hope this helps.
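
If you go this route, it is worth confirming which release the interpreter actually picks up, since a virtualenv or system install can shadow the one you think you installed. Pinning it is just pip install tensorflow==1.2.1, and here is a one-line check (the 1.2.1 figure comes from the comment above):

```python
import tensorflow as tf

# Should print 1.2.1 when using the pretrained model discussed in this thread.
print(tf.__version__)
```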

TanyaChowdhury commented 5 years ago

Was anyone able to figure out why this is happening? Changing the TF version didn't work for me.

ghost commented 3 years ago

Have you solved it since? Changing the TF version didn't work for me either.

TanyaChowdhury commented 3 years ago

Yes, I think for me it came down to matching the exact TF version.