Open · EricWWWW opened this issue 4 years ago
Hi, I'm a beginner at image captioning, and your code runs successfully on my PC without any bugs. But there is one thing in "models.py", line 136, I can't figure out: what does the variable `batch_size_t` in the for loop do?

`batch_size_t` is the number of captions whose length is greater than the current time step. In the for loop, only these captions keep propagating forward; in other words, captions whose length is no longer than the current time step stop propagating.
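To make this concrete, here is a minimal sketch of the pattern (not the repository's exact code: attention is omitted and the dimensions are made-up toy values). It assumes, as the tutorial does, that the batch is sorted by decreasing caption length, so the first `batch_size_t` rows at step `t` are exactly the captions still being generated.

```python
import torch
import torch.nn as nn

# Hypothetical toy dimensions, for illustration only.
batch_size, max_len, embed_dim, hidden_dim, vocab_size = 4, 5, 8, 16, 20

# Caption lengths sorted in descending order (as the tutorial requires).
decode_lengths = [5, 4, 2, 2]

embeddings = torch.randn(batch_size, max_len, embed_dim)  # embedded caption tokens
h = torch.zeros(batch_size, hidden_dim)                   # initial hidden state
c = torch.zeros(batch_size, hidden_dim)                   # initial cell state

lstm_cell = nn.LSTMCell(embed_dim, hidden_dim)
fc = nn.Linear(hidden_dim, vocab_size)

predictions = torch.zeros(batch_size, max(decode_lengths), vocab_size)

for t in range(max(decode_lengths)):
    # Number of captions whose length is still greater than the current step;
    # only these keep propagating forward, shorter captions simply drop out.
    batch_size_t = sum([l > t for l in decode_lengths])

    # Run the decoder only on the still-active captions (rows 0..batch_size_t-1).
    # Because batch_size_t never increases, slicing h and c is safe.
    h, c = lstm_cell(embeddings[:batch_size_t, t, :],
                     (h[:batch_size_t], c[:batch_size_t]))

    # Vocabulary scores for the active captions at this time step.
    predictions[:batch_size_t, t, :] = fc(h)
```

The effect is that the effective batch shrinks over time, so no computation (or loss) is spent on padded positions of captions that have already ended.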