Open ghimireadarsh opened 3 years ago
Steps per epoch should be equal to `samples // batch_size`.
I have the same question. When I use `batch_size = 32`, it gives me this error:
```
63/100 [=================>............] - ETA: 7s - loss: 0.6931 - acc: 0.5150
WARNING:tensorflow:Your input ran out of data; interrupting training. Make sure that your dataset or generator can generate at least `steps_per_epoch * epochs` batches (in this case, 10000 batches). You may need to use the repeat() function when building your dataset.
```
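To make the warning concrete, here is a minimal pure-Python sketch of why a one-pass generator runs dry. The sample count (2016) is a hypothetical number chosen so that exactly 63 full batches of 32 are available, matching the `63/100` point where training stopped; the real dataset size is not stated in the issue.

```python
import itertools

batch_size = 32
steps_per_epoch = 100
epochs = 1
n_samples = 2016  # hypothetical: yields exactly 63 full batches of 32

def batch_generator():
    # A plain Python generator is exhausted after one pass over the data,
    # which is what triggers the "ran out of data" warning.
    for start in range(0, n_samples - batch_size + 1, batch_size):
        yield list(range(start, start + batch_size))

available = sum(1 for _ in batch_generator())
needed = steps_per_epoch * epochs
print(available, needed)  # 63 available, but 100 requested

# The repeat()-style fix: cycle the generator so it never runs dry.
endless = itertools.chain.from_iterable(
    batch_generator() for _ in itertools.count()
)
drawn = sum(1 for _ in zip(range(needed), endless))
print(drawn)  # 100 batches drawn without exhaustion
```

With `tf.data`, calling `.repeat()` on the dataset plays the same role as the `itertools` cycling above: the pipeline yields batches indefinitely, so `fit()` can always draw `steps_per_epoch` batches per epoch.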
Why `batch_size = 32`? Should it not be 20, since `steps_per_epoch = 100`? There is an issue when using the generator this way: the `batch_size` does not match the amount of data being generated, so the current Keras generator requires `repeat()` to keep supplying batches. Correct me if I am wrong.
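The two quantities are linked by floor division, which the following sketch spells out. The dataset size (2016) is again a hypothetical stand-in, since the issue does not state the real sample count.

```python
n_samples = 2016   # hypothetical dataset size
batch_size = 32

# Number of full batches one pass over the data provides; this is the
# largest steps_per_epoch that avoids the "ran out of data" warning
# without repeat().
steps_per_epoch = n_samples // batch_size
print(steps_per_epoch)  # 63

# Conversely, if steps_per_epoch is fixed at 100, the batch size that
# consumes the data in exactly one pass would be:
implied_batch_size = n_samples // 100
print(implied_batch_size)  # 20 -- the figure asked about above
```

So either quantity can be derived from the other: fix `batch_size` and compute `steps_per_epoch = samples // batch_size`, or fix `steps_per_epoch` and size the batches accordingly.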