mila-iqia / fuel

A data pipeline framework for machine learning
MIT License

[Feature Request] option to make batch size fixed #399

Open khaotik opened 7 years ago

khaotik commented 7 years ago

I see this in the SequentialScheme docstring:

 |  Notes
 |  -----
 |  The batch size isn't enforced, so the last batch could be smaller.

In libraries such as TensorFlow, where tensor shapes are static by nature, this behavior causes headaches.

dmitriy-serdyuk commented 7 years ago

Agreed. There should be an option to drop the last batch.
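
As a possible interim workaround, here is a minimal sketch (not part of Fuel) of a scheme that drops the trailing partial batch. It assumes that the batch size is stored as `self.batch_size` and that `get_request_iterator()` yields lists of example indices; both should be checked against the installed Fuel version.

```python
from fuel.schemes import SequentialScheme


class FixedBatchSequentialScheme(SequentialScheme):
    """Like SequentialScheme, but skip the final batch when it is
    smaller than ``batch_size``, so every batch has a fixed size."""

    def get_request_iterator(self):
        for request in super(FixedBatchSequentialScheme,
                             self).get_request_iterator():
            # Only yield full-size batches; a trailing partial batch is dropped.
            if len(request) == self.batch_size:
                yield request


# Hypothetical usage: with 1000 examples and batch_size=128,
# the last 104 examples would simply be skipped.
scheme = FixedBatchSequentialScheme(examples=1000, batch_size=128)
```

A cleaner fix would be a constructor flag (e.g. `drop_last=True`) on the batch schemes themselves, so the behavior is opt-in rather than requiring a subclass.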