Closed: Nimrais closed this issue 3 years ago
I think in the current implementation it's not a bug, it's a feature :) Batch overfit is functionality that lets you quickly check whether the model can learn from the same data.
Hi! Thank you for your contribution! Please re-check all issue template checklists: unfilled issues will be closed automatically. And do not forget to join our Slack for collaboration.
We should still count the sample size the right way ;)
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
should be fixed ;)
🐛 Bug Report
runner.train(..., overfit=True)
runner.loader_sample_len is currently wrong: it reports the sample length of the original loader, but it should be the first batch size multiplied by the number of batches.
How To Reproduce
Take any loader, call runner.train with overfit=True, and then check the train loader's sample length.
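A rough reproduction sketch, assuming the Catalyst SupervisedRunner API (the toy model, data, and loss here are hypothetical; only runner.train(..., overfit=True) and runner.loader_sample_len come from this report, and argument names may differ between versions):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from catalyst import dl

# Hypothetical toy data: 100 samples, batch_size=32 -> batches of 32, 32, 32, 4.
X, y = torch.randn(100, 8), torch.randn(100, 1)
loaders = {"train": DataLoader(TensorDataset(X, y), batch_size=32)}

model = nn.Linear(8, 1)
runner = dl.SupervisedRunner()
runner.train(
    model=model,
    criterion=nn.MSELoss(),
    optimizer=torch.optim.Adam(model.parameters()),
    loaders=loaders,
    num_epochs=1,
    overfit=True,  # replay only the first batch, as described in this report
)

# Bug: this still reports the original loader's sample length (100 here)
# instead of first batch size * number of batches.
print(runner.loader_sample_len)
```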
Expected behavior
The reported sample length should be the first batch size multiplied by the number of batches.
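A minimal sketch of that computation on a hypothetical toy loader (plain PyTorch, independent of the runner):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# 100 samples with batch_size=32 -> 4 batches of sizes 32, 32, 32, 4.
loader = DataLoader(TensorDataset(torch.randn(100, 8)), batch_size=32)

first_batch_size = next(iter(loader))[0].shape[0]  # 32
num_batches = len(loader)                          # 4

# With overfit=True the first batch is replayed for every batch index,
# so the sample length per epoch should be:
expected_sample_len = first_batch_size * num_batches  # 32 * 4 = 128
print(expected_sample_len)  # 128, not the original 100
```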
Environment
Not needed
Checklist
FAQ
Please review the FAQ before submitting an issue: