LabeliaLabs / distributed-learning-contributivity

Simulate collaborative ML scenarios, experiment with multi-partner learning approaches, and measure the respective contributions of different datasets to model performance.
https://www.labelia.org
Apache License 2.0

What happens when we clip batch_size? #232

Open RomainGoussault opened 4 years ago

RomainGoussault commented 4 years ago

The batch size is clipped here --> https://github.com/SubstraFoundation/distributed-learning-contributivity/blob/b72fa98c0b4db45d368f577d0f6d1a861b1610c2/scenario.py#L584

So if the batch size is clipped, it means that sometimes we don't use all the data in the dataset, but we don't give any feedback to the user. We should detect when this happens and inform the user.
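For illustration, a minimal sketch of what such a check could look like. The helper name, the `MAX_BATCH_SIZE` value, and the logging setup below are hypothetical, not the actual code at scenario.py#L584:

```python
import logging

logger = logging.getLogger(__name__)

MAX_BATCH_SIZE = 20  # illustrative cap, not the project's actual value


def clipped_batch_size(dataset_size, batch_count):
    """Hypothetical helper: compute the batch size, clip it to MAX_BATCH_SIZE,
    and warn the user when clipping means part of the dataset goes unused."""
    raw_batch_size = dataset_size // batch_count
    batch_size = min(raw_batch_size, MAX_BATCH_SIZE)
    if batch_size < raw_batch_size:
        unused_samples = dataset_size - batch_size * batch_count
        logger.warning(
            "batch_size clipped from %d to %d: about %d of %d samples "
            "will not be used in each pass over the data",
            raw_batch_size, batch_size, unused_samples, dataset_size,
        )
    return batch_size
```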

bowni commented 4 years ago

arthurPignet commented 3 years ago

I think that MAX_BATCH_SIZE should be related to the available GPU memory, so it will be dataset-dependent.
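As a rough sketch of that idea (the `bytes_per_sample` estimate and the safety factor are assumptions, and querying memory through `nvidia-smi` is just one possible approach):

```python
import subprocess


def gpu_memory_bytes():
    """Total memory of GPU 0 in bytes, queried via nvidia-smi (assumes an NVIDIA GPU)."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.total", "--format=csv,noheader,nounits"]
    )
    return int(out.decode().splitlines()[0].strip()) * 1024 ** 2  # nvidia-smi reports MiB


def max_batch_size(bytes_per_sample, safety_factor=0.5):
    """Dataset-dependent cap: how many samples fit in a fraction of GPU memory.

    bytes_per_sample must account for the activations and gradients of one
    sample during training, so it depends on both the dataset and the model.
    """
    usable = int(gpu_memory_bytes() * safety_factor)
    return max(1, usable // bytes_per_sample)
```

On a shared machine one would probably want `memory.free` rather than `memory.total`, but the idea is the same.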