Lime-Cakes opened this issue 2 years ago
`batch_sampler` is currently not supported by `poptorch.DataLoader`, but you can use one with a stock `torch.utils.data.DataLoader`. You just need to make sure that each batch of indices the sampler returns matches the combined batch size expected by the PopTorch model.
Here is how the combined batch size is computed:
```python
self._combined_batch_size = batch_size * \
                            options.device_iterations * \
                            options.replication_factor * \
                            options.Training.gradient_accumulation
```
Source: https://github.com/graphcore/poptorch/blob/sdk-release-3.0/python/__init__.py#L278
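As an illustration, here is a minimal sketch of that approach: compute the combined batch size yourself, then hand a custom `BatchSampler` that yields exactly that many indices per batch to a stock `torch.utils.data.DataLoader`. The dataset and option values below are made up for the example; substitute the values from your own `poptorch.Options`.

```python
import torch
from torch.utils.data import BatchSampler, DataLoader, Dataset, RandomSampler

# Toy dataset, purely for illustration.
class SquareDataset(Dataset):
    def __len__(self):
        return 1024

    def __getitem__(self, idx):
        x = torch.tensor([float(idx)])
        return x, x * x

# Assumed values; take these from the same poptorch.Options you pass to
# your model, e.g. (roughly) opts.deviceIterations(8),
# opts.replicationFactor(2), opts.Training.gradientAccumulation(1).
batch_size = 4
device_iterations = 8
replication_factor = 2
gradient_accumulation = 1

# Mirrors the computation quoted above from the PopTorch source.
combined_batch_size = (batch_size
                       * device_iterations
                       * replication_factor
                       * gradient_accumulation)

dataset = SquareDataset()

# Each batch of indices the sampler yields must contain exactly
# combined_batch_size elements; drop_last=True avoids a short final
# batch, which would not match the shapes compiled for the IPU.
batch_sampler = BatchSampler(
    RandomSampler(dataset),
    batch_size=combined_batch_size,
    drop_last=True,
)
loader = DataLoader(dataset, batch_sampler=batch_sampler)

for x, y in loader:
    assert x.shape[0] == combined_batch_size  # 4 * 8 * 2 * 1 = 64
```

Note that this sketch only guarantees the batch dimension matches; whether the host-side batching it produces behaves sensibly with your sampling strategy (e.g. across replicas and device iterations) still needs to be checked against your model.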
Is it possible to use the dataloader with a custom `sampler`/`batch_sampler`? At the moment, I cannot find any useful information on using PopTorch's DataLoader with a custom sampler. Are there plans to support it, or is a custom sampler impossible due to the IPU design?
Edit: At the moment, using a custom `batch_sampler` results in the following error: