Closed — carlini closed this issue 4 years ago
If you try to run a parallel module over a batch of examples whose batch size isn't divisible by the number of devices, the code crashes deep in numpy land. Instead, it should throw a nicer error message.
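A minimal sketch of the kind of check that could produce a clearer error, assuming the parallel wrapper knows the local device count (the helper name `check_batch_divisible` is illustrative, not the library's API):

```python
import jax


def check_batch_divisible(batch, n_devices=None):
    """Raise a descriptive error if the leading batch dimension cannot be
    split evenly across the available devices (illustrative helper)."""
    n_devices = n_devices or jax.local_device_count()
    batch_size = batch.shape[0]
    if batch_size % n_devices != 0:
        raise ValueError(
            f"Batch size {batch_size} is not divisible by the number of "
            f"devices ({n_devices}); pad or trim the batch so it splits "
            f"evenly before calling the parallel module."
        )
    # Reshape to (n_devices, batch_size // n_devices, ...) for pmap-style dispatch.
    return batch.reshape((n_devices, batch_size // n_devices) + batch.shape[1:])
```

With a check like this at the entry point of the parallel call, a batch of 10 examples on 8 devices would fail immediately with a readable ValueError instead of a shape error from inside numpy.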