class DeepMojiBatchSampler(object):
"""A Batch sampler that enables larger epochs on small datasets and
has upsampling functionality.
# Arguments:
y_in: Labels of the dataset.
batch_size: Batch size.
epoch_size: Number of samples in an epoch.
upsample: Whether upsampling should be done. This flag should only be
set for binary classification problems.
seed: Random number generator seed.
# __iter__ output:
iterator of lists (batches) of indices in the dataset
"""
I read this in your Medium post (https://medium.com/huggingface/understanding-emotions-from-keras-to-pytorch-3ccb61d5a983). Is there any important underlying logic here that can be applied to the concept of upsampling in general?
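To make the question concrete, here is a minimal sketch of how I understand per-batch upsampling for a binary problem. The function name `balanced_batches` and the choice to sample with replacement are my assumptions, not necessarily what `DeepMojiBatchSampler` does internally:

```python
import numpy as np

def balanced_batches(y, batch_size, epoch_size, seed=42):
    """Sketch of binary upsampling: draw half of each batch from the
    positive indices and half from the negative indices, sampling with
    replacement so the minority class is effectively upsampled."""
    rng = np.random.RandomState(seed)
    pos = np.where(y == 1)[0]
    neg = np.where(y == 0)[0]
    n_batches = epoch_size // batch_size
    half = batch_size // 2
    for _ in range(n_batches):
        batch = np.concatenate([
            rng.choice(pos, half, replace=True),              # minority class repeats
            rng.choice(neg, batch_size - half, replace=True),
        ])
        rng.shuffle(batch)  # avoid a fixed positive/negative ordering within the batch
        yield batch.tolist()
```

As I understand it, the key property is that each batch is roughly class-balanced regardless of the dataset's class ratio, and sampling with replacement lets an epoch contain more samples than the dataset itself, which would match the "larger epochs on small datasets" wording in the docstring.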