Tensor-Reloaded / Convergence

Artificial Neural Network training convergence boosting by smart data ordering
Academic Free License v3.0

Feasibility trade-off technique #8

Open · simi2525 opened this issue 4 years ago

simi2525 commented 4 years ago

An interesting experiment we can run, one that is compatible with SOTA training techniques and architectures, so that we can compare the potential of the ordering technique in a real use case:

Perform shuffling of the dataset as normal. Separate the dataset into batches as normal. Then compute the perfect training order over the existing batches (a batch-level perfect order, instead of an instance-level one). This should be much more feasible, since the number of batches is small.
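The issue doesn't specify how the perfect order would be computed, so here is a minimal PyTorch sketch under one possible interpretation: exhaustively score every permutation of the batches by the full-dataset loss after a single epoch and keep the best one. The toy task, the one-epoch scoring criterion, and the `final_loss_after_epoch` helper are illustrative assumptions, not the project's actual method:

```python
import itertools

import torch
import torch.nn as nn

# Hypothetical toy regression task standing in for a real dataset.
torch.manual_seed(0)
X, y = torch.randn(64, 8), torch.randn(64, 1)

# 1. Shuffle the dataset as normal.
perm = torch.randperm(len(X))
X, y = X[perm], y[perm]

# 2. Separate into batches as normal (few batches keeps the search feasible).
batch_size = 16
batches = [(X[i:i + batch_size], y[i:i + batch_size])
           for i in range(0, len(X), batch_size)]

def final_loss_after_epoch(order):
    """Train a fresh model for one epoch over the batches in `order`
    and return the loss on the full dataset afterwards."""
    torch.manual_seed(0)  # identical init, so only the batch order differs
    model = nn.Linear(8, 1)
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.MSELoss()
    for xb, yb in order:
        opt.zero_grad()
        loss_fn(model(xb), yb).backward()
        opt.step()
    with torch.no_grad():
        return loss_fn(model(X), y).item()

# 3. Batch-level "perfect order": exhaustive search over the 4! = 24
#    permutations of batches, which would be hopeless at the instance level.
best_order = min(itertools.permutations(batches),
                 key=final_loss_after_epoch)
print("best final loss:", final_loss_after_epoch(best_order))
```

With b batches the exhaustive search costs b! training runs, so in practice it only works for very small b; a greedy or sampled search would be the natural fallback, but the point of the sketch is just that the search space shrinks from dataset-size factorial to batch-count factorial.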

Another important thing to notice is that the ordering should no longer interact with other techniques, for example random flipping/cropping augmentation (or even sum-augmentation), since we find the perfect order for a given series of batches rather than for individual instances.

This could be a useful compromise for making the approach feasible.