titu1994 / keras-one-cycle

Implementation of One-Cycle Learning rate policy (adapted from Fast.ai lib)
MIT License

Documentation has different initialization #26

Open rmcconke opened 3 years ago

rmcconke commented 3 years ago

The initialization in `clr.py` is:

`def __init__(self, num_samples, batch_size, max_lr, end_percentage=0.1, scale_percentage=None, maximum_momentum=0.95, minimum_momentum=0.85, verbose=True)`

The initialization in the documentation is:

`lr_manager = OneCycleLR(num_samples, num_epoch, batch_size, max_lr, end_percentage=0.1, scale_percentage=None, maximum_momentum=0.95, minimum_momentum=0.85)`

Because the actual `__init__` has no `num_epoch` parameter, following the documented call shifts every subsequent positional argument (`num_epoch` binds to `batch_size`, `batch_size` binds to `max_lr`, and so on), which causes unexpected behaviour in `compute_lr`.
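For reference, a minimal sketch of the mismatch and a keyword-argument construction that matches the actual `clr.py` signature. The import path and all numeric values below are my own assumptions for illustration, not taken from the repo:

```python
# Sketch only: assumes clr.py is importable as `clr`; values are illustrative.
from clr import OneCycleLR

num_samples, num_epoch, batch_size, max_lr = 50000, 20, 128, 0.1

# Documented call (broken): the real signature has no num_epoch parameter,
# so num_epoch silently binds to batch_size and batch_size binds to max_lr,
# leaving compute_lr working with the wrong values.
# lr_manager = OneCycleLR(num_samples, num_epoch, batch_size, max_lr)

# Call matching the actual clr.py signature, with keywords to guard
# against any further drift between the code and the documentation:
lr_manager = OneCycleLR(
    num_samples=num_samples,
    batch_size=batch_size,
    max_lr=max_lr,
    end_percentage=0.1,
    scale_percentage=None,
    maximum_momentum=0.95,
    minimum_momentum=0.85,
    verbose=True,
)
```

Constructed this way, the object can then be passed to `model.fit(..., callbacks=[lr_manager])` as usual for a Keras callback.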