mlcommons / training_policies

Issues related to MLPerf™ training policies, including rules and suggested changes
https://mlcommons.org/en/groups/training
Apache License 2.0

Merge HP constraint table into rules #346

Closed. petermattson closed this issue 4 years ago.

petermattson commented 4 years ago

The plan is to merge the HP constraint table into the rules, which will close a variety of other issues. Those other issues will be closed as duplicates of this one for tracking.

petermattson commented 4 years ago

Expected text, replacing all other HP text:

CLOSED:

By default, the hyperparameters must be the same as the reference.

Hyperparameters include the choice of optimizer and regularization terms such as norms and weight decays.

The implementation of the optimizer must match the optimizer specified in the Appendix: Allowed Optimizer. The Appendix lists which optimizers in the popular deep learning frameworks are compliant by default. If a submission uses an alternate implementation, the submitter must describe the optimizer's equation and demonstrate equivalence with the approved optimizers on that list.
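For illustration only (not part of the proposed rule text): an equivalence argument for an alternate implementation might write out the update rule explicitly and show that it reduces to one of the approved optimizers. A sketch for momentum SGD with L2 weight decay, assuming the formulation common in popular frameworks:

```latex
% Illustrative only: momentum SGD with L2 weight decay, as commonly
% implemented in frameworks (mu: momentum, eta: learning rate, lambda: weight decay).
\begin{align*}
  g_t          &= \nabla_{\theta} L(\theta_t) + \lambda \theta_t \\
  v_{t+1}      &= \mu v_t + g_t \\
  \theta_{t+1} &= \theta_t - \eta\, v_{t+1}
\end{align*}
```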

The following table lists all tunable hyperparameters, the exact identifier used in the reference for each, and the constraint on its value. This table is the definitive source of truth.

[ Table ]
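For illustration only (not part of the proposed rule text), a minimal sketch of how such a constraint table could be checked mechanically. The hyperparameter names, reference values, and constraints below are hypothetical, not the actual table contents:

```python
# Hypothetical sketch: validating submitted hyperparameters against a
# constraint table. Names, values, and constraints are illustrative only.

REFERENCE_HPS = {
    "opt_base_learning_rate": 0.001,
    "opt_weight_decay": 0.01,
}

# Each entry maps a hyperparameter identifier to a predicate over the submitted value.
CONSTRAINTS = {
    # Tunable: any positive value is allowed.
    "opt_base_learning_rate": lambda v: v > 0,
    # Fixed: must match the reference exactly.
    "opt_weight_decay": lambda v: v == REFERENCE_HPS["opt_weight_decay"],
}

def check_submission(submitted: dict) -> list:
    """Return a list of human-readable violations (empty if compliant)."""
    violations = []
    for name, value in submitted.items():
        if name not in CONSTRAINTS:
            violations.append(f"{name}: not a recognized tunable hyperparameter")
        elif not CONSTRAINTS[name](value):
            violations.append(f"{name}={value}: violates its constraint")
    return violations

if __name__ == "__main__":
    print(check_submission({"opt_base_learning_rate": 0.002, "opt_weight_decay": 0.02}))
    # -> ['opt_weight_decay=0.02: violates its constraint']
```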

bitfort commented 4 years ago

SWG:

People think this looks good!