mlcommons / training_policies

Issues related to MLPerf™ training policies, including rules and suggested changes
https://mlcommons.org/en/groups/training

[HPC][OpenCatalyst] Missing HP rule for LR decay factor #494

Closed sparticlesteve closed 2 years ago

sparticlesteve commented 2 years ago

The learning rate decay factor was mistakenly left out of the HP table for the OpenCatalyst benchmark, but it is a very reasonable thing to tune and is usually allowed for other benchmarks.

I think we should add a rule saying this parameter is unconstrained for HPC v2.0, despite being past the freeze deadline.
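For context, here is a minimal sketch of how an LR decay factor typically enters training code, assuming a PyTorch-style multi-step schedule; the parameter names and milestone values are illustrative, not the benchmark's actual config keys:

```python
# Sketch only: the decay factor multiplies the learning rate at each milestone.
import torch

model = torch.nn.Linear(8, 1)                        # placeholder model
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

lr_decay_factor = 0.1                                 # the hyperparameter in question
scheduler = torch.optim.lr_scheduler.MultiStepLR(
    optimizer,
    milestones=[30, 45],                              # illustrative epochs
    gamma=lr_decay_factor,                            # LR is multiplied by this at each milestone
)

for epoch in range(60):
    # ... training and validation steps would go here ...
    scheduler.step()
```

Allowing submitters to tune this factor would parallel how other LR schedule parameters are already handled in the HP tables.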

azrael417 commented 2 years ago

I think it makes sense to make this parameter tunable. I do not see any big issues with adding it to the tunable parameters list at this point, since we are still early in the submission period.

sparticlesteve commented 2 years ago

Turns out this wasn't really forgotten, but was hidden in the rendered markdown due to a bug in the table! See #495