mlcommons / inference_policies

Issues related to MLPerf™ Inference policies, including rules and suggested changes
https://mlcommons.org/en/groups/inference/
Apache License 2.0

Benchmark duration is inconsistent with official website #201

Closed haowang5128 closed 3 years ago

haowang5128 commented 3 years ago

Hi, I was checking the https://mlcommons.org/en/inference-edge-07/ page and found that the duration requirement is '60s', which is inconsistent with https://github.com/mlcommons/inference_policies/blob/master/inference_rules.adoc#3-scenarios (600s).

tjablin commented 3 years ago

The web page you link is for MLPerf Inference 0.7. The rules have been updated for MLPerf Inference 1.0. The minimum duration increased from one to ten minutes between 0.7 and 1.0.
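To make the version difference concrete, here is a minimal sketch of what the change implies for query counts. The 60 s and 600 s figures come from the thread above; the dictionary and helper function are illustrative, not part of LoadGen or the official rules.

```python
# Minimum run duration per MLPerf Inference version, in seconds
# (60 s in v0.7, 600 s in v1.0 per the linked inference_rules.adoc).
MIN_DURATION_S = {"0.7": 60, "1.0": 600}

def min_query_count(version: str, target_qps: float) -> int:
    """Queries needed so a run at target_qps lasts at least the minimum duration.

    Hypothetical helper for illustration only.
    """
    return int(MIN_DURATION_S[version] * target_qps)

# A system sustaining 100 QPS must issue ten times as many queries under v1.0:
print(min_query_count("0.7", 100.0))  # 6000
print(min_query_count("1.0", 100.0))  # 60000
```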

haowang5128 commented 3 years ago

> The web page you link is for MLPerf Inference 0.7. The rules have been updated for MLPerf Inference 1.0. The minimum duration increased from one to ten minutes between 0.7 and 1.0.

Oh, I see now. Thank you!