mlcommons / inference_policies

Issues related to MLPerf™ Inference policies, including rules and suggested changes
https://mlcommons.org/en/groups/inference/

min duration #239

Open saareliad opened 2 years ago

saareliad commented 2 years ago

Hi, the rules state that min duration is 600 seconds for all workloads (I was looking at datacenter), while in practice it is 60 for most of them. https://github.com/mlcommons/inference_policies/blob/master/inference_rules.adoc#3-scenarios

E.g., looking at https://github.com/mlcommons/inference_results_v2.0/search?q=60000 : mlperf.conf shows 600, user.conf overrides it with 60, the rules still say 600, and the official results page shows 60 as well.
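For context, a minimal sketch of the two configuration layers being compared here, assuming the usual `model.scenario.key = value` format LoadGen reads (values are in milliseconds; the wildcard key and the bert override are illustrative, not quotes from the linked files):

```
# mlperf.conf -- default shipped with the benchmark: 600 s
*.*.min_duration = 600000

# user.conf -- per-submission override seen in the v2.0 results: 60 s
bert.Server.min_duration = 60000
```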

nv-ananjappa commented 2 years ago

@saareliad It should be 600. Could you point out one example file where it is 60?

saareliad commented 2 years ago

The first result from the aforementioned search link:

https://github.com/mlcommons/inference_results_v2.0/blob/8fcd68065d54033cc0aa8e83d931907aacfb8c02/closed/GIGABYTE/measurements/GIGABYTE-G492-ID0_A100-SXM-80GBx8_TRT/bert-99/Server/user.conf

And there are many more like this, plus the official results website.