saareliad opened this issue 2 years ago
Hi, the rules show that the min duration is 600 for all workloads (I was looking at datacenter), while it should be 60 for most of them: https://github.com/mlcommons/inference_policies/blob/master/inference_rules.adoc#3-scenarios
E.g., looking at https://github.com/mlcommons/inference_results_v2.0/search?q=60000: mlperf.conf shows 600, user.conf overrides it with 60, the rules still show 600, and the official results page shows 60 as well.
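For context, LoadGen reads defaults from mlperf.conf and then applies user.conf on top, with user.conf entries taking precedence. Durations in these files are in milliseconds, so 600000 is 600 seconds and 60000 is 60 seconds. Below is a minimal sketch of the override being discussed, assuming the standard `model.scenario.key = value` conf syntax; the bert/Server key is taken from the linked example, and exact entries vary per submission:

```
# mlperf.conf (shipped with the inference repo) sets the default
# for all models and scenarios, in milliseconds (600 s):
*.*.min_duration = 600000

# user.conf in a submission can override it for a specific workload,
# e.g. bert in the Server scenario (60 s; this is what the
# q=60000 search above finds):
bert.Server.min_duration = 60000
```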
@saareliad It should be 600. Could you point out one example file where it is 60?
The first thing that came up in the aforementioned search:
https://github.com/mlcommons/inference_results_v2.0/blob/8fcd68065d54033cc0aa8e83d931907aacfb8c02/closed/GIGABYTE/measurements/GIGABYTE-G492-ID0_A100-SXM-80GBx8_TRT/bert-99/Server/user.conf
And there are a lot more like this, plus the official results website.