mlcommons / inference

Reference implementations of MLPerf™ inference benchmarks
https://mlcommons.org/en/groups/inference
Apache License 2.0

Equal Issue mode only implemented for Offline #1082

Closed nv-jinhosuh closed 2 years ago

nv-jinhosuh commented 2 years ago

We have an equal issue mode implementation for the Offline scenario via PR #1032 (https://github.com/mlcommons/inference/pull/1032), and it works flawlessly for 3D-UNet Offline runs.

For the 3D-UNet SingleStream scenario, however, LoadGen is missing equal issue mode support, and this is problematic, as shown in the attached screenshot.

It is important for LoadGen to support equal issue mode for scenarios other than Offline. For v2.0, we probably want to add support for 3D-UNet SingleStream specifically; after the submission we can make the implementation more generic and unified.

[Screenshot 2022-02-08 121340]
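To illustrate the idea (not the actual LoadGen implementation): equal issue mode pads the issue schedule so that every loaded sample is issued the same number of times, rather than letting a query count that is not a multiple of the sample count over-represent some samples. This matters for 3D-UNet, where per-sample runtimes vary widely. A minimal sketch, with a hypothetical `equal_issue_schedule` helper:

```python
import math

def equal_issue_schedule(sample_ids, min_query_count):
    """Hypothetical sketch of equal issue mode: repeat the loaded
    samples as whole passes, rounding the query count up to the next
    multiple of the sample count so every sample is issued equally."""
    n = len(sample_ids)
    repeats = math.ceil(min_query_count / n)  # whole passes needed
    return [s for _ in range(repeats) for s in sample_ids]

# 3 samples but 7 queries requested: padded to 9, each sample issued 3x
schedule = equal_issue_schedule([0, 1, 2], min_query_count=7)
```

Without this padding, a naive 7-query schedule over 3 samples would issue two samples three times and one sample only twice, skewing measured latency toward whichever samples happen to repeat.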

nvpohanh commented 2 years ago

@ashwin @pgmpablo157321 @psyhtest for visibility