mlcommons / inference_results_v2.1

This repository contains the results and code for the MLPerf™ Inference v2.1 benchmark.
https://mlcommons.org/en/inference-datacenter-21/
Apache License 2.0

How can I run the Offline benchmark with MPS enabled? #10

Open emptyinteger opened 11 months ago

emptyinteger commented 11 months ago

I want to compare performance between the full GPU and an MPS-sliced GPU, but the Offline benchmark does not run when MPS is enabled.
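For context, a minimal sketch of how such a comparison might be attempted, assuming NVIDIA's MLPerf submission harness with its `make run` entry point; the benchmark name, the SM percentage, and the exact `RUN_ARGS` flags are illustrative assumptions, not confirmed against this repository:

```shell
# Start the NVIDIA MPS control daemon (requires appropriate GPU compute mode)
nvidia-cuda-mps-control -d

# Optionally cap the fraction of SMs visible to MPS clients
# (50 is an illustrative value for the "sliced" configuration)
export CUDA_MPS_ACTIVE_THREAD_PERCENTAGE=50

# Launch the Offline scenario through the harness
# (--benchmarks value is an example; adjust to the workload being compared)
make run RUN_ARGS="--benchmarks=resnet50 --scenarios=Offline"

# Shut down the MPS daemon when finished
echo quit | nvidia-cuda-mps-control
```

Running the same `make run` command once without the MPS daemon (full GPU) and once with it active would give the two data points being compared.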