Xilinx / inference-server

https://xilinx.github.io/inference-server/
Apache License 2.0

Run MLCommons benchmarks #120

Closed: varunsh-xilinx closed this issue 1 year ago

varunsh-xilinx commented 1 year ago

What is needed to connect the inference server to run standard MLCommons benchmarks and/or submit results to MLCommons? These benchmark results would give us a useful metric for identifying where improvement is needed.
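For context, MLCommons (MLPerf) benchmarks drive a system-under-test (SUT) through LoadGen, which issues query samples and expects a completion callback per sample. The sketch below mocks that control flow in plain Python so the shape of the integration is visible; it does not use the real `mlperf_loadgen` bindings, and `server_infer` is a hypothetical stand-in for a request to the inference server, not its actual API.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class QuerySample:
    """Mirrors LoadGen's query sample: an opaque id plus a dataset index."""
    id: int
    index: int


def server_infer(sample_index: int) -> bytes:
    # Hypothetical stand-in for sending sample `sample_index` to the
    # inference server (e.g. over REST/gRPC) and reading back the result.
    return f"prediction-for-sample-{sample_index}".encode()


def make_issue_queries(
    complete: Callable[[int, bytes], None]
) -> Callable[[List[QuerySample]], None]:
    # SUT callback: LoadGen hands us a batch of samples; we run inference
    # and report each result back (in real code, lg.QuerySamplesComplete).
    def issue_queries(samples: List[QuerySample]) -> None:
        for s in samples:
            complete(s.id, server_infer(s.index))
    return issue_queries


# Mock of the LoadGen test loop (in real code, lg.StartTest on a SUT/QSL):
responses = {}
issue_queries = make_issue_queries(lambda qid, data: responses.setdefault(qid, data))
issue_queries([QuerySample(id=i, index=i) for i in range(3)])
# responses now maps each query id to the server's prediction bytes.
```

A real harness would replace the mock loop with `mlperf_loadgen`'s `ConstructSUT`/`ConstructQSL`/`StartTest` and point `server_infer` at the server's client library.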

varunsh-xilinx commented 1 year ago

Completed by #197