cm run script "run mlperf inference generate-run-cmds _submission"
cm run script "detect os"
! cd /home/susie.sun
! call /home/susie.sun/CM/repos/mlcommons@cm4mlops/script/detect-os/run.sh from tmp-run.sh
! call "postprocess" from /home/susie.sun/CM/repos/mlcommons@cm4mlops/script/detect-os/customize.py
cm run script "detect cpu"
cm run script "detect os"
! cd /home/susie.sun
! call /home/susie.sun/CM/repos/mlcommons@cm4mlops/script/detect-os/run.sh from tmp-run.sh
! call "postprocess" from /home/susie.sun/CM/repos/mlcommons@cm4mlops/script/detect-os/customize.py
! cd /home/susie.sun
! call /home/susie.sun/CM/repos/mlcommons@cm4mlops/script/detect-cpu/run.sh from tmp-run.sh
! call "postprocess" from /home/susie.sun/CM/repos/mlcommons@cm4mlops/script/detect-cpu/customize.py
cm run script "get python3"
! load /home/susie.sun/CM/repos/local/cache/38444f88746d4e25/cm-cached-state.json
Path to Python: /home/susie.sun/anaconda3/envs/mlperf/bin/python3
Python version: 3.10.0
cm run script "get mlcommons inference src"
! load /home/susie.sun/CM/repos/local/cache/c6ee028bd2a34c49/cm-cached-state.json
Path to the MLPerf inference benchmark configuration file: /home/susie.sun/CM/repos/local/cache/2c8c91d452654dd5/inference/mlperf.conf
Path to MLPerf inference benchmark sources: /home/susie.sun/CM/repos/local/cache/2c8c91d452654dd5/inference
cm run script "get sut description"
! load /home/susie.sun/CM/repos/local/cache/1ddd481dde0648ee/cm-cached-state.json
cm run script "install pip-package for-cmind-python _package.tabulate"
! load /home/susie.sun/CM/repos/local/cache/1ed4900679914a1a/cm-cached-state.json
cm run script "get mlperf inference utils"
cm run script "get mlperf inference src"
! load /home/susie.sun/CM/repos/local/cache/c6ee028bd2a34c49/cm-cached-state.json
Path to the MLPerf inference benchmark configuration file: /home/susie.sun/CM/repos/local/cache/2c8c91d452654dd5/inference/mlperf.conf
Path to MLPerf inference benchmark sources: /home/susie.sun/CM/repos/local/cache/2c8c91d452654dd5/inference
! call "postprocess" from /home/susie.sun/CM/repos/mlcommons@cm4mlops/script/get-mlperf-inference-utils/customize.py
Using MLCommons Inference source from /home/susie.sun/CM/repos/local/cache/2c8c91d452654dd5/inference
Running loadgen scenario: Offline and mode: performance
cm run script "app mlperf inference generic _reference _resnet50 _tf _gpu _test _offline"
CM error: no scripts were found with above tags and variations
(mlperf) susie.sun@yizhu-R5300-G5:~$ cmr "run mlperf inference generate-run-cmds _submission" --quiet --submitter="MLCommons" --hw_name=default --model=resnet50 --implementation=reference --backend=tf --device=gpu --scenario=Offline --adr.compiler.tags=gcc --target_qps=1 --category=edge --division=open
CM error: no scripts were found with above tags and variations
variation tags ['reference', 'resnet50', 'tf', 'gpu', 'test', 'offline'] are not matching for the found script app-mlperf-inference with variations dict_keys(['cpp', 'mil', 'mlcommons-cpp', 'ctuning-cpp-tflite', 'tflite-cpp', 'reference', 'python', 'nvidia', 'mlcommons-python', 'reference,gptj', 'reference,sdxl', 'reference,dlrm-v2', 'reference,llama2-70b', 'reference,resnet50', 'reference,retinanet', 'reference,bert', 'nvidia-original', 'intel', 'intel-original', 'intel-original,gptj', 'intel-original,gptj,build-harness', 'qualcomm', 'kilt', 'kilt,qaic,resnet50', 'kilt,qaic,retinanet', 'kilt,qaic,bert-99', 'kilt,qaic,bert-99.9', 'intel-original,resnet50', 'intel-original,retinanet', 'intel-original,bert-99', 'intel-original,bert-99.9', 'intel-original,gptj-99', 'intel-original,gptj-99.9', 'resnet50', 'retinanet', '3d-unet-99', '3d-unet-99.9', '3d-unet', 'sdxl', 'llama2-70b', 'llama2-70b-99', 'llama2-70b-99.9', 'rnnt', 'rnnt,reference', 'gptj-99', 'gptj-99.9', 'gptj', 'gptj', 'bert', 'bert-99', 'bert-99.9', 'dlrm_', 'dlrm-v2-99', 'dlrm-v2-99.9', 'mobilenet', 'efficientnet', 'onnxruntime', 'tensorrt', 'tf', 'pytorch', 'ncnn', 'deepsparse', 'tflite', 'glow', 'tvm-onnx', 'tvm-pytorch', 'tvm-tflite', 'ray', 'cpu', 'cuda', 'rocm', 'qaic', 'tpu', 'fast', 'test', 'valid,retinanet', 'valid', 'quantized', 'fp32', 'float32', 'float16', 'bfloat16', 'int4', 'int8', 'uint8', 'offline', 'multistream', 'singlestream', 'server', 'power', 'batch_size.#', 'r2.1_default', 'r3.0_default', 'r3.1_default', 'r4.0_default']) !
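A likely cause, judging from the variation list in the error itself: `gpu` is not among the advertised variations, while the device variations that do appear are `cpu`, `cuda`, `rocm`, `qaic`, and `tpu`. A minimal sketch (not CM's actual matcher) of why the lookup fails:

```python
# Hypothetical illustration of the tag-matching failure above; this is
# NOT CM's internal matching code, just a set-difference check using
# names taken from the error message.

# Subset of the variations listed by app-mlperf-inference in the error
# (truncated; the device variations are the relevant part here).
available = {
    'reference', 'resnet50', 'tf', 'test', 'offline',
    'cpu', 'cuda', 'rocm', 'qaic', 'tpu',  # devices: note there is no 'gpu'
}

# The variation tags the failing command requested.
requested = {'reference', 'resnet50', 'tf', 'gpu', 'test', 'offline'}

# Every requested tag must exist as a variation; 'gpu' does not.
missing = requested - available
print(missing)  # {'gpu'}
```

So, assuming the `--device` flag maps directly to a device variation (as `--backend=tf` maps to `_tf`), rerunning with `--device=cuda` instead of `--device=gpu` may resolve the mismatch.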