WeizhuoZhang-intel opened this issue 2 months ago
Hi @chuanqi129,
Currently, the guilty-commit inductor single-run script does not support any backend other than inductor. https://github.com/chuanqi129/inductor-tools/blob/main/scripts/modelbench/inductor_single_run.sh#L53-L58
```bash
Flag_extra=""
if [[ ${FREEZE} == "on" ]]; then
    export TORCHINDUCTOR_FREEZING=1
    echo "Testing with freezing on."
    Flag_extra="--freezing "
fi
```
Can we read the command-line parameters from `TABLE` in runner.py, so we don't have to update the script's options every time a new backend is enabled? https://github.com/pytorch/pytorch/blob/main/benchmarks/dynamo/runner.py#L64-L96
```python
TABLE = {
    "training": {
        "ts_nnc": "--training --speedup-ts ",
        "ts_nvfuser": "--training --nvfuser --speedup-dynamo-ts ",
        "eager": "--training --backend=eager ",
        "aot_eager": "--training --backend=aot_eager ",
        "cudagraphs": "--training --backend=cudagraphs ",
        "aot_nvfuser": "--training --nvfuser --backend=aot_ts_nvfuser ",
        "nvprims_nvfuser": "--training --backend=nvprims_nvfuser ",
        "inductor": "--training --inductor ",
        "inductor_no_cudagraphs": "--training --inductor --disable-cudagraphs ",
        "inductor_max_autotune": "--training --inductor --inductor-compile-mode max-autotune ",
        "inductor_max_autotune_no_cudagraphs": (
            "--training --inductor --inductor-compile-mode max-autotune-no-cudagraphs --disable-cudagraphs "
        ),
    },
    "inference": {
        "aot_eager": "--inference --backend=aot_eager ",
        "eager": "--inference --backend=eager ",
        "ts_nnc": "--inference --speedup-ts ",
        "ts_nvfuser": "--inference -n100 --speedup-ts --nvfuser ",
        "trt": "--inference -n100 --speedup-trt ",
        "ts_nvfuser_cudagraphs": "--inference --backend=cudagraphs_ts ",
        "inductor": "--inference -n50 --inductor ",
        "inductor_no_cudagraphs": "--inference -n50 --inductor --disable-cudagraphs ",
        "inductor_max_autotune": "--inference -n50 --inductor --inductor-compile-mode max-autotune ",
        "inductor_max_autotune_no_cudagraphs": (
            "--inference -n50 --inductor --inductor-compile-mode max-autotune-no-cudagraphs --disable-cudagraphs "
        ),
        "torchscript-onnx": "--inference -n5 --torchscript-onnx",
        "dynamo-onnx": "--inference -n5 --dynamo-onnx",
    },
}
```
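If the single-run script took its per-backend flags from `TABLE`, the hardcoded branches could collapse into a dictionary lookup. A minimal sketch of the idea (`flags_for` is a hypothetical helper, not part of runner.py, and only a trimmed copy of `TABLE` is inlined here for illustration):

```python
# Trimmed copy of runner.py's TABLE, for illustration only.
TABLE = {
    "inference": {
        "eager": "--inference --backend=eager ",
        "inductor": "--inference -n50 --inductor ",
        "inductor_no_cudagraphs": "--inference -n50 --inductor --disable-cudagraphs ",
    },
}

def flags_for(mode: str, backend: str) -> str:
    """Look up the benchmark flags for a (mode, backend) pair.

    An unknown backend fails loudly with the list of supported ones,
    instead of silently falling through to a default as a hardcoded
    shell branch would.
    """
    try:
        return TABLE[mode][backend].strip()
    except KeyError:
        supported = ", ".join(sorted(TABLE.get(mode, {})))
        raise ValueError(f"unknown backend {backend!r} for {mode!r}; supported: {supported}")

# Example: assemble a command line for an inference run.
cmd = f"python benchmarks/dynamo/torchbench.py {flags_for('inference', 'inductor')}"
print(cmd)
```

With this shape, enabling a new backend in the single-run script would only require it to exist as a key in `TABLE`, with no script changes.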
go ahead