chenqiny opened this issue 2 years ago
@chentong319 I recall you did work to get better line numbers. Is your work able to cover these cases? Thanks for looking into this.
I tried to get location information for resnet-50 model in onnx-model-zoo.
Command: onnx-mlir --preserveBitcode --preserveLocations --preserveMLIR /external/onnx-zoo/models/vision/classification/resnet/model/resnet50-caffe2-v1-7.onnx
Expect:
- User can get mlir files for each pass
To get mlir files for each pass, you need to use the MLIR flags --print-ir-before-all or --print-ir-after-all.
- User can get location information in mlir files. The location attributes indicate the source line number in the input mlir.
Yes. Previously, users could only get location information when the input was a .mlir file. Input from a .onnx file is now fixed as well.
- User can get location information in input mlir which indicates source line number/node id in onnx file (Optional)
The location information is the line number in the *.input.mlir file; there is no node id.
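To illustrate how those line-number locations can be consumed downstream, here is a small Python sketch that pulls loc(...) attributes out of an mlir dump. The IR excerpt, file name, and line numbers below are made up for the example; real dumps come from onnx-mlir --preserveMLIR.

```python
import re

# Made-up excerpt of a *.input.mlir dump; locations are file:line:col triples.
MLIR_TEXT = '''
%15 = "onnx.Conv"(%14) : (tensor<*xf32>) -> tensor<*xf32> loc("resnet50.input.mlir":120:8)
%16 = "onnx.Relu"(%15) {onnx_node_name = ""} : (tensor<*xf32>) -> tensor<*xf32> loc("resnet50.input.mlir":121:8)
'''

# Capture the op name and the (file, line, col) of its loc attribute.
LOC_RE = re.compile(r'"(onnx\.\w+)".*loc\("([^"]+)":(\d+):(\d+)\)')

def op_locations(text):
    """Map each onnx op occurrence to its (op, file, line) source location."""
    return [(m.group(1), m.group(2), int(m.group(3)))
            for m in LOC_RE.finditer(text)]

for op, fname, line in op_locations(MLIR_TEXT):
    print(f"{op} -> {fname}:{line}")
```

Note that in the dump shown in this issue the operations only carry the alias loc(#loc), so this kind of mapping is exactly what is lost when locations are not preserved per-op.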
- User can get listing information that includes llvm mlir source lines, instruction bytes, disassembly, and the instruction addresses generated from each line. With this information, I can map instructions in a perf sampling report back to the llvm mlir file, and I can also map hot instructions to the onnx file or to the mlir files from any pass.
gdb can now step through onnx operations. I think perf should be fine if it uses the same mechanism as gdb.
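The address-to-line mapping described above can be sketched as follows, assuming one has already extracted an address/line table from the compiled model (e.g. from objdump -dl output). All addresses, line numbers, and perf samples here are invented for illustration:

```python
import bisect

# Hypothetical address -> llvm mlir source line table, sorted by address.
ADDR_TABLE = [
    (0x401000, 118),  # start of the code generated for line 118
    (0x401040, 120),
    (0x4010a0, 121),
]

def line_for_address(addr):
    """Find the source line whose code region contains addr."""
    addrs = [a for a, _ in ADDR_TABLE]
    i = bisect.bisect_right(addrs, addr) - 1
    return ADDR_TABLE[i][1] if i >= 0 else None

# Invented perf sample addresses; count hits per source line.
samples = [0x401044, 0x4010a8, 0x4010b0]
hot = {}
for addr in samples:
    line = line_for_address(addr)
    hot[line] = hot.get(line, 0) + 1
print(hot)  # source line -> sample count
```

This is the same resolution step perf itself performs when DWARF line info is present, which is why preserved locations should make perf work without extra tooling.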
If possible, I also recommend outputting the pass information in a structured file such as json or txt.
onnx-mlir uses the MLIR pass management infrastructure, but we have not looked into MLIR's debugging support. For example, --preserveLocations should arguably be an MLIR flag rather than an onnx-mlir flag, so that it can control the behavior of --print-ir-before/after-all.
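For the json suggestion, a minimal sketch of what such a structured pass dump could look like. The field names and nesting are only an assumption; the pass names are taken from the pipeline printed in this issue:

```python
import json

# Hypothetical structured record of the pipeline; "scope" says whether a
# pass runs on the whole module or inside a 'builtin.func' pipeline.
passes = [
    {"name": "{anonymous}::DecomposeONNXToONNXPass", "scope": "builtin.func"},
    {"name": "{anonymous}::ShapeInferencePass", "scope": "builtin.func"},
    {"name": "Canonicalizer", "scope": "module"},
]

report = json.dumps({"passes": passes}, indent=2)
print(report)
```

A machine-readable form like this would let profiling or debugging tools correlate a dumped .mlir file with the pass that produced it, instead of parsing the flat pipeline string.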
Does onnx-mlir work with perf now? We can try that.
Actual result: there are only input and llvm mlir files.
In the mlir files, the location information looks like the following:
Input mlir:
%16 = "onnx.Relu"(%15) {onnx_node_name = ""} : (tensor<*xf32>) -> tensor<*xf32> loc(#loc)
llvm mlir:
Passes:
'builtin.func' Pipeline
  {anonymous}::DecomposeONNXToONNXPass
  {anonymous}::ShapeInferencePass
Canonicalizer
{anonymous}::ShapeInferencePass
'builtin.func' Pipeline
  {anonymous}::ConstPropONNXToONNXPass
{anonymous}::ONNXOpTransformPass
SymbolDCE
'builtin.func' Pipeline
  {anonymous}::ONNXPreKrnlVerifyPass
  {anonymous}::InstrumentONNXPass
  {anonymous}::FrontendToKrnlLoweringPass
Canonicalizer
'builtin.func' Pipeline
  {anonymous}::DisconnectKrnlDimFromAllocPass
  BufferDeallocation
  {anonymous}::KrnlEnableMemoryPoolPass
  {anonymous}::KrnlBundleMemoryPoolsPass
Canonicalizer
'builtin.func' Pipeline
  {anonymous}::KrnlOptimizeMemoryPoolsPass
Canonicalizer
'builtin.func' Pipeline
  {anonymous}::ConvertKrnlToAffinePass
ConvertVectorToSCF
ConvertAffineToStandard
SCFToStandard
{anonymous}::ConvertKrnlToLLVMPass
ReconcileUnrealizedCasts
Canonicalizer