jiazhihao / TASO

The Tensor Algebra SuperOptimizer for Deep Learning
Apache License 2.0

Successfully Run but Little Improvement #40

Open OuHangKresnik opened 4 years ago

OuHangKresnik commented 4 years ago

I think further substitutions could be generated if we can fix the "Cannot find input tensor" and "unsupported ONNX operator" issues. Below is the log:

    Found unsupported ONNX operator: LRN (Skipped)
    Cannot find input tensor for operator: name(Pad) type(Pad) (Skipped)
    Cannot find input tensor for operator: name(pooling) type(MaxPool) (Skipped)
    Cannot find input tensor for operator: name(convolution1) type(Conv) (Skipped)
    Cannot find input tensor for operator: name(activation1) type(Relu) (Skipped)
    Found unsupported ONNX operator: LRN (Skipped)
    Cannot find input tensor for operator: name(Pad1) type(Pad) (Skipped)
    Cannot find input tensor for operator: name(pooling1) type(MaxPool) (Skipped)
    Cannot find input tensor for operator: name(convolution2) type(Conv) (Skipped)
    Cannot find input tensor for operator: name(activation2) type(Relu) (Skipped)
    Found unsupported ONNX operator: LRN (Skipped)
    Found unsupported ONNX operator: Flatten (Skipped)
    Cannot find input tensor for operator: name(innerProduct) type(Gemm) (Skipped)
    Found unsupported ONNX operator: Flatten (Skipped)
    Cannot find input tensor for operator: name(innerProduct1) type(Gemm) (Skipped)
    cost[Conv2D]: i(1 3 128 128) w(20 3 5 5) s(1 1) p(1) cost(0.0544) total_cost(0.0544)
    cost[Activation]: mode(8) cost(0.0124) total_cost(0.0669)
    Cost metrics: exe_time(0.0669) flops(0.0871) memory_access(6.6384) kernel_launches(2)

    ===== Start Cost-Based Backtracking Search =====
    [0] cost = 0.0669 bestCost = 0.0669 candidates.size() = 0
    [1] cost = 0.0273 bestCost = 0.0273 candidates.size() = 0
    ===== Finish Cost-Based Backtracking Search =====

    cost[Conv2D]: i(1 3 128 128) w(20 3 5 5) s(1 1) p(1) cost(0.0273) total_cost(0.0273)
    Cost metrics: exe_time(0.0273) flops(0.0882) memory_access(5.4653) kernel_launches(1)
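
For reference, this log matches the output of TASO's documented ONNX flow. A minimal sketch of that flow is shown below; the file paths are placeholders, since the actual script and model are not attached to this issue:

    import onnx
    import taso

    # Load the ONNX model, run TASO's cost-based backtracking search,
    # and export the optimized graph. Paths are placeholders.
    old_model = taso.load_onnx("model.onnx")
    taso_graph = taso.optimize(old_model)
    new_model = taso.export_onnx(taso_graph)
    onnx.save(new_model, "model_taso.onnx")

With the skipped operators above (LRN, Flatten, Pad, and their downstream consumers), only the first Conv2D + Relu pair is actually loaded into the TASO graph, so the search has little to optimize.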
