I checked these configurations, and there was no error on my platform.
Is there any configuration with which you can run the CNN successfully? (That would show whether the problem is related to the configuration.)
Please send the output log for one of the configurations so I can track down the error.
PS: Also, since your platform has 4 big and 4 small CPU cores, I suggest using --threads=4 --threads2=4 --total_cores=8 in your configurations to see if it works.
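For reference, a minimal sketch of such a run (it reuses the binary path, flags, and partition points already shown in your logs; the trailing "2>&1 | tee resnet50_run.log" and the file name resnet50_run.log are only an illustrative way to capture the full output to a file):
sudo LD_LIBRARY_PATH=/home/ARMCL-pipe-all/build /home/ARMCL-pipe-all/build/examples/graph_resnet50_all_pipe_sync --threads=4 --threads2=4 --total_cores=8 --partition_point=13 --partition_point2=15 --order=L-B-G --n=50 2>&1 | tee resnet50_run.log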
Sample segfault:
sudo LD_LIBRARY_PATH=/home/ARMCL-pipe-all/build /home/ARMCL-pipe-all/build/examples/graph_resnet50_all_pipe_sync --threads=4 --threads2=4 --total_cores=8 --partition_point=13 --partition_point2=15 --order=L-B-G --n=50
/home/ARMCL-pipe-all/build/examples/graph_resnet50_all_pipe_sync
Threads : 4
Small Cores Threads : 4
Target : Neon
Data type : F32
Data layout : NHWC
Tuner enabled? : false
Cache enabled? : false
Tuner mode : Normal
Tuner file :
MLGO file :
Fast math enabled? : false
Image file :
Labels file : transfer_wait
Partition point is : 13
Second partition point is : 15
Order is : L-B-G
Total number of cores is : 8
Run network for 50 times.
[GRAPH][21-08-2022 10:49:43][INFO] Running mutating pass : NodeFusionMutator
[GRAPH][21-08-2022 10:49:43][INFO] Running mutating pass : GroupedConvolutionMutator
[GRAPH][21-08-2022 10:49:43][INFO] Running mutating pass : InPlaceOperationMutator
[GRAPH][21-08-2022 10:49:43][INFO] Running mutating pass : DepthConcatSubTensorMutator
[GRAPH][21-08-2022 10:49:43][INFO] Running mutating pass : SplitLayerSubTensorMutator
[GRAPH][21-08-2022 10:49:43][INFO] Running mutating pass : NodeExecutionMethodMutator
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated conv1/convolution+conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 3x224x224 Weights shape: 3x7x7x64 Output shape: 64x112x112 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated pool1/MaxPool Type: PoolingLayer Target: Neon Data Type: F32 Input shape: 64x112x112 Output shape: 64x56x56 Pooling info: MAX
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit1/bottleneck_v1/conv1/convolution+block1/unit1/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit1/bottleneck_v1/conv2/convolution+block1/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x3x3x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit1/bottleneck_v1/conv3/convolution+block1/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x256 Output shape: 256x56x56
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit1/bottleneck_v1/shortcut/convolution+block1/unit1/bottleneck_v1/shortcut/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x256 Output shape: 256x56x56
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit1/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 256x56x56
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit1/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 256x56x56 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit2/bottleneck_v1/conv1/convolution+block1/unit2/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x56x56 Weights shape: 256x1x1x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit2/bottleneck_v1/conv2/convolution+block1/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x3x3x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit2/bottleneck_v1/conv3/convolution+block1/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x256 Output shape: 256x56x56
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit2/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 256x56x56
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit2/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 256x56x56 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit3/bottleneck_v1/shortcut/MaxPool Type: PoolingLayer Target: Neon Data Type: F32 Input shape: 256x56x56 Output shape: 256x28x28 Pooling info: MAX
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit3/bottleneck_v1/conv1/convolution+block1/unit3/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x56x56 Weights shape: 256x1x1x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit3/bottleneck_v1/conv2/convolution+block1/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x3x3x64 Output shape: 64x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit3/bottleneck_v1/conv3/convolution+block1/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x28x28 Weights shape: 64x1x1x256 Output shape: 256x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit3/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 256x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block1/unit3/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 256x28x28 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit1/bottleneck_v1/conv1/convolution+block2/unit1/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x28x28 Weights shape: 256x1x1x128 Output shape: 128x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit1/bottleneck_v1/conv2/convolution+block2/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit1/bottleneck_v1/conv3/convolution+block2/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x28x28 Weights shape: 128x1x1x512 Output shape: 512x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit1/bottleneck_v1/shortcut/convolution+block2/unit1/bottleneck_v1/shortcut/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x28x28 Weights shape: 256x1x1x512 Output shape: 512x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit1/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 512x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit1/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 512x28x28 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit2/bottleneck_v1/conv1/convolution+block2/unit2/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 512x28x28 Weights shape: 512x1x1x128 Output shape: 128x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit2/bottleneck_v1/conv2/convolution+block2/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit2/bottleneck_v1/conv3/convolution+block2/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x28x28 Weights shape: 128x1x1x512 Output shape: 512x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit2/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 512x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit2/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 512x28x28 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit3/bottleneck_v1/conv1/convolution+block2/unit3/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 512x28x28 Weights shape: 512x1x1x128 Output shape: 128x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit3/bottleneck_v1/conv2/convolution+block2/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit3/bottleneck_v1/conv3/convolution+block2/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x28x28 Weights shape: 128x1x1x512 Output shape: 512x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit3/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 512x28x28
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit3/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 512x28x28 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit4/bottleneck_v1/shortcut/MaxPool Type: PoolingLayer Target: Neon Data Type: F32 Input shape: 512x28x28 Output shape: 512x14x14 Pooling info: MAX
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit4/bottleneck_v1/conv1/convolution+block2/unit4/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 512x28x28 Weights shape: 512x1x1x128 Output shape: 128x28x28 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit4/bottleneck_v1/conv2/convolution+block2/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit4/bottleneck_v1/conv3/convolution+block2/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 128x14x14 Weights shape: 128x1x1x512 Output shape: 512x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit4/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 512x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block2/unit4/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 512x14x14 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit1/bottleneck_v1/conv1/convolution+block3/unit1/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 512x14x14 Weights shape: 512x1x1x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit1/bottleneck_v1/conv2/convolution+block3/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit1/bottleneck_v1/conv3/convolution+block3/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit1/bottleneck_v1/shortcut/convolution+block3/unit1/bottleneck_v1/shortcut/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 512x14x14 Weights shape: 512x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit1/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit1/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 1024x14x14 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit2/bottleneck_v1/conv1/convolution+block3/unit2/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit2/bottleneck_v1/conv2/convolution+block3/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit2/bottleneck_v1/conv3/convolution+block3/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit2/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit2/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 1024x14x14 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit3/bottleneck_v1/conv1/convolution+block3/unit3/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit3/bottleneck_v1/conv2/convolution+block3/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit3/bottleneck_v1/conv3/convolution+block3/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit3/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit3/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 1024x14x14 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit4/bottleneck_v1/conv1/convolution+block3/unit4/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit4/bottleneck_v1/conv2/convolution+block3/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit4/bottleneck_v1/conv3/convolution+block3/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit4/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit4/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 1024x14x14 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit5/bottleneck_v1/conv1/convolution+block3/unit5/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit5/bottleneck_v1/conv2/convolution+block3/unit5/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit5/bottleneck_v1/conv3/convolution+block3/unit5/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit5/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[GRAPH][21-08-2022 10:49:43][INFO] Instantiated block3/unit5/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 1024x14x14 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:49:43][INFO] Running mutating pass : NodeFusionMutator
Segmentation fault
Another sample segfault:
sudo LD_LIBRARY_PATH=/home/ARMCL-pipe-all/build /home/ARMCL-pipe-all/build/examples/graph_resnet50_all_pipe_sync --threads=4 --threads2=4 --total_cores=8 --partition_point=1 --partition_point2=3 --order=B-L-G --n=50
/home/ARMCL-pipe-all/build/examples/graph_resnet50_all_pipe_sync
Threads : 4
Small Cores Threads : 4
Target : Neon
Data type : F32
Data layout : NHWC
Tuner enabled? : false
Cache enabled? : false
Tuner mode : Normal
Tuner file :
MLGO file :
Fast math enabled? : false
Image file :
Labels file : transfer_wait
Partition point is : 1
Second partition point is : 3
Order is : B-L-G
Total number of cores is : 8
Run network for 50 times.
[GRAPH][21-08-2022 10:53:21][INFO] Running mutating pass : NodeFusionMutator
[GRAPH][21-08-2022 10:53:21][INFO] Running mutating pass : GroupedConvolutionMutator
[GRAPH][21-08-2022 10:53:21][INFO] Running mutating pass : InPlaceOperationMutator
[GRAPH][21-08-2022 10:53:21][INFO] Running mutating pass : DepthConcatSubTensorMutator
[GRAPH][21-08-2022 10:53:21][INFO] Running mutating pass : SplitLayerSubTensorMutator
[GRAPH][21-08-2022 10:53:21][INFO] Running mutating pass : NodeExecutionMethodMutator
[GRAPH][21-08-2022 10:53:21][INFO] Instantiated conv1/convolution+conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 3x224x224 Weights shape: 3x7x7x64 Output shape: 64x112x112 RELU
[GRAPH][21-08-2022 10:53:21][INFO] Instantiated pool1/MaxPool Type: PoolingLayer Target: Neon Data Type: F32 Input shape: 64x112x112 Output shape: 64x56x56 Pooling info: MAX
[GRAPH][21-08-2022 10:53:21][INFO] Running mutating pass : NodeFusionMutator
Segmentation fault
Configuration that runs properly:
sudo LD_LIBRARY_PATH=/home/ARMCL-pipe-all/build /home/ARMCL-pipe-all/build/examples/graph_resnet50_all_pipe_sync --threads=4 --threads2=4 --total_cores=8 --partition_point=2 --partition_point2=4 --order=B-L-G --n=50
/home/ARMCL-pipe-all/build/examples/graph_resnet50_all_pipe_sync
Threads : 4
Small Cores Threads : 4
Target : Neon
Data type : F32
Data layout : NHWC
Tuner enabled? : false
Cache enabled? : false
Tuner mode : Normal
Tuner file :
MLGO file :
Fast math enabled? : false
Image file :
Labels file : transfer_wait
Partition point is : 2
Second partition point is : 4
Order is : B-L-G
Total number of cores is : 8
Run network for 50 times.
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : NodeFusionMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : GroupedConvolutionMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : InPlaceOperationMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : DepthConcatSubTensorMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : SplitLayerSubTensorMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : NodeExecutionMethodMutator
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated conv1/convolution+conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 3x224x224 Weights shape: 3x7x7x64 Output shape: 64x112x112 RELU
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated pool1/MaxPool Type: PoolingLayer Target: Neon Data Type: F32 Input shape: 64x112x112 Output shape: 64x56x56 Pooling info: MAX
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit1/bottleneck_v1/conv1/convolution+block1/unit1/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit1/bottleneck_v1/conv2/convolution+block1/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x3x3x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit1/bottleneck_v1/conv3/convolution+block1/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x256 Output shape: 256x56x56
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit1/bottleneck_v1/shortcut/convolution+block1/unit1/bottleneck_v1/shortcut/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x256 Output shape: 256x56x56
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit1/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 256x56x56
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit1/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 256x56x56 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : NodeFusionMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : GroupedConvolutionMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : InPlaceOperationMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : DepthConcatSubTensorMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : SplitLayerSubTensorMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : NodeExecutionMethodMutator
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit2/bottleneck_v1/conv1/convolution+block1/unit2/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x56x56 Weights shape: 256x1x1x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit2/bottleneck_v1/conv2/convolution+block1/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x3x3x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit2/bottleneck_v1/conv3/convolution+block1/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x1x1x256 Output shape: 256x56x56
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit2/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 256x56x56
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit2/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 256x56x56 Activation function: RELU a: 0 b: 0 InPlace : 1
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit3/bottleneck_v1/shortcut/MaxPool Type: PoolingLayer Target: Neon Data Type: F32 Input shape: 256x56x56 Output shape: 256x28x28 Pooling info: MAX
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit3/bottleneck_v1/conv1/convolution+block1/unit3/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 256x56x56 Weights shape: 256x1x1x64 Output shape: 64x56x56 RELU
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit3/bottleneck_v1/conv2/convolution+block1/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x56x56 Weights shape: 64x3x3x64 Output shape: 64x28x28 RELU
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit3/bottleneck_v1/conv3/convolution+block1/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: Neon Data Type: F32 Input shape: 64x28x28 Weights shape: 64x1x1x256 Output shape: 256x28x28
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit3/bottleneck_v1/add Type: EltwiseLayer Target: Neon Operation: ArithmeticAddition Data Type: F32 Shape: 256x28x28
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block1/unit3/bottleneck_v1/Relu Type: ActivationLayer Target: Neon Data Type: F32 Shape: 256x28x28 Activation function: RELU a: 0 b: 0 InPlace : 1
[CORE][21-08-2022 10:54:09][INFO] "\"Cannot open DotMLGO file . Use default heuristics instea
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : NodeFusionMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : GroupedConvolutionMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : InPlaceOperationMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : DepthConcatSubTensorMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : SplitLayerSubTensorMutator
[GRAPH][21-08-2022 10:54:09][INFO] Running mutating pass : NodeExecutionMethodMutator
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Reshaped_Only_RHS
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Reshaped_Only_RHS
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block2/unit1/bottleneck_v1/shortcut/convolution+block2/unit1/bottleneck_v1/shortcut/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x28x28 Weights shape: 256x1x1x512 Output shape: 512x28x28
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=256,B=1)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:09][INFO] Instantiated block2/unit1/bottleneck_v1/conv1/convolution+block2/unit1/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x28x28 Weights shape: 256x1x1x128 Output shape: 128x28x28 RELU
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:09][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:09][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:09][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:09][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:10][INFO] Instantiated block2/unit1/bottleneck_v1/conv2/convolution+block2/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x28x28 RELU
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:10][INFO] Instantiated block2/unit1/bottleneck_v1/conv3/convolution+block2/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x28x28 Weights shape: 128x1x1x512 Output shape: 512x28x28
[GRAPH][21-08-2022 10:54:10][INFO] Instantiated block2/unit1/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 512x28x28
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:10][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:10][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:10][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:10][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit2/bottleneck_v1/conv1/convolution+block2/unit2/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x28x28 Weights shape: 512x1x1x128 Output shape: 128x28x28 RELU
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit2/bottleneck_v1/conv2/convolution+block2/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x28x28 RELU
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit2/bottleneck_v1/conv3/convolution+block2/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x28x28 Weights shape: 128x1x1x512 Output shape: 512x28x28
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit2/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 512x28x28
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit3/bottleneck_v1/conv1/convolution+block2/unit3/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x28x28 Weights shape: 512x1x1x128 Output shape: 128x28x28 RELU
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=128,K=128,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit3/bottleneck_v1/conv2/convolution+block2/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x28x28 RELU
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit3/bottleneck_v1/conv3/convolution+block2/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x28x28 Weights shape: 128x1x1x512 Output shape: 512x28x28
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit3/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 512x28x28
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit4/bottleneck_v1/shortcut/MaxPool Type: PoolingLayer Target: CL Data Type: F32 Input shape: 512x28x28 Output shape: 512x14x14 Pooling info: MAX
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=784,N=128,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit4/bottleneck_v1/conv1/convolution+block2/unit4/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x28x28 Weights shape: 512x1x1x128 Output shape: 128x28x28 RELU
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=128,K=1152,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=128,K=1152,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=128,K=1152,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=128,K=1152,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=128,K=1152,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit4/bottleneck_v1/conv2/convolution+block2/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x28x28 Weights shape: 128x3x3x128 Output shape: 128x14x14 RELU
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=512,K=128,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit4/bottleneck_v1/conv3/convolution+block2/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 128x14x14 Weights shape: 128x1x1x512 Output shape: 512x14x14
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block2/unit4/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 512x14x14
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=512,B=1)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:11][INFO] Instantiated block3/unit1/bottleneck_v1/conv1/convolution+block3/unit1/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x14x14 Weights shape: 512x1x1x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:11][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=16,N=256,K=256,B=36)
[CORE][21-08-2022 10:54:11][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:11][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:11][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit1/bottleneck_v1/conv2/convolution+block3/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=1024,K=256,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit1/bottleneck_v1/conv3/convolution+block3/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=1024,K=512,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm config reshaped. Query(IP=g72,DataType=F32,M=196,N=1024,K=512,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use reshaped config from default heuristics: LHS info: ( m0= 5 k0= 4 v0= 2 trans= 0 inter= 0}) ; RHS info: ( n0= 4 k0= 4 h0= 16 trans= 1 inter= 1 exp_img=0})
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit1/bottleneck_v1/shortcut/convolution+block3/unit1/bottleneck_v1/shortcut/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x14x14 Weights shape: 512x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit1/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=1024,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit2/bottleneck_v1/conv1/convolution+block3/unit2/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=16,N=256,K=256,B=36)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit2/bottleneck_v1/conv2/convolution+block3/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=1024,K=256,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit2/bottleneck_v1/conv3/convolution+block3/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit2/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=1024,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit3/bottleneck_v1/conv1/convolution+block3/unit3/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=16,N=256,K=256,B=36)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit3/bottleneck_v1/conv2/convolution+block3/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=1024,K=256,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit3/bottleneck_v1/conv3/convolution+block3/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit3/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=1024,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit4/bottleneck_v1/conv1/convolution+block3/unit4/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=16,N=256,K=256,B=36)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit4/bottleneck_v1/conv2/convolution+block3/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=1024,K=256,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit4/bottleneck_v1/conv3/convolution+block3/unit4/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit4/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=1024,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit5/bottleneck_v1/conv1/convolution+block3/unit5/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=16,N=256,K=256,B=36)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit5/bottleneck_v1/conv2/convolution+block3/unit5/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=1024,K=256,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit5/bottleneck_v1/conv3/convolution+block3/unit5/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x1x1x1024 Output shape: 1024x14x14
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit5/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 1024x14x14
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit6/bottleneck_v1/shortcut/MaxPool Type: PoolingLayer Target: CL Data Type: F32 Input shape: 1024x14x14 Output shape: 1024x7x7 Pooling info: MAX
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=196,N=256,K=1024,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:12][INFO] Instantiated block3/unit6/bottleneck_v1/conv1/convolution+block3/unit6/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 1024x14x14 Weights shape: 1024x1x1x256 Output shape: 256x14x14 RELU
[CORE][21-08-2022 10:54:12][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=256,K=2304,B=1)
[CORE][21-08-2022 10:54:12][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:12][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:12][INFO] "\"Use gemm kernel from default heuristics: Reshaped_V1
[GRAPH][21-08-2022 10:54:13][INFO] Instantiated block3/unit6/bottleneck_v1/conv2/convolution+block3/unit6/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x14x14 Weights shape: 256x3x3x256 Output shape: 256x7x7 RELU
[CORE][21-08-2022 10:54:13][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=1024,K=256,B=1)
[CORE][21-08-2022 10:54:13][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:13][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:13][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:13][INFO] Instantiated block3/unit6/bottleneck_v1/conv3/convolution+block3/unit6/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 256x7x7 Weights shape: 256x1x1x1024 Output shape: 1024x7x7
[GRAPH][21-08-2022 10:54:13][INFO] Instantiated block3/unit6/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 1024x7x7
[CORE][21-08-2022 10:54:13][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=512,K=1024,B=1)
[CORE][21-08-2022 10:54:13][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:13][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:13][INFO] "\"Use gemm kernel from default heuristics: Reshaped_V1
[GRAPH][21-08-2022 10:54:13][INFO] Instantiated block4/unit1/bottleneck_v1/conv1/convolution+block4/unit1/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 1024x7x7 Weights shape: 1024x1x1x512 Output shape: 512x7x7 RELU
[CORE][21-08-2022 10:54:13][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=4,N=512,K=512,B=36)
[CORE][21-08-2022 10:54:13][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:13][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:13][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit1/bottleneck_v1/conv2/convolution+block4/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x7x7 Weights shape: 512x3x3x512 Output shape: 512x7x7 RELU
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm config reshaped. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use reshaped config from default heuristics: LHS info: ( m0= 5 k0= 4 v0= 2 trans= 0 inter= 0}) ; RHS info: ( n0= 4 k0= 4 h0= 16 trans= 1 inter= 1 exp_img=0})
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit1/bottleneck_v1/conv3/convolution+block4/unit1/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x7x7 Weights shape: 512x1x1x2048 Output shape: 2048x7x7
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=1024,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm config reshaped. Query(IP=g72,DataType=F32,M=49,N=2048,K=1024,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use reshaped config from default heuristics: LHS info: ( m0= 5 k0= 4 v0= 2 trans= 0 inter= 0}) ; RHS info: ( n0= 4 k0= 4 h0= 16 trans= 1 inter= 1 exp_img=0})
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit1/bottleneck_v1/shortcut/convolution+block4/unit1/bottleneck_v1/shortcut/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 1024x7x7 Weights shape: 1024x1x1x2048 Output shape: 2048x7x7
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit1/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 2048x7x7
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=512,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped_V1
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit2/bottleneck_v1/conv1/convolution+block4/unit2/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 2048x7x7 Weights shape: 2048x1x1x512 Output shape: 512x7x7 RELU
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=4,N=512,K=512,B=36)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit2/bottleneck_v1/conv2/convolution+block4/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x7x7 Weights shape: 512x3x3x512 Output shape: 512x7x7 RELU
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm config reshaped. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use reshaped config from default heuristics: LHS info: ( m0= 5 k0= 4 v0= 2 trans= 0 inter= 0}) ; RHS info: ( n0= 4 k0= 4 h0= 16 trans= 1 inter= 1 exp_img=0})
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit2/bottleneck_v1/conv3/convolution+block4/unit2/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x7x7 Weights shape: 512x1x1x2048 Output shape: 2048x7x7
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit2/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 2048x7x7
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=512,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped_V1
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit3/bottleneck_v1/conv1/convolution+block4/unit3/bottleneck_v1/conv1/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 2048x7x7 Weights shape: 2048x1x1x512 Output shape: 512x7x7 RELU
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=4,N=512,K=512,B=36)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=4,N=512,K=512,B=36)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=4,N=512,K=512,B=36)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=4,N=512,K=512,B=36)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=4,N=512,K=512,B=36)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Native_V1
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit3/bottleneck_v1/conv2/convolution+block4/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x7x7 Weights shape: 512x3x3x512 Output shape: 512x7x7 RELU
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm config reshaped. Query(IP=g72,DataType=F32,M=49,N=2048,K=512,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use reshaped config from default heuristics: LHS info: ( m0= 5 k0= 4 v0= 2 trans= 0 inter= 0}) ; RHS info: ( n0= 4 k0= 4 h0= 16 trans= 1 inter= 1 exp_img=0})
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit3/bottleneck_v1/conv3/convolution+block4/unit3/bottleneck_v1/conv2/BatchNorm Type: FusedConvolutionBatchNormalizationLayer Target: CL Data Type: F32 Input shape: 512x7x7 Weights shape: 512x1x1x2048 Output shape: 2048x7x7
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated block4/unit3/bottleneck_v1/add Type: EltwiseLayer Target: CL Operation: ArithmeticAddition Data Type: F32 Shape: 2048x7x7
[GRAPH][21-08-2022 10:54:14][INFO] Instantiated pool5 Type: PoolingLayer Target: CL Data Type: F32 Input shape: 2048x7x7 Output shape: 2048 Pooling info: AVG
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped_Only_RHS
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped_Only_RHS
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped_Only_RHS
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped_Only_RHS
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm type. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use gemm kernel from default heuristics: Reshaped_Only_RHS
[CORE][21-08-2022 10:54:14][INFO] "\"MLGOHeuristics querying gemm config reshaped only rhs. Query(IP=g72,DataType=F32,M=1,N=1000,K=2048,B=1)
[CORE][21-08-2022 10:54:14][INFO] Invalid DotMLGO. Use default heuristics instead
[CORE][21-08-2022 10:54:14][INFO] MLGOHeuristics query failed
[CORE][21-08-2022 10:54:14][INFO] "\"Use reshaped_only_rhs config from default heuristics: LHS info: ( m0= 1 k0= 16 v0= 1 trans= 0 inter= 0}) ; RHS info: ( n0= 2 k0= 16 h0= 4 trans= 1 inter= 1 exp_img=0})
[GRAPH][21-08-2022 10:54:15][INFO] Instantiated logits/convolution Type: GenericConvolutionLayer Target: CL Data Type: F32 Groups: 1 Input shape: 2048 Weights shape: 2048x1x1x1000 Output shape: 1000
[GRAPH][21-08-2022 10:54:15][INFO] Instantiated predictions/Reshape Type: FlattenLayer Target: CL Data Type: F32 Input shape: 1000 Output shape: 1000
[GRAPH][21-08-2022 10:54:15][INFO] Instantiated predictions/Softmax Type: SoftmaxLayer Target: CL Data Type: F32 Input shape: 1000 Output shape: 1000
First partition point:2
Second partition point:4
Total parts:18
Running Inference ...
stage1_input_time: 0.0030625 ms
stage1_inference_time: 102.046 ms
stage1_total_time: 102.049 ms
stage2_input_time: 25.5049 ms
stage2_inference_time: 75.6481 ms
stage2_total_time: 101.153 ms
stage3_input_time: 6.81891 ms
stage3_inference_time: 84.1443 ms
stage3_total_time: 90.9632 ms
************************************************
Frame rate is: 9.79919 FPS
Frame latency is: 294.166 ms
************************************************
Test passed
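As an aside, the summary numbers above appear to follow a simple pipeline model: the frame rate is bounded by the slowest stage (1000 / 102.049 ms ≈ 9.8 FPS) and the frame latency is the sum of the three stage totals (≈ 294.17 ms). The sketch below reproduces that calculation under this assumption; the values are copied from the run output and the code is illustrative only, not part of the example binary.

```cpp
// Hypothetical illustration (not code from ARMCL-pipe-all): reproduces the summary
// numbers above, assuming pipeline throughput is bounded by the slowest stage and
// frame latency is the sum of the per-stage totals.
#include <algorithm>
#include <iostream>
#include <vector>

int main()
{
    // stage*_total_time values from the run above, in milliseconds
    std::vector<double> stage_total_ms = {102.049, 101.153, 90.9632};

    const double slowest = *std::max_element(stage_total_ms.begin(), stage_total_ms.end());
    const double latency = stage_total_ms[0] + stage_total_ms[1] + stage_total_ms[2];
    const double fps     = 1000.0 / slowest; // one frame leaves the pipeline per slowest-stage period

    std::cout << "Frame rate is: " << fps << " FPS\n";       // ~9.799 FPS
    std::cout << "Frame latency is: " << latency << " ms\n"; // ~294.165 ms
    return 0;
}
```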
I see 'debug': '1' in your build options. Please try with 'debug': '0' and let me know if it works.
It did not work. I tried the two sample setups that yielded a seg fault with 'debug': '1', and they still yield a seg fault. I also tested the setup that worked, and it still works. So I suspect it has to do with the particular configuration.
Hi Andreas, sorry for the late reply. It is weird that you see this error, because none of the configurations causes a problem on my platforms. Are other CNNs working well, or is it just ResNet50? One way to debug on your platform would be to give me ssh access, if that is possible. This is my email address: aghapour.ehsan17@gmail.com
Ehsan, I'm really sorry for not replying earlier. I am working on your repository again :)
I also bought an Odroid platform, which comes with 4 big cores and 2 little ones. On the Odroid I observed that the above configuration (resnet50 --partition_point=1 --partition_point2=3 --order=B-L-G) works. This means that either something is wrong with my Hikey970, or something is wrong with some hard-coded parts of this repository, such as src/runtime/Scheduler.cpp, where the number of little cores is hard-coded. In any case, I can confidently close this issue. Thank you for your support :)
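For context, the little-core count is the kind of value that can be discovered at run time rather than hard-coded. The following is a minimal, hypothetical sketch (it is not the repository's actual src/runtime/Scheduler.cpp logic) that counts little cores by grouping CPUs by the maximum frequency reported through cpufreq sysfs; on a big.LITTLE platform the cores that report the lowest maximum frequency form the little cluster.

```cpp
// Hypothetical sketch (not the actual src/runtime/Scheduler.cpp): instead of
// hard-coding the number of little cores, count them at run time by grouping
// CPUs by their maximum frequency as reported by the cpufreq sysfs interface.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>
#include <unistd.h>

static long read_max_freq(int cpu)
{
    std::ifstream f("/sys/devices/system/cpu/cpu" + std::to_string(cpu) +
                    "/cpufreq/cpuinfo_max_freq");
    long khz = 0;
    f >> khz; // stays 0 if the file is missing (e.g. no cpufreq driver)
    return khz;
}

int count_little_cores()
{
    const long ncpu = sysconf(_SC_NPROCESSORS_CONF);
    std::vector<long> freqs;
    for (long cpu = 0; cpu < ncpu; ++cpu)
        freqs.push_back(read_max_freq(static_cast<int>(cpu)));

    const long min_freq = *std::min_element(freqs.begin(), freqs.end());
    const long max_freq = *std::max_element(freqs.begin(), freqs.end());
    if (min_freq == max_freq)
        return 0; // homogeneous CPU: no separate little cluster

    // Cores with the lowest max frequency are treated as the little cluster.
    return static_cast<int>(std::count(freqs.begin(), freqs.end(), min_freq));
}

int main()
{
    std::cout << "little cores: " << count_little_cores() << "\n"; // would print 4 on a Hikey970, 2 on the Odroid described above
    return 0;
}
```

This is only one possible heuristic; a scheduler could equally take the core split from command-line options, as the example binaries in this repository already do.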
Hi Andreas,
Sorry for missing your email. Please let me know if you still have some questions.
Regarding your previous question, I would say that the total number of cores, as well as the numbers of big and small cores, needs to be set at run time.
Best, Ehsan
Output of 'strings libarm_compute.so | grep arm_compute_version':
Platform:
Hikey970
Operating System:
Debian 9, Linux kernel 4.9.78-147538-g244928755bbe
Problem description:
Below is a list of commands that yield a seg fault:
Also, these throw a Runtime Error:
I have not done an exhaustive search to find all mappings that cause seg faults, but these are some that definitely yield problems.