tms2003 opened this issue 5 years ago
Hi tms2003,
The ConvolutionDepthwise layer is a custom layer that I wrote myself, so you won't find it in stock Caffe. There are two ways to handle this.
1. Change your prototxt file: replace every ConvolutionDepthwise layer with a standard grouped Convolution layer.
For example, change:

```
layer {
  bottom: "layer1-conv"
  top: "layer2-dwconv"
  name: "layer2-dwconv"
  type: "ConvolutionDepthwise"
  convolution_param {
    num_output: 32
    kernel_size: 3
    pad: 1
    stride: 1
    bias_term: false
  }
}
```

into:

```
layer {
  bottom: "layer1-conv"
  top: "layer2-dwconv"
  name: "layer2-dwconv"
  type: "Convolution"
  convolution_param {
    num_output: 32
    kernel_size: 3
    pad: 1
    stride: 1
    bias_term: false
    group: 32
    engine: CAFFE
    weight_filler { type: "msra" }
  }
}
```
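If there are many such layers, the rewrite above can be automated. Here is a minimal, hypothetical sketch (the function name `depthwise_to_grouped` is mine, not part of this repo): it assumes the flat layer style shown above and uses a regex, so for anything complicated a real protobuf text-format parser would be safer.

```python
import re


def depthwise_to_grouped(prototxt: str) -> str:
    """Rewrite each ConvolutionDepthwise layer as a standard Convolution
    with group == num_output (the depthwise case). Sketch only: assumes
    one convolution_param block per layer, as in the example above."""

    def fix(match: "re.Match") -> str:
        body = match.group(0).replace('"ConvolutionDepthwise"', '"Convolution"')
        num = re.search(r"num_output:\s*(\d+)", body)
        if num:
            n = num.group(1)
            # Insert group/engine right after num_output so they land
            # inside convolution_param.
            body = body.replace(
                "num_output: " + n,
                "num_output: %s group: %s engine: CAFFE" % (n, n),
                1,
            )
        return body

    # layer { ... "ConvolutionDepthwise" ... convolution_param { ... } ... }
    pattern = r'layer\s*\{[^}]*"ConvolutionDepthwise"[^}]*\{[^}]*\}[^}]*\}'
    return re.sub(pattern, fix, prototxt)
```

Run it over the prototxt text and write the result back out; then the converted model loads with an unmodified Caffe build.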
2. Alternatively, add this layer to Caffe itself. Baidu [link](https://pan.baidu.com/s/1MgDqQze617zyOiPx_QCpOg) (extraction code: 9d3f). With this method, network inference is faster.
Good luck.
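To see why the rewrite in method 1 is valid: a grouped convolution with group == num_output == input channels applies exactly one filter per input channel, which is the definition of a depthwise convolution. A minimal NumPy sketch with naive loops (illustration only, not the Caffe implementation) makes the equivalence concrete:

```python
import numpy as np


def conv2d_single(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Naive valid cross-correlation of one 2-D channel with one kernel."""
    kh, kw = w.shape
    oh, ow = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * w)
    return out


def depthwise(x: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Depthwise conv: channel c of the input meets kernel c, nothing else."""
    return np.stack([conv2d_single(x[c], w[c]) for c in range(x.shape[0])])


def grouped(x: np.ndarray, w: np.ndarray, groups: int) -> np.ndarray:
    """Grouped conv, simplified to one output channel per group and weights
    of shape (C, kh, kw). With groups == C each group sees exactly one
    input channel, which collapses to the depthwise case above."""
    cpg = x.shape[0] // groups  # channels per group
    outs = []
    for g in range(groups):
        acc = sum(conv2d_single(x[g * cpg + c], w[g * cpg + c])
                  for c in range(cpg))
        outs.append(acc)
    return np.stack(outs)
```

With `groups` equal to the channel count the two functions produce identical outputs, which is why setting `group: 32` on a 32-output Convolution reproduces the ConvolutionDepthwise layer.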
I tried it like this:

```shell
python yolov3_darknet2caffe.py mobilenet_v1_yolov3.cfg mobilenet_v1_yolov3_final.weights mobilenet_yolov3.prototxt mobilenet_yolov3.caffemodel
```
and got this error:

```
F0627 23:02:02.249997 26192 layer_factory.hpp:81] Check failed: registry.count(type) == 1 (0 vs. 1) Unknown layer type: ConvolutionDepthwise (known types: AbsVal, Accuracy, ArgMax, BNLL, BatchNorm, BatchReindex, Bias, Clip, Concat, ContrastiveLoss, Convolution, Crop, Data, Deconvolution, Dropout, DummyData, ELU, Eltwise, Embed, EuclideanLoss, Exp, Filter, Flatten, HDF5Data, HDF5Output, HingeLoss, Im2col, ImageData, InfogainLoss, InnerProduct, Input, LRN, LSTM, LSTMUnit, Log, MVN, MemoryData, MultinomialLogisticLoss, PReLU, Parameter, Pooling, Power, RNN, ReLU, Reduction, Reshape, SPP, Scale, Sigmoid, SigmoidCrossEntropyLoss, Silence, Slice, Softmax, SoftmaxWithLoss, Split, Swish, TanH, Threshold, Tile, Upsample, WindowData)
*** Check failure stack trace: ***
```
Could you help me?