peterpaniff opened 5 years ago
First I used a prototxt and caffemodel for inference and got the right result, but the compute cost was too high on my cellphone. When I switched to a protobin and caffemodel for inference, the result was wrong: the confidence is always 1. Any ideas?
@peterpaniff The latest version does not support prototxt, so how were you able to use it for inference?