mathmanu / caffe-jacinto

This repository has moved. The new link can be obtained from https://github.com/TexasInstruments/jacinto-ai-devkit

Layerwise quantization #15

Open Wronskia opened 6 years ago

Wronskia commented 6 years ago

Hello manu,

Is it correct to use `setattr(layer.quantization_param.precision, 8)`, given the generated caffe_pb2, for setting layerwise quantization?

Also, is it possible to sparsify networks layer by layer?

Thanks a lot, Best

mathmanu commented 6 years ago

Are you trying to modify quantization parameters via the python/pycaffe interface? I have not tried it, so I don't know whether it works. What is the issue that you are facing?
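As a side note, `setattr` takes three arguments (object, attribute name, value), so the two-argument call quoted in the question would fail regardless of the proto schema. A minimal sketch with plain Python classes standing in for the generated protobuf messages (the `quantization_param`/`precision` names are taken from this thread and are not verified against the fork's caffe.proto):

```python
# Stand-in classes mimicking the nested structure of generated messages.
class QuantizationParam:
    def __init__(self):
        self.precision = 32

class Layer:
    def __init__(self):
        self.quantization_param = QuantizationParam()

layer = Layer()

# setattr needs three arguments: the object, the attribute name, the value.
setattr(layer.quantization_param, "precision", 8)

# Equivalent direct assignment:
layer.quantization_param.precision = 8

print(layer.quantization_param.precision)  # 8
```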

Currently, sparsity is applied in the function `void Net::FindAndApplyChannelThresholdNet()` in `net.cpp`.

It may be possible to specify a layer index to this function so that it can sparsify only a selected layer.

However, the sparsity target that is specified applies to the entire network. Also, in `void Solver::ThresholdNet()` there is a check to see whether the specified sparsity target has been achieved; that check is likewise based on the sparsity of the entire network, so it would need to change as well.
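To illustrate why a per-layer target differs from the network-wide one, here is a small stand-alone sketch (illustrative Python, not the repository's C++ code): a small layer can be highly sparse while the network-wide sparsity stays low, because the whole-network figure is weighted by layer size.

```python
def layer_sparsity(weights):
    """Fraction of zero-valued weights in a single layer."""
    return weights.count(0.0) / len(weights)

def net_sparsity(layers):
    """Whole-network sparsity: zeros over all weights combined."""
    zeros = sum(w.count(0.0) for w in layers)
    total = sum(len(w) for w in layers)
    return zeros / total

# A small 50%-sparse layer next to a large fully dense one:
small = [0.0, 1.0, 0.0, 2.0]          # 50% zeros
large = [1.0] * 96                    # 0% zeros
print(layer_sparsity(small))          # 0.5
print(net_sparsity([small, large]))   # 2 zeros / 100 weights = 0.02
```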

Wronskia commented 6 years ago

Hey manu,

Thanks for your answer. The error I get using pycaffe is the following: `TypeError: unhashable type: 'LayerParameter'`

Here is my code: `layer.quantization_param.qparam_w.bitwidth = 8`

It is actually a TypeError, which is odd, because in caffe.proto the `bitwidth` field is declared as an integer.
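For context, this particular TypeError usually means a protobuf message object is being used as a dictionary key somewhere (for example, indexing a dict-like structure with a layer object instead of its name string); it is not about the type of the value being assigned. A stand-in sketch that reproduces the same message without caffe:

```python
# Protobuf messages are unhashable; mimic that with __hash__ = None.
class LayerParameter:
    __hash__ = None

params = {}
lp = LayerParameter()

try:
    params[lp] = 8           # using the message itself as a key
except TypeError as e:
    print(e)                 # unhashable type: 'LayerParameter'

params["conv1"] = 8          # keying by layer name works fine
```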

thanks

mathmanu commented 6 years ago

As far as I understand, pycaffe doesn't let you change the layer parameters directly, but maybe you can work around this restriction by writing your own functions to get and set them. Let me know if you succeed. https://stackoverflow.com/questions/40858548/dynamically-modify-layers-parameters-in-caffe
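One possible shape for such getter/setter helpers, sketched with plain attribute access over a dotted path (the nested field names below are simply the ones mentioned in this thread; whether assignment actually sticks on the real generated caffe_pb2 messages would need to be verified):

```python
from functools import reduce

def get_param(obj, path):
    """Read a nested attribute by dotted path,
    e.g. 'quantization_param.qparam_w.bitwidth'."""
    return reduce(getattr, path.split("."), obj)

def set_param(obj, path, value):
    """Assign to a nested attribute by dotted path."""
    parent, _, attr = path.rpartition(".")
    target = reduce(getattr, parent.split("."), obj) if parent else obj
    setattr(target, attr, value)

# Stand-in objects in place of the generated caffe_pb2 messages:
class _NS:
    pass

layer = _NS()
layer.quantization_param = _NS()
layer.quantization_param.qparam_w = _NS()
layer.quantization_param.qparam_w.bitwidth = 32

set_param(layer, "quantization_param.qparam_w.bitwidth", 8)
print(get_param(layer, "quantization_param.qparam_w.bitwidth"))  # 8
```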

mathmanu commented 6 years ago

You can also set that field in the prototxt file. But this method doesn't allow you to change it afterwards at runtime.
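For illustration, such a per-layer setting in the prototxt might look like the fragment below; the message and field names (`quantization_param`, `qparam_w`, `bitwidth`) are taken from this thread, so check the fork's caffe.proto for the exact schema before using it.

```prototxt
layer {
  name: "conv1"
  type: "Convolution"
  bottom: "data"
  top: "conv1"
  # Assumed per-layer quantization setting; verify against caffe.proto.
  quantization_param {
    qparam_w { bitwidth: 8 }
  }
}
```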