Open opti-mix opened 5 years ago
ConvRelu always inserts a Relu following the Conv, since it is an fp32 op (and if it doesn't, that should be fixed). For Int8ConvRelu and Int8SumRelu, the quantization parameters of the op are already chosen such that there is no need to add a ReLU at all.
When would it make sense to insert a ReLU for Int8ConvRelu and Int8SumRelu?
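To illustrate why the quantized ops may not need an explicit ReLU: if the output quantization parameters are chosen so that the representable range starts at 0.0, clamping to the int8 range already discards negative values. A minimal sketch (the parameter values below are made up for illustration, not taken from any real model):

```python
def quantize(x, scale, offset):
    """Quantize a float to int8, clamping to the representable range."""
    q = int(round(x / scale)) + offset
    return max(-128, min(127, q))

def dequantize(q, scale, offset):
    return scale * (q - offset)

# Output parameters chosen so the representable range is [0.0, 25.5]:
# offset = -128 maps the smallest int8 value to 0.0.
scale, offset = 0.1, -128

# Negative conv outputs clamp to 0.0, exactly what a ReLU would do:
print(dequantize(quantize(-3.0, scale, offset), scale, offset))  # 0.0
# Positive values pass through (up to quantization error):
print(dequantize(quantize(1.0, scale, offset), scale, offset))   # 1.0
```

With such parameters the clamp built into quantization subsumes the ReLU, which is the basis of rdzhabarov's point above.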
@rdzhabarov My point is that it should not be the task of the loader to make any smart decisions. If the operation says "there is a ReLU at the end", the loader should honor it and add the ReLU. Later optimizations can remove such ReLUs if necessary. This makes it easier to reason about what is going on and to compare the loaded graph with the actual graph in the C2 model.
C2 has operations like ConvRelu, Int8ConvRelu, Int8SumRelu, etc. When Glow loads them, we should apply a ReLU on the result of the actual operation. If such a ReLU can be eliminated, it will be eliminated by means of the optimization described in #2747
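The proposed loader behavior can be sketched as follows. This is a toy model, not Glow's actual C++ API: `Graph`, `Node`, and `load_operator` are hypothetical stand-ins used only to show the "always add, optimize later" shape.

```python
class Node:
    """Toy IR node: a kind string plus input nodes (illustrative only)."""
    def __init__(self, kind, inputs=()):
        self.kind, self.inputs = kind, list(inputs)

class Graph:
    """Toy graph that records created nodes in order."""
    def __init__(self):
        self.nodes = []
    def create(self, kind, *inputs):
        node = Node(kind, inputs)
        self.nodes.append(node)
        return node

def load_operator(graph, op_type, op_input):
    """Load one C2 operator, always honoring a fused ReLU explicitly."""
    if op_type in ("ConvRelu", "Int8ConvRelu"):
        conv = graph.create("Conv", op_input)
        # The loader adds the ReLU unconditionally; a later optimization
        # pass (the one described in #2747) removes it when it is a no-op.
        return graph.create("Relu", conv)
    return graph.create(op_type, op_input)

g = Graph()
load_operator(g, "Int8ConvRelu", Node("Input"))
print([n.kind for n in g.nodes])  # ['Conv', 'Relu']
```

The design trade-off is exactly the one argued above: the loader stays a dumb, faithful translation, and all cleverness lives in optimization passes where it can be tested and reasoned about in isolation.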