C++ implementation of the Local Binary Pattern texture descriptors. This class integrates with OpenCV and FFTW3 to provide a complete and fast implementation of the popular descriptors: LBP u2, ri, riu2 & hf. The routines for computing these descriptors are inspired by the original authors' Matlab code.
Under MSVC, enabling the additional SDL checks raises a unary-minus error (C4146).
This error is raised because applying - to an (unsigned int) yields an (unsigned int) rather than an (int). This is expected behaviour according to the MSVC docs.
lbp.hpp problematic line
I would like to confirm why the
-v & v
operation is done there. It could simply be resolved by casting to int:
-(int)v & v
if that doesn't alter the code's intended behaviour.