**Open** · TomHeaven opened this issue 5 years ago
My colleague wrote a fast version of the CRF and ICRF process: https://github.com/GuoShi28/CBDNet/tree/master/Others. You can refer to that code.
The CRF and ICRF curves used in CRF_Map and ICRF_Map are collected from others' work. You can refer to the CBDNet paper for more details.
@GuoShi28 Thanks for your reply. I think the problem is that the code contains a lot of redundant computation. For ICRF in particular, there are only 256 possible input values (8-bit), so building a 201 x 256 lookup table will do the trick.
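A minimal sketch of that idea, assuming the .mat file holds 201 densely sampled ICRF curves (here faked with a gamma family, since I don't have the real data; `build_icrf_lut` and `icrf_map_fast` are hypothetical names):

```python
import numpy as np

# Stand-in for the 201 ICRF curves loaded from the .mat file:
# each row is one curve sampled on a dense grid in [0, 1].
num_curves, num_samples = 201, 1024
x = np.linspace(0.0, 1.0, num_samples)
gammas = np.linspace(1.5, 3.0, num_curves)
icrf_curves = x[None, :] ** gammas[:, None]  # shape (201, 1024)

def build_icrf_lut(curves, bits=8):
    """Resample each curve at the 2**bits possible 8-bit input levels."""
    levels = np.linspace(0.0, 1.0, 2 ** bits)       # the 256 input values
    grid = np.linspace(0.0, 1.0, curves.shape[1])   # original sample grid
    return np.stack([np.interp(levels, grid, c) for c in curves])  # (201, 256)

lut = build_icrf_lut(icrf_curves)

def icrf_map_fast(img_uint8, curve_idx):
    """Apply one ICRF curve to an 8-bit image via a single indexing pass."""
    return lut[curve_idx][img_uint8]

img = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
out = icrf_map_fast(img, curve_idx=100)
```

The per-pixel work collapses to one fancy-indexing operation, so the cost no longer depends on how the curves are evaluated.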
For CRF, building a lookup table with more bits (e.g. 16-bit, with 2^16 entries) will dramatically increase the speed; however, it introduces quantization error. I'm still testing the performance difference caused by the change.
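To illustrate the trade-off, here is a sketch of a 16-bit CRF lookup with a quick measurement of the quantization error. A simple gamma curve stands in for the real CRF data, and `crf_map_lut` is a hypothetical name:

```python
import numpy as np

bits = 16
n = 2 ** bits
grid = np.linspace(0.0, 1.0, n)
crf_lut = grid ** (1.0 / 2.2)  # stand-in CRF curve sampled at 2**16 points

def crf_map_lut(img_float):
    """Quantize float input to 16 bits, then look up the CRF value."""
    idx = np.clip(np.round(img_float * (n - 1)).astype(np.int64), 0, n - 1)
    return crf_lut[idx]

img = np.random.rand(512, 512)
exact = img ** (1.0 / 2.2)      # direct evaluation
approx = crf_map_lut(img)       # table lookup
max_err = np.abs(exact - approx).max()  # worst-case quantization error
```

The error is largest near zero, where a gamma-style curve is steepest, so the acceptable bit depth depends on how much shadow precision the pipeline needs.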
Thank you. The small-range lookup table for ICRF is a great suggestion. I simply followed the CRF setting, which, as you said, adds unnecessary computation.
The functions CRF_Map and ICRF_Map are very slow in Python (each takes about 8 seconds on a 512x512 image) and are the main efficiency bottleneck.
I wonder where you got the two .mat files used in CRF_Map and ICRF_Map. Were they generated by a formula, or collected from measured data?
In my experience, a single lookup table is enough to do a gamma mapping, which should be ultra fast. So can we combine the two lookup tables in the .mat files to make the algorithm faster?
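A sketch of the combining idea, assuming both curves are fixed 1-D mappings so their composition can be precomputed once. Gamma curves stand in for the real CBDNet data (with these stand-ins the composition is the identity, which would not be true of the real curves):

```python
import numpy as np

levels = np.linspace(0.0, 1.0, 256)
crf = levels ** (1.0 / 2.2)       # stand-in CRF, sampled at all 8-bit inputs

def icrf(v):
    """Stand-in inverse response curve."""
    return v ** 2.2

# Fuse the two mappings: one table that takes an 8-bit input
# straight to the final value, so only one lookup runs per pixel.
combined_lut = icrf(crf)          # shape (256,)

img = np.random.randint(0, 256, (512, 512), dtype=np.uint8)
out = combined_lut[img]           # single indexing pass over the image
```

As long as CRF_Map and ICRF_Map each apply a fixed per-pixel curve, the fused table gives the same result as running them back to back, at the cost of one lookup instead of two.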