Open LexiYIN opened 5 years ago
Hello, is this your full code?
I also get a negative KL divergence with every pair of image and unary potentials I test. That isn't normal (KL divergence should be non-negative), but I don't know how to fix the presumably erroneous code. Could the `MatrixXf` variable be going negative?
I am a beginner, and I'm sorry to bother you: the KL divergence I get is a negative number, and I do not know why. Is that normal? By the way, the segmentation result is quite good.
```python
Q, tmp1, tmp2 = d.startInference()
for i in range(10):
    print("KL-divergence at {}: {}".format(i, d.klDivergence(Q)))
    d.stepInference(Q, tmp1, tmp2)
```

```
KL-divergence at 0: -3206945.04475
KL-divergence at 1: -3291324.17025
KL-divergence at 2: -3297916.68741
KL-divergence at 3: -3301076.73635
KL-divergence at 4: -3302835.35756
KL-divergence at 5: -3303664.75467
KL-divergence at 6: -3304039.80505
KL-divergence at 7: -3304082.09449
KL-divergence at 8: -3304093.55528
KL-divergence at 9: -3304097.5484
```
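Note that the printed values do decrease monotonically, which is the expected convergence behavior of mean-field inference. A plausible explanation for the negative sign (a hedged guess, not confirmed against the densecrf internals): the reported quantity may omit the constant log-partition term `log Z`, i.e. it may be the variational free energy `E_Q[log Q] + E_Q[E]`, which equals the true KL divergence minus `log Z` and can therefore be negative even though the true KL divergence never is. The toy sketch below (pure NumPy, single variable with 4 labels, all names hypothetical) illustrates this algebra:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "energy" E(x) over 4 labels for a single variable (an assumption for
# illustration; densecrf works per pixel, but the algebra is identical).
E = rng.normal(size=4)

# True Gibbs distribution P(x) = exp(-E(x)) / Z, with partition function Z.
logZ = np.log(np.exp(-E).sum())
P = np.exp(-E - logZ)

# Some approximate distribution Q (here: a random point on the simplex).
Q = rng.dirichlet(np.ones(4))

# Full KL(Q || P) = sum_x Q(x) log(Q(x)/P(x)); always >= 0.
kl_full = np.sum(Q * (np.log(Q) - np.log(P)))

# What a solver that never computes Z could report instead:
# E_Q[log Q] + E_Q[E] = KL(Q || P) - log Z, which may be negative.
kl_reported = np.sum(Q * np.log(Q)) + np.sum(Q * E)

# The two differ exactly by the constant log Z.
assert np.isclose(kl_full, kl_reported + logZ)
assert kl_full >= 0.0
```

Since `log Z` is a constant for a fixed CRF, a value that is only correct up to that constant still decreases by exactly the same amounts per iteration, so the monotone decrease you observe is the meaningful convergence signal, not the absolute sign.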