lucasb-eyer / pydensecrf

Python wrapper to Philipp Krähenbühl's dense (fully connected) CRFs with Gaussian edge potentials.
MIT License

klDivergence #85

Open LexiYIN opened 5 years ago

LexiYIN commented 5 years ago

I am a beginner, so I am sorry to bother you. The KL-divergence I get is a negative number, and I do not know why. Is that normal? By the way, the segmentation result is quite good.

```python
Q, tmp1, tmp2 = d.startInference()
for i in range(10):
    print("KL-divergence at {}: {}".format(i, d.klDivergence(Q)))
    d.stepInference(Q, tmp1, tmp2)
```

```
KL-divergence at 0: -3206945.04475
KL-divergence at 1: -3291324.17025
KL-divergence at 2: -3297916.68741
KL-divergence at 3: -3301076.73635
KL-divergence at 4: -3302835.35756
KL-divergence at 5: -3303664.75467
KL-divergence at 6: -3304039.80505
KL-divergence at 7: -3304082.09449
KL-divergence at 8: -3304093.55528
KL-divergence at 9: -3304097.5484
```

laitaihu commented 5 years ago

Hello, is this your full code?
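(The snippet in the original post is indeed not self-contained. For reference, a minimal setup in which such a loop typically runs, following the pydensecrf README; the image size, label count, probabilities, and pairwise parameters below are illustrative assumptions:)

```python
import numpy as np
import pydensecrf.densecrf as dcrf
from pydensecrf.utils import unary_from_softmax

H, W, n_labels = 256, 256, 2          # illustrative sizes (assumed)
img = np.zeros((H, W, 3), np.uint8)   # stand-in RGB image

# Stand-in per-pixel class probabilities, shape (n_labels, H, W).
probs = np.full((n_labels, H, W), 1.0 / n_labels, np.float32)

d = dcrf.DenseCRF2D(W, H, n_labels)
d.setUnaryEnergy(unary_from_softmax(probs))
d.addPairwiseGaussian(sxy=3, compat=3)
d.addPairwiseBilateral(sxy=80, srgb=13, rgbim=img, compat=10)

Q, tmp1, tmp2 = d.startInference()
for i in range(10):
    print("KL-divergence at {}: {}".format(i, d.klDivergence(Q)))
    d.stepInference(Q, tmp1, tmp2)
```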

dbuscombe-usgs commented 4 years ago

I also get a negative KL-divergence with every pair of image and unary potentials I test. That isn't normal (a true KL divergence should be non-negative), but I don't know how to fix the (presumably erroneous) code; I assume one of the MatrixXf quantities it accumulates goes negative?

https://github.com/lucasb-eyer/pydensecrf/blob/4d5343c398d75d7ebae34f51a47769084ba3a613/pydensecrf/densecrf/src/densecrf.cpp#L214
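(For what it's worth, a negative value here is consistent with the code computing the KL divergence only up to a constant rather than being buggy. With the Gibbs distribution P(x) = exp(-E(x))/Z, one gets KL(Q‖P) = Σ Q log Q + E_Q[E] + log Z. Since log Z is intractable for a dense CRF, an implementation can only return the first two terms, i.e. KL(Q‖P) - log Z, which can be very negative even though the true KL is non-negative. If that reading of the linked densecrf.cpp line is right, the absolute value is meaningless, but the monotone decrease seen above still indicates that mean-field inference is converging. A toy sketch of the arithmetic, with assumed values:)

```python
import numpy as np

# One variable with three labels; the energies are assumed toy values.
E = np.array([0.5, 1.0, 2.0])
Z = np.exp(-E).sum()
P = np.exp(-E) / Z                      # true Gibbs distribution
Q = np.array([0.7, 0.2, 0.1])           # some mean-field approximation (assumed)

true_kl = np.sum(Q * np.log(Q / P))               # >= 0 by definition
reported = np.sum(Q * np.log(Q)) + np.sum(Q * E)  # KL up to the constant log Z

print(true_kl)               # ~0.052  (non-negative)
print(reported)              # ~-0.052 (negative!)
print(reported + np.log(Z))  # equals true_kl again
```

(Over a whole image the dropped log Z term grows with the number of variables, which could explain offsets on the order of the -3.3e6 seen in the numbers above.)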