-
### Subject of the issue
I think entropy and mutual information would be very useful to include for performing a sensitivity analysis or for quantifying the relative impact of nodes on one another…
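To make the idea concrete, here is a hypothetical sketch of ranking nodes by their mutual information with a target node. The `samples` array, the binning scheme, and the function names are all assumptions for illustration; only the use of MI itself comes from the comment above.

```python
# Hypothetical sketch: rank nodes by mutual information with a target node.
# `samples` is assumed to be an (n_samples, n_nodes) array of observed node
# values; the names and binning scheme are illustrative, not an existing API.
import numpy as np
from sklearn.metrics import mutual_info_score

def discretize(x, bins=16):
    """Bin a continuous variable so MI can be estimated from counts."""
    edges = np.histogram_bin_edges(x, bins=bins)
    return np.digitize(x, edges[1:-1])

def node_mi_scores(samples, target_idx, bins=16):
    """MI (in nats) between each node and the target node."""
    target = discretize(samples[:, target_idx], bins)
    return {
        j: mutual_info_score(target, discretize(samples[:, j], bins))
        for j in range(samples.shape[1]) if j != target_idx
    }

rng = np.random.default_rng(0)
samples = rng.normal(size=(1000, 5))
samples[:, 1] += 0.8 * samples[:, 0]   # node 1 depends on node 0
print(node_mi_scores(samples, target_idx=0))
```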
-
@tankche1 Hi, I also noticed that the P(T_i, E_j) computation uses probs and probs. I do not understand using probs to compute self.MI_task_gate.
I think probs.sum(0) means the frequency of the experts selected b…
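For reference, a minimal sketch of how a joint P(T_i, E_j) and the resulting mutual information could be formed from gating probabilities. The tensor shapes, the uniform task prior, and the name `task_expert_mi` are assumptions based on this comment's reading of `probs.sum(0)`, not the repository's actual `MI_task_gate` code.

```python
# Hedged sketch of a task/expert mutual-information term.
# Assumption: each entry of `probs_per_task` is an (n_i, num_experts)
# matrix of gating softmax outputs for one task, with a uniform P(T_i).
import torch

def task_expert_mi(probs_per_task):
    num_tasks = len(probs_per_task)
    # P(E_j | T_i): average gate probability per expert within each task.
    p_e_given_t = torch.stack([p.mean(0) for p in probs_per_task])
    p_t = torch.full((num_tasks, 1), 1.0 / num_tasks)   # uniform P(T_i)
    joint = p_t * p_e_given_t                           # P(T_i, E_j)
    p_e = joint.sum(0, keepdim=True)                    # marginal P(E_j)
    # MI = sum_ij P(T_i, E_j) * log( P(T_i, E_j) / (P(T_i) P(E_j)) )
    return (joint * (joint / (p_t * p_e)).clamp_min(1e-12).log()).sum()
```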
-
```matlab
% Accumulate the MI denominator: a Gaussian likelihood per centroid,
% weighted by the symbol prior P_symb, then summed over symbols.
MI_den = sum(bsxfun(@times, exp(bsxfun(@rdivide, -abs(bsxfun(@minus, y, centroids)).^2, sigma2)), P_symb), 2);
MI = -mean(log2(MI_den)) + 1/log(2); % convert to bits…
```
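A hedged NumPy translation of the snippet above, assuming `y` holds received samples, `centroids` the constellation points, `sigma2` the noise variance, and `P_symb` the symbol priors, with shapes chosen so broadcasting matches the `bsxfun` calls:

```python
# Hedged translation of the MATLAB snippet; shapes are assumptions:
# y is (n, 1), centroids is (1, M), P_symb is (1, M), sigma2 is scalar.
import numpy as np

def mi_bits(y, centroids, sigma2, P_symb):
    # Gaussian likelihood of each sample under each centroid, prior-weighted.
    lik = np.exp(-np.abs(y - centroids) ** 2 / sigma2) * P_symb
    MI_den = lik.sum(axis=1)                           # sum over symbols
    return -np.mean(np.log2(MI_den)) + 1 / np.log(2)   # convert to bits
```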
-
To include this, the simplest way would be to use this as a starting point:
http://penglab.janelia.org/proj/mRMR/#c++
NB: I have only skimmed the code for a couple of minutes....
They include a …
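For orientation, the criterion that code implements can be sketched in a few lines. This is only the mRMR idea (relevance minus average redundancy, both measured as mutual information), not a reading of the Peng lab C++ sources; all names are illustrative and the features are assumed pre-discretized.

```python
# Hedged sketch of the mRMR criterion (relevance minus redundancy form).
# Assumptions: X holds pre-discretized integer feature columns, y the labels.
import numpy as np
from sklearn.metrics import mutual_info_score

def mrmr(X, y, k):
    n_features = X.shape[1]
    relevance = [mutual_info_score(y, X[:, j]) for j in range(n_features)]
    selected = [int(np.argmax(relevance))]      # start with most relevant
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # Average MI with already-selected features penalizes redundancy.
            redundancy = np.mean([mutual_info_score(X[:, j], X[:, s])
                                  for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected
```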
-
The extended NMI is supposed to be between 0 and 1, but the NMI of the example in overlap_nmi.py is 2.60794966304. Could you explain this?
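For comparison, the standard (non-overlapping) normalizations do stay in [0, 1]; a quick sanity check with scikit-learn, which uses sklearn's definition rather than the extended NMI in overlap_nmi.py:

```python
# Sanity check: standard NMI is bounded in [0, 1] for hard partitions.
# This is scikit-learn's definition, not overlap_nmi.py's extended NMI.
from sklearn.metrics import normalized_mutual_info_score

a = [0, 0, 1, 1, 2, 2]
print(normalized_mutual_info_score(a, a))                    # 1.0 (identical)
print(normalized_mutual_info_score(a, [0, 1, 2, 0, 1, 2]))   # < 1.0
```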
-
Hi AlyShamahell,
I am working on mutual information for 3D CT/MR images. I want to know whether your code is suitable for such 3D image data.
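Whether that particular code handles volumes is for the author to confirm, but histogram-based MI itself is dimension-agnostic once the voxels are flattened; a hedged sketch with illustrative names and bin count:

```python
# Hedged sketch: histogram-based MI between two aligned 3D volumes.
# MI over intensities only needs flattened voxel pairs, so the 3D shape
# itself is not an obstacle; array names and bin count are illustrative.
import numpy as np

def volume_mi(vol_a, vol_b, bins=32):
    hist, _, _ = np.histogram2d(vol_a.ravel(), vol_b.ravel(), bins=bins)
    p_xy = hist / hist.sum()
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    nz = p_xy > 0   # skip empty cells: 0 * log(0) is taken as 0
    return float((p_xy[nz] * np.log(p_xy[nz] / (p_x @ p_y)[nz])).sum())

ct = np.random.rand(64, 64, 64)
mr = 0.5 * ct + 0.5 * np.random.rand(64, 64, 64)
print(volume_mi(ct, mr))
```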
-
There is a new paper, http://arxiv.org/abs/1601.00372, which tries to improve NMT by improving mutual information, i.e. not just modeling P(y|x) but also modeling P(x|y). Now, it is not possibl…
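In the spirit of that paper, the reranking step can be sketched as scoring n-best candidates with a convex combination of both directions. The λ weight and the two scoring functions below are placeholders, not the paper's actual implementation:

```python
# Hedged sketch of bidirectional reranking: pick the n-best candidate that
# maximizes a mix of log P(y|x) and log P(x|y). The scoring callables and
# `lam` are placeholder assumptions, not the paper's decoder internals.
def mmi_rerank(source, candidates, log_p_y_given_x, log_p_x_given_y, lam=0.5):
    def score(y):
        return ((1 - lam) * log_p_y_given_x(y, source)
                + lam * log_p_x_given_y(source, y))
    return max(candidates, key=score)
```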
-
As an alternative to the current method of non-hashtag sentiment clustering, we can try computing pointwise mutual information scores on word bigrams. For non-stopwords, we can assess what the point…
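As a starting point, a minimal sketch of PMI over adjacent word pairs; the tokenization, `min_count` cutoff, and names are assumptions, not the project's existing pipeline:

```python
# Hedged sketch of PMI scoring for word bigrams.
# PMI(w1, w2) = log2( P(w1, w2) / (P(w1) * P(w2)) ).
import math
from collections import Counter

def bigram_pmi(tokens, min_count=2):
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    n_uni, n_bi = sum(unigrams.values()), sum(bigrams.values())
    scores = {}
    for (w1, w2), c in bigrams.items():
        if c < min_count:          # drop rare pairs; PMI is noisy for them
            continue
        p_xy = c / n_bi
        p_x, p_y = unigrams[w1] / n_uni, unigrams[w2] / n_uni
        scores[(w1, w2)] = math.log2(p_xy / (p_x * p_y))
    return scores
```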
-
Hello!
The implementation looks very interesting and I was looking for a fully differentiable implementation of the mutual information calculation.
I am using 3D images, basically MRI images tha…
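One common route to a fully differentiable MI is a Gaussian soft-histogram (Parzen window) estimate of the joint density, which works on flattened 3D volumes. A hedged PyTorch sketch, assuming intensities normalized to [0, 1]; the bin count and bandwidth are illustrative, not taken from the implementation under discussion:

```python
# Hedged sketch: differentiable MI via Gaussian soft histograms (Parzen
# windows). Assumes intensities in [0, 1]; `bins` and `sigma` are assumed
# hyperparameters, not taken from the repository under discussion.
import torch

def soft_hist_mi(x, y, bins=32, sigma=0.05, eps=1e-10):
    x, y = x.reshape(-1, 1), y.reshape(-1, 1)           # (N, 1) voxel pairs
    centers = torch.linspace(0, 1, bins, device=x.device).reshape(1, -1)
    # Soft assignment of each voxel to histogram bins (differentiable).
    wx = torch.softmax(-(x - centers) ** 2 / (2 * sigma ** 2), dim=1)
    wy = torch.softmax(-(y - centers) ** 2 / (2 * sigma ** 2), dim=1)
    p_xy = wx.t() @ wy / x.shape[0]                     # joint soft histogram
    p_x, p_y = p_xy.sum(1, keepdim=True), p_xy.sum(0, keepdim=True)
    return (p_xy * torch.log(p_xy / (p_x @ p_y) + eps)).sum()

mri_a = torch.rand(8, 8, 8, requires_grad=True)
mri_b = torch.rand(8, 8, 8)
mi = soft_hist_mi(mri_a, mri_b)
mi.backward()   # gradients flow back to mri_a
```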