-
### Subject of the issue
I think entropy and mutual information would be very useful to include for performing a sensitivity analysis or quantifying the relative impact of nodes in relation to one another…
-
@tankche1 Hi, I also noticed that the P(T_i, E_j) computation uses `probs` and `probs`. I do not understand why `probs` is used to compute `self.MI_task_gate`.
I think `probs.sum(0)` means the frequency of the experts selected b…
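For reference, a minimal sketch of the interpretation above, assuming `probs` is a `(batch, num_experts)` matrix of gate probabilities (the array here is illustrative, not taken from the repository):

```python
import numpy as np

# Hypothetical gate outputs: each row is a softmax over experts for one sample.
probs = np.array([
    [0.7, 0.2, 0.1],
    [0.6, 0.3, 0.1],
    [0.1, 0.8, 0.1],
    [0.2, 0.7, 0.1],
])

# probs.sum(0) sums over the batch dimension: the (unnormalized) expected
# number of times each expert is selected, i.e. its selection frequency.
expert_counts = probs.sum(0)                 # approx [1.6, 2.0, 0.4]
expert_freq = expert_counts / probs.shape[0]  # normalized marginal, approx [0.4, 0.5, 0.1]
```

Under this reading, `probs.sum(0)` (after normalization) is the marginal distribution over experts, which is what a mutual-information term between tasks and gates would need.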
-
I want to implement a benchmark which is used in smile/mine.
It should look like the code below; I guess it should have the same style as `bmi.benchmark.tasks.task_multinormal_dense`,
but when I set rho …
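As a sanity check for such a benchmark: the bivariate standard normal with correlation coefficient rho has the closed-form mutual information I(X;Y) = -0.5 * ln(1 - rho^2) nats. A minimal sketch (the function name is illustrative, not from `bmi`):

```python
import numpy as np

def gaussian_mi(rho):
    """Closed-form mutual information (in nats) of a bivariate
    standard normal with correlation coefficient rho."""
    return -0.5 * np.log(1.0 - rho ** 2)

# MI is 0 at rho = 0 and grows without bound as |rho| -> 1,
# so a benchmark task parameterized by rho can be checked against this.
```

Any histogram- or neural-estimator-based benchmark value for a given rho should converge toward this quantity.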
-
In our samplers, `.mutual_information()` is a callable method; in tasks it is a `@property`. It would be nice for them to be consistent.
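For illustration, one way to reconcile the two interfaces is to expose a property in both classes, so the call sites read identically. This is a hypothetical sketch, not the project's actual class hierarchy:

```python
import math

class Task:
    """Hypothetical task: MI is a stored value, exposed as a property."""
    def __init__(self, mi: float):
        self._mi = mi

    @property
    def mutual_information(self) -> float:
        return self._mi


class Sampler:
    """Hypothetical sampler: MI is computed on access, but wrapped in a
    property so both classes use the same access syntax."""
    def __init__(self, rho: float):
        self.rho = rho

    @property
    def mutual_information(self) -> float:
        # Closed form for a bivariate Gaussian with correlation rho (nats).
        return -0.5 * math.log(1.0 - self.rho ** 2)


# Both now read the same way: obj.mutual_information
```

Whether to standardize on a property or a method is a judgment call; a property fits a quantity with no parameters that is cheap to compute, while a method signals potentially expensive work.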
-
```matlab
MI_den = sum(bsxfun(@times, exp(bsxfun(@rdivide, -abs(bsxfun(@minus, y, centroids)).^2, sigma2)), P_symb), 2); % accumulate MI
MI = -mean(log2(MI_den)) + 1/log(2); % convert to bits…
```
-
To include this, the simplest way would be to use this as a starting point:
http://penglab.janelia.org/proj/mRMR/#c++
NB: I have only skimmed the code for a couple of minutes.
They include a …
-
Hello!
The implementation looks very interesting, and I was looking for a fully differentiable implementation of the mutual information calculation.
I am using 3D images, basically MRI images tha…
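Not the repository's code, but for context: one common fully differentiable formulation replaces hard histogram bins with Gaussian (Parzen) kernel weights, so the joint density, and hence the MI, is smooth in the image intensities. A minimal NumPy sketch of the idea, operating on flattened intensities in [0, 1] (a 3-D volume is simply flattened the same way; ported to an autograd framework, every step here has a gradient):

```python
import numpy as np

def soft_mi(x, y, n_bins=16, sigma=0.05):
    """Differentiable MI estimate (nats) between two intensity arrays in [0, 1].

    Hard bin assignment is replaced by Gaussian kernel weights, so every
    operation is smooth; under autograd (e.g. PyTorch) this yields gradients
    with respect to the input intensities.
    """
    x = np.asarray(x).reshape(-1)
    y = np.asarray(y).reshape(-1)
    centers = np.linspace(0.0, 1.0, n_bins)

    # Soft assignment: (n_samples, n_bins) kernel weights, rows normalized.
    wx = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / sigma) ** 2)
    wx /= wx.sum(1, keepdims=True)
    wy = np.exp(-0.5 * ((y[:, None] - centers[None, :]) / sigma) ** 2)
    wy /= wy.sum(1, keepdims=True)

    joint = wx.T @ wy / len(x)           # soft joint histogram P(X, Y)
    px = joint.sum(1, keepdims=True)
    py = joint.sum(0, keepdims=True)
    eps = 1e-12                          # guard against log(0)
    return float((joint * np.log((joint + eps) / (px @ py + eps))).sum())
```

The bin count and kernel width are the usual trade-off: wider kernels give smoother gradients but blur the estimate.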
-
The extended NMI is between 0 and 1, but the NMI of the example in overlap_nmi.py is 2.60794966304. Could you explain this?
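For reference, with the standard normalizations (by the max, min, arithmetic or geometric mean of the two entropies) NMI cannot exceed 1, because I(X;Y) <= min(H(X), H(Y)); a value like 2.6079 therefore points to either a different definition or an implementation issue. A minimal sketch of a bounded NMI for hard partitions (not the overlapping-community variant the script implements):

```python
import numpy as np

def nmi(labels_a, labels_b):
    """NMI normalized by the arithmetic mean of the entropies; always in [0, 1]."""
    a, ai = np.unique(labels_a, return_inverse=True)
    b, bi = np.unique(labels_b, return_inverse=True)
    joint = np.zeros((len(a), len(b)))
    np.add.at(joint, (ai, bi), 1)        # contingency counts
    joint /= joint.sum()
    pa = joint.sum(1)
    pb = joint.sum(0)
    ha = -(pa[pa > 0] * np.log(pa[pa > 0])).sum()
    hb = -(pb[pb > 0] * np.log(pb[pb > 0])).sum()
    nz = joint > 0
    mi = (joint[nz] * np.log(joint[nz] / np.outer(pa, pb)[nz])).sum()
    return float(mi / (0.5 * (ha + hb))) if ha + hb > 0 else 1.0

print(nmi([0, 0, 1, 1], [1, 1, 0, 0]))  # identical partitions up to relabeling -> 1.0
```

Since MI is bounded by each entropy, the ratio above can never exceed 1 for any pair of hard partitions.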
-
Hi AlyShamahell,
I am working on the mutual information of 3D CT/MR images. I want to know whether your code is suitable for such 3D image data.