un-lock-me opened this issue 5 years ago
Sorry to take so long to respond. `k` is one of the arguments for `ee.entropy`: `def entropy(x, k=3, base=2)`.
The default is 3, but you clearly have more than 3 samples. I'm guessing the issue has to do with using a TensorFlow tensor. I don't think that will work, for a variety of reasons. The most fundamental one is that the k-nearest-neighbor library I use with numpy probably won't work on tensors.
I'd be interested to know whether it works if you convert the input first, e.g. `x = np.array( same matrix )`.
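For context on why a plain ndarray is needed: as I understand it, `ee.entropy` follows the Kozachenko–Leonenko kNN estimator, which sorts pairwise distances to find each point's k-th nearest neighbor. Here is a minimal pure-NumPy sketch of that estimator (brute-force distances, integer-only digamma helper; the function names are mine, not the library's):

```python
import numpy as np

def digamma_int(n):
    # psi(n) for a positive integer n: -gamma + sum_{i=1}^{n-1} 1/i
    euler_gamma = 0.5772156649015329
    return -euler_gamma + sum(1.0 / i for i in range(1, n))

def kl_entropy(x, k=3, base=2):
    # Kozachenko-Leonenko kNN entropy estimate for an (n_samples, n_dims)
    # array; this is a sketch of the same idea as ee.entropy, not its code.
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # brute-force pairwise Euclidean distances, then sort each row
    dists = np.sqrt(((x[:, None, :] - x[None, :, :]) ** 2).sum(-1))
    dists.sort(axis=1)
    eps = dists[:, k]  # column 0 is the distance to the point itself
    const = digamma_int(n) - digamma_int(k) + d * np.log(2)
    return (const + d * np.mean(np.log(eps))) / np.log(base)
```

The sort over concrete distance values is exactly the step that has no direct symbolic-tensor equivalent, which is why converting to `np.array` first (or calling `.numpy()` on an eager TensorFlow tensor) is the easiest route.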
Thank you so much for getting back to me with this issue.
Actually, I will be able to convert the code to TensorFlow so that it works on tensors.
The only issue is that I need to calculate the entropy over each row, i.e. the entropy of a single row of my tensor.
Taking this into account, and looking at most KNN implementations for continuous entropy, they consider the whole matrix.
Now I'm stuck, as I don't know how to change the code to apply KNN one row at a time. I did implement it, but it raises an error because it compares each row with the other rows (please correct me if I'm wrong).
I'd appreciate it if you could share your thoughts on this.
Thanks~
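One way to read "entropy per row" is that each row holds a set of scalar samples of its own distribution. Under that assumption, a simple workaround is to reshape each row into an `(n_samples, 1)` column and call the estimator on it separately, so kNN never compares one row with another. A sketch (the helper name `row_entropies` is mine; `entropy_fn` stands in for `ee.entropy` or any estimator with the same `(x, k)` signature):

```python
import numpy as np

def row_entropies(matrix, entropy_fn, k=3):
    # Treat each row of `matrix` as an independent set of 1-D samples
    # and estimate entropy row by row. `entropy_fn` is assumed to take
    # an (n_samples, n_dims) array, like ee.entropy does.
    out = []
    for row in np.asarray(matrix, dtype=float):
        samples = row.reshape(-1, 1)  # (n_samples, 1) column vector
        out.append(entropy_fn(samples, k=k))
    return np.array(out)
```

Note that each row must contain more than `k` samples, and kNN estimates on short rows will be noisy.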
I'm not sure if it's relevant, but I have seen entropy estimators based on pairwise distances that can be implemented as differentiable expressions in tensorflow. For instance, Eq. 10 of this paper: https://arxiv.org/pdf/1705.02436.pdf. I think Artemy has a few papers discussing that "mixture of Gaussians" estimator for entropy and mutual information.
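To illustrate the pairwise-distance idea (this is a generic kernel-based estimate in the spirit of those mixture-of-Gaussians estimators, not necessarily the paper's Eq. 10; the bandwidth `sigma` and function name are my choices): model the data as a uniform mixture of Gaussians centered at the samples and estimate H ≈ -E[log p(x)]. Every operation used below (squared distances, exp, log-sum-exp, mean) has a TensorFlow counterpart, so the same expression is differentiable end to end.

```python
import numpy as np

def pairwise_gaussian_entropy(x, sigma=1.0):
    # Entropy estimate (in nats) from pairwise distances: place an
    # isotropic Gaussian of width sigma on every sample, then average
    # the negative log-likelihood of each sample under that mixture.
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    sq = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)  # ||x_i - x_j||^2
    log_kernel = -sq / (2.0 * sigma**2)                  # log N up to a const
    # numerically stable log-sum-exp over j, plus normalizing constants
    m = log_kernel.max(axis=1, keepdims=True)
    log_p = (m.squeeze(1)
             + np.log(np.exp(log_kernel - m).sum(axis=1))
             - np.log(n)
             - 0.5 * d * np.log(2.0 * np.pi * sigma**2))
    return -log_p.mean()
```

Unlike the kNN route, there is no sort over concrete values here, so porting this expression to TensorFlow ops would give gradients for free.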
Hi,
Thanks for sharing your work. I want to use the continuous entropy estimator from your project in mine.
I have a matrix like this:
When I apply `ee.entropy`, I receive this error:

This is my code: