msmbuilder / mdentropy

Analyze correlated motions in MD trajectories with only a few lines of Python.
MIT License

Problem install MDentropy #60

Closed tharpole-chwy closed 7 years ago

tharpole-chwy commented 7 years ago

When I try to install MDentropy using Anaconda, I get: PackageNotFoundError: Package missing in current linux-64 channels

When I try installing MDentropy using pip, I am told that the install is successful, and I see a folder called mdentropy-0.2.dist-info in my site-packages. However, upon trying the example I get an import error: ImportError: No module named mdentropy

Any suggestions on how to install correctly? I also tried Anaconda on macOS.

cxhernandez commented 7 years ago

The installation instructions in the docs are for the "development" version (0.3), which has yet to be released. This is why it's not available on Anaconda yet.

I'm not sure why pip install mdentropy isn't working for you, but I would suggest trying one of the other methods found in the docs.

rupeshagarwal commented 7 years ago

from mdentropy.metrics import DihedralMutualInformation
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "build/bdist.linux-x86_64/egg/mdentropy/__init__.py", line 1, in <module>
  File "build/bdist.linux-x86_64/egg/mdentropy/core/__init__.py", line 1, in <module>
  File "/usr/local/lib/python2.7/dist-packages/mdentropy-0.1-py2.7.egg/mdentropy/core/entropy.py", line 89
    def kde_entropy(rng, *args, grid_size=20, **kwargs):
                              ^
SyntaxError: invalid syntax

cxhernandez commented 7 years ago

@ragarwa4: MDEntropy is written for Python 3.4+. You'll have to upgrade from Python 2.7 in order to use it.
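For context, the SyntaxError above comes from Python-3-only syntax: parameters declared after *args are keyword-only, which Python 2 cannot parse. A minimal sketch (with a stub body for illustration, not mdentropy's actual implementation):

```python
# Keyword-only arguments: `grid_size` follows *args, so it can only be
# passed by keyword. Python 2 raises SyntaxError on this `def` line.
def kde_entropy(rng, *args, grid_size=20, **kwargs):
    # Stub body for illustration; the real function estimates entropy.
    return grid_size

print(kde_entropy(None))                      # 20 (default)
print(kde_entropy(None, 1, 2, grid_size=50))  # 50 (passed by keyword)
```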

rupeshagarwal commented 7 years ago

Thank you for the quick response. I need some help: what should the input for mdentropy.entropy be? Can you give me an example? I am trying to perform mutual information calculations for RMSD.

Thanks in advance

cxhernandez commented 7 years ago

I need some help: what should the input for mdentropy.entropy be? Can you give me an example?

Please consult the documentation for mdentropy.entropy.

I am trying to perform mutual information calculations for RMSD.

You can use mdentropy.mutinf for this calculation, once you've calculated the two RMSD distributions you want to analyze.
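For intuition only, here is a self-contained sketch of mutual information between two 1-D series, using a plain histogram estimate with NumPy. This is not mdentropy's estimator (mdentropy.mutinf offers methods such as knn); the function name and binning choices below are illustrative assumptions:

```python
import numpy as np

def hist_mutual_information(x, y, n_bins=24):
    """Histogram estimate of I(X; Y) = sum p(x,y) log(p(x,y)/(p(x)p(y))), in nats."""
    joint, _, _ = np.histogram2d(x, y, bins=n_bins)
    pxy = joint / joint.sum()          # joint probabilities
    px = pxy.sum(axis=1)               # marginal over y
    py = pxy.sum(axis=0)               # marginal over x
    nz = pxy > 0                       # sum only occupied bins, avoiding log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(0)
a = rng.normal(size=5000)
b = a + 0.1 * rng.normal(size=5000)   # strongly dependent on a
c = rng.normal(size=5000)             # independent of a

print(hist_mutual_information(a, b))  # large: high dependence
print(hist_mutual_information(a, c))  # near zero: independence
```

The same idea applies to two RMSD time series: bin each series, then compare the joint distribution against the product of the marginals.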

rupeshagarwal commented 7 years ago

args : numpy.ndarray, shape = (n_samples, ) or (n_samples, n_dims)

Data of which to calculate entropy. Each array must have the same number of samples.

This is confusing. Is it one 1D array or multiple arrays? Does it need to be in a particular order? Is it a probability distribution?


cxhernandez commented 7 years ago

This is confusing. It is one 1D array or multiple arrays?

Either. Entropy can be calculated for high-dimensional input and, as such, this function accepts an arbitrary number of numpy.ndarray objects as input, as long as n_samples is consistent.

Does it need to be in a particular order?

Entropy calculations are agnostic to variable order (e.g. H(X, Y) == H(Y, X)). Try it for yourself; the result should be the same.
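A quick numerical check of that symmetry, using a simple histogram estimate of joint entropy (an illustration, not mdentropy's internal estimator):

```python
import numpy as np

def hist_joint_entropy(x, y, n_bins=16):
    """Histogram estimate of the joint entropy H(X, Y) in nats."""
    counts, _, _ = np.histogram2d(x, y, bins=n_bins)
    p = counts[counts > 0] / counts.sum()  # occupied bins only
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(1)
x = rng.normal(size=2000)
y = rng.uniform(size=2000)

# Swapping the arguments transposes the histogram but leaves the set of
# bin probabilities unchanged, so the estimate is identical.
print(np.isclose(hist_joint_entropy(x, y), hist_joint_entropy(y, x)))  # True
```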

Is it probability distribution?

It's your data.

rupeshagarwal commented 7 years ago

Thank you for your reply.

When I am trying to run centropy with knn using bin size 9, it's giving me this error:

/usr/local/lib/python3.5/dist-packages/mdentropy-0.1-py3.5.egg/mdentropy/core/entropy.py:124: RuntimeWarning: divide by zero encountered in log

Also to calculate joint entropy, can I just write this:

mdentropy.entropy(9,[None],'knn',X,Y)

cxhernandez commented 7 years ago

When I am trying to run centropy with knn using bin size 9, its giving me this error

I've found that running with k set to 3 is usually sufficient for the knn estimator. With k set to 9, it might be that the density is estimated as being very low in certain regions of your data, resulting in NaNs.
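The RuntimeWarning itself is NumPy's standard behavior when log is applied to zero: a zero-density region maps to -inf. A minimal reproduction, independent of mdentropy:

```python
import numpy as np

# If the estimator assigns zero density anywhere, taking its log emits
# "RuntimeWarning: divide by zero encountered in log" and yields -inf.
density = np.array([0.25, 0.0, 0.75])
logp = np.log(density)  # the zero entry becomes -inf
print(logp)
```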

Also to calculate joint entropy, can I just write this

Yup! This should work if X and Y have similar dimensionality and an equal number of samples.