arthurmensch / didyprog

Differentiable Dynamic Programming
MIT License

install issue #5

Open ashutoshsaboo opened 5 years ago

ashutoshsaboo commented 5 years ago

How do I install this without sudo, in user mode (similar to pip install --user)?

I tried installing it with python setup.py install --user --prefix= (reference: [here]), but got the following stack trace:

Traceback (most recent call last):
  File "setup.py", line 63, in <module>
    'Operating System :: MacOS'
  File "/home/x86_64-unknown-linux_ol7-gnu/anaconda-5.2.0/envs/pytorch/lib/python3.6/site-packages/numpy/distutils/core.py", line 135, in setup
    config = configuration()
  File "setup.py", line 29, in configuration
    config.add_subpackage('didypro')
  File "/home/x86_64-unknown-linux_ol7-gnu/anaconda-5.2.0/envs/pytorch/lib/python3.6/site-packages/numpy/distutils/misc_util.py", line 1024, in add_subpackage
    caller_level = 2)
  File "/home/x86_64-unknown-linux_ol7-gnu/anaconda-5.2.0/envs/pytorch/lib/python3.6/site-packages/numpy/distutils/misc_util.py", line 986, in get_subpackage
    caller_level = caller_level+1)
  File "/home/x86_64-unknown-linux_ol7-gnu/anaconda-5.2.0/envs/pytorch/lib/python3.6/site-packages/numpy/distutils/misc_util.py", line 768, in __init__
    raise ValueError("%r is not a directory" % (package_path,))
ValueError: 'didypro' is not a directory

Can someone please help me with this? @arthurmensch @lyprince Thanks! :)

lyprince commented 5 years ago

In setup.py, find and replace all instances of 'didypro' with 'didyprog'
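
For reference, a quick way to apply that patch from the repository root (just a sketch; editing setup.py by hand works equally well):

# One-off patch: rename the package references in setup.py from
# 'didypro' to 'didyprog'. The word boundary keeps any existing
# 'didyprog' from becoming 'didyprogg'.
import re

with open('setup.py') as f:
    source = f.read()

with open('setup.py', 'w') as f:
    f.write(re.sub(r'didypro\b', 'didyprog', source))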

ashutoshsaboo commented 5 years ago

Thanks @lyprince, I figured that out already and it installed fine. For usage in PyTorch, I found the methods dtw_value and dtw_grad here: link - but it seems they need a theta array. Is that supposed to be the pairwise distance matrix between the time series in the minibatch, or what is it exactly?

If it's the former, I think the dtaidistance package could compute it with its distance_matrix_fast method (here). But that returns an (n, n) square ndarray, unlike what the docstring of the method mentioned above says:

:param theta: numpy.ndarray, shape = (m, n),
        Distance matrix for DTW

If it's something else, could you suggest how to get the distance matrix? Any suggestions would be really helpful. Thank you! 😄 @lyprince @arthurmensch

mblondel commented 5 years ago

reference/dtw.py is a NumPy implementation; a PyTorch implementation is not available yet. theta should be an m x n distance matrix, where m and n are the lengths of the two time series.
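
To make that concrete, here is a minimal sketch of how theta could be built for a single pair of time series; the dtw_value call at the end is only indicative, since the exact import path and signature in this repo may differ:

import numpy as np
from scipy.spatial.distance import cdist

# Two time series with m = 50 and n = 70 frames, 8 features per frame.
rng = np.random.RandomState(0)
x = rng.randn(50, 8)
y = rng.randn(70, 8)

# theta[i, j] = cost of aligning frame i of x with frame j of y,
# which matches the documented shape (m, n).
theta = cdist(x, y, metric='sqeuclidean')

# Hypothetical usage; adjust the import to wherever reference/dtw.py
# lives in your install:
# from didyprog.reference.dtw import dtw_value, dtw_grad
# value = dtw_value(theta)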

ashutoshsaboo commented 5 years ago

@mblondel again, my question here (link) was specific to batching. If you have an [n, l] batch of time series in a neural network, where n is the number of time series in the batch and l is the length of each series (assuming constant length), then sklearn's euclidean pairwise distances give you an n x n distance matrix, similar to dtaidistance's method here. Does it make sense to feed that distance matrix to the dtw_grad method in this repo? Is this loss function actually meant for that kind of batching application? Would be great if you could help!

Thank you so much! @mblondel @arthurmensch @lyprince :)
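
For what it's worth, going by the answer above, theta is a frame-level (m, n) matrix for one pair of series rather than a series-level (n, n) matrix for the whole batch, so each pair in a batch would need its own theta. A rough sketch under that assumption (same hypothetical dtw_value import as above):

import numpy as np
from scipy.spatial.distance import cdist

# Hypothetical batch: n = 16 series, l = 100 frames, 4 features each.
batch = np.random.randn(16, 100, 4)

# DTW aligns two series at a time, so every pair (i, j) gets its own
# frame-level (l, l) distance matrix rather than one (n, n) matrix.
for i in range(len(batch)):
    for j in range(i + 1, len(batch)):
        theta = cdist(batch[i], batch[j], metric='sqeuclidean')
        # value = dtw_value(theta)  # hypothetical call, see above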