mblondel / soft-dtw

Python implementation of soft-DTW.
BSD 2-Clause "Simplified" License

Matrix dimensions in Eq 2.5 and Alg. 2 #24

Closed · Maghoumi closed this issue 3 years ago

Maghoumi commented 3 years ago

I was recently re-reading your paper a bit more carefully, and I have a question about the dimensions of some of the matrices.

Specifically, in Eq 2.5, it's not really clear what the dimension of the Jacobian matrix d(delta(x, y)) / dx is expected to be. Below, I've depicted the dimensions of all the matrices involved in the computation, using the paper's notation. The right side of the equation yields p x n (same shape as X), but I don't understand how the dimensions work out for the left side of the equation (the green doodles).

[Image: the equation from the paper, annotated with the dimensions of each matrix]
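
In case the picture is hard to read, this is my transcription of the equation (so please correct me if I've mangled the notation), where E is the matrix returned by Alg. 2:

$$
\nabla_x \, \mathbf{dtw}_\gamma(x, y) \;=\; \left( \frac{\partial \Delta(x, y)}{\partial x} \right)^{\top} E,
$$

with $x \in \mathbb{R}^{p \times n}$, $y \in \mathbb{R}^{p \times m}$, $\Delta(x, y) \in \mathbb{R}^{n \times m}$ and $E \in \mathbb{R}^{n \times m}$, so the result should have shape $p \times n$ (same as $X$). The shape of the Jacobian factor $\partial \Delta(x, y) / \partial x$ is the part I can't pin down.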

Another question: my understanding is that in Alg. 2, the dimensionality of E is n x m, and per the above, the output of Alg. 2 should be p x n (same as X). Am I correct?

Thanks in advance!

mblondel commented 3 years ago

It's a bit tricky because we're dealing with matrix outputs. Δ maps x to a distance matrix, so it is a function from p x n to n x m. Therefore, its Jacobian is a linear map from p x n to n x m, and its transpose is a linear map from n x m to p x n. So the Jacobian transpose of Δ maps a matrix B of size n x m to a gradient of size p x n, as expected. Appendix B.2 of our new paper explains this in more detail (the notation there is different: time series have shape n x d).
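
To make the shape bookkeeping concrete, here is a minimal numpy sketch, assuming Δ is the squared Euclidean distance matrix (the variable names are just for illustration, not this package's API):

```python
import numpy as np

p, n, m = 3, 5, 7            # feature dim, length of x, length of y
X = np.random.randn(p, n)    # time series x, shape (p, n)
Y = np.random.randn(p, m)    # time series y, shape (p, m)
E = np.random.rand(n, m)     # average alignment matrix from Alg. 2, shape (n, m)

# With Delta(x, y)[i, j] = ||x_i - y_j||^2, applying the transposed Jacobian
# (a linear map from n x m to p x n) to E amounts to
#   G[:, i] = sum_j E[i, j] * 2 * (X[:, i] - Y[:, j])
G = 2 * (X * E.sum(axis=1) - Y @ E.T)

print(G.shape)  # (3, 5), i.e. p x n, same shape as X
```

The point is that the transposed Jacobian is never materialized as an explicit matrix; you only ever apply it to an n x m matrix and get back something of shape p x n.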

By the way, I just found your package pytorch-softdtw-cuda; it's great! Could you add the soft-dtw tag to your repo? Thanks!

Maghoumi commented 3 years ago

Thanks for the explanation. This makes much more sense now.

Added the soft-dtw tag to my repo, as requested. Also, if you haven't already, check out DeepNAG, which motivated my CUDA implementation.

Closing this issue.