RuABraun opened this issue 3 years ago
I want to confirm that the issues I'm experiencing are a fundamental property of the loss and not of my implementation (which is a slight modification of this).

It seems to me that because the final loss aggregates (soft-min) over all alignment paths, and every path passes through the (0, 0) entry of the cost matrix, changing the (0, 0) entry causes a much larger change in the loss than changing a later entry: (0, 0) influences every other entry of the accumulated cost matrix. Some simple test cases seem to confirm this. Can someone else confirm?
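To illustrate what I mean, here is a minimal NumPy sketch of the standard soft-DTW recursion, independent of this repo's CUDA/PyTorch implementation (the names `soft_dtw` and `softmin`, the choice of `gamma=1.0`, and the random 10x10 cost matrix are just placeholders for the test). Perturbing a single cost-matrix entry with a finite difference gives the effective gradient of the loss with respect to that entry:

```python
import numpy as np

def softmin(values, gamma):
    """Smoothed minimum: -gamma * logsumexp(-values / gamma)."""
    z = -np.asarray(values) / gamma
    m = z.max()
    return -gamma * (m + np.log(np.exp(z - m).sum()))

def soft_dtw(D, gamma=1.0):
    """Soft-DTW loss for a pairwise cost matrix D of shape (N, M)."""
    N, M = D.shape
    R = np.full((N + 1, M + 1), np.inf)  # accumulated cost, padded with inf
    R[0, 0] = 0.0
    for i in range(1, N + 1):
        for j in range(1, M + 1):
            R[i, j] = D[i - 1, j - 1] + softmin(
                [R[i - 1, j], R[i, j - 1], R[i - 1, j - 1]], gamma
            )
    return R[N, M]

# Perturb one cost entry and measure how much the loss moves.
rng = np.random.default_rng(0)
D = rng.random((10, 10))
eps = 1e-3

base = soft_dtw(D)
for idx in [(0, 0), (5, 3)]:
    Dp = D.copy()
    Dp[idx] += eps
    print(idx, (soft_dtw(Dp) - base) / eps)  # ~ dLoss / dD[idx]
```

If my understanding is right, the gradient with respect to D[i, j] is the expected alignment weight of cell (i, j), which is exactly 1 at (0, 0) because every monotone path starts there, so the finite difference for (0, 0) should print roughly 1.0 while an interior entry like (5, 3) prints a noticeably smaller value.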
Have you found a more robust and correct soft-DTW implementation? I have tried this one, but the GPU runs out of memory and I can only use a batch size of 1.