tslearn-team / tslearn

The machine learning toolkit for time series analysis in Python
https://tslearn.readthedocs.io

Add keras implementation of softdtw #511

Open Ivorforce opened 8 months ago

Ivorforce commented 8 months ago

Let me know if this looks correct. I tested it, and for two identical input arrays it returns a negative value. I think that's because softmin can return values smaller than its inputs, but I guess that makes sense if it has to be differentiable.
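For context: soft-DTW replaces the hard minimum in the DTW recursion with softmin_gamma(a) = -gamma * log(sum_i exp(-a_i / gamma)), which is always less than or equal to the hard minimum, so even an alignment whose costs are all zero can come out negative. A minimal numpy sketch of that property (just an illustration, not code from this PR):

import numpy as np

def softmin(values, gamma=1.0):
    # Smoothed minimum used by soft-DTW: -gamma * log(sum(exp(-v / gamma))).
    # Always <= min(values); the gap closes as gamma -> 0.
    values = np.asarray(values, dtype=float)
    return -gamma * np.log(np.sum(np.exp(-values / gamma)))

print(min([0.0, 0.0, 0.0]))      # 0.0
print(softmin([0.0, 0.0, 0.0]))  # -log(3) ≈ -1.0986, below the hard minimum

So a negative value for identical inputs seems to be expected behavior rather than a bug.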

Anyway, here's a minimal usage example:

import numpy as np
import keras
from keras import layers

# SoftDTWLoss is the loss class added in this PR.
model = keras.Sequential([
    layers.InputLayer(input_shape=(15, 1)),
])
model.compile(
    optimizer=keras.optimizers.Adam(0.001),
    loss=SoftDTWLoss()
)
history = model.fit(
    np.arange(15, dtype=float)[None, :, None],  # input: 1 series, 15 timesteps, 1 feature
    np.arange(15, dtype=float)[None, :, None],  # target: the identical series
    epochs=6,
)
print(model.predict(np.arange(15, dtype=float)[None, :, None]))

TODO

Ivorforce commented 8 months ago

I added an even shoddier version for the gradient. There is definitely some cleanup left to do, but at least it runs with the custom gradient function now.
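For anyone unfamiliar with the mechanism: a hand-written backward pass in TensorFlow is typically wired up with tf.custom_gradient, where the forward function returns both its value and a closure that computes the gradient. A minimal sketch of that pattern, using a trivial stand-in forward computation (not the actual soft-DTW code from this PR):

import tensorflow as tf

@tf.custom_gradient
def toy_loss(x):
    # Forward pass; stands in for the soft-DTW value computation.
    value = tf.reduce_sum(tf.square(x))

    def grad(upstream):
        # Hand-written backward pass: d/dx sum(x^2) = 2x.
        return upstream * 2.0 * x

    return value, grad

x = tf.Variable([1.0, 2.0, 3.0])
with tf.GradientTape() as tape:
    y = toy_loss(x)
print(tape.gradient(y, x))  # tf.Tensor([2. 4. 6.], ...)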

I am currently running into graph-size issues when trying to use the loss with anything non-trivial, which I think is caused by the DTW computation being so recursion-intensive. I may have to give up on using DTW for my own project unless I can somehow work around that, but I'll leave the PR up either way, for future reference or for other people to fix and use in their projects.
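To make the graph-size problem concrete: the soft-DTW recurrence fills an (n+1) × (m+1) accumulated-cost matrix, so a recursive formulation traced into a TensorFlow graph unrolls into on the order of n·m ops. Here's a minimal iterative numpy version of the recurrence, just to illustrate the DP structure (not this PR's implementation); in TensorFlow, something like tf.while_loop rather than Python-level recursion would presumably be needed to keep the traced graph small:

import numpy as np

def soft_dtw(x, y, gamma=1.0):
    # Iterative DP over the (n+1) x (m+1) accumulated-cost matrix R.
    # Each cell combines its three predecessors with softmin_gamma.
    n, m = len(x), len(y)
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = (x[i - 1] - y[j - 1]) ** 2
            prev = np.array([R[i - 1, j - 1], R[i - 1, j], R[i, j - 1]])
            # softmin, numerically stabilized by shifting with the hard min.
            lo = prev.min()
            R[i, j] = cost + lo - gamma * np.log(np.sum(np.exp(-(prev - lo) / gamma)))
    return R[n, m]

x = np.arange(15, dtype=float)
print(soft_dtw(x, x))  # negative for identical series, as noted above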