Closed: Caedin closed this issue 3 years ago.
Yes, I think it was done on purpose to make sure the TCN can go back "in time" to the beginning of the sequence to solve the problem.
If we remove this line, the network will not be able to solve the task.
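For context, here is a minimal sketch (not the exact README code; the function name get_x_y and the timesteps value are assumptions) of the kind of data generator that first example uses. The positive label is planted only at the very first timestep, so the network has to carry that bit of information across the entire sequence, which is exactly what the line in question is for.

import numpy as np

def get_x_y(size=1000, timesteps=64):
    # Half of the sequences are "positive": they carry a 1.0 at timestep 0.
    pos_indices = np.random.choice(size, size=size // 2, replace=False)
    x_train = np.zeros(shape=(size, timesteps, 1))
    y_train = np.zeros(shape=(size, 1))
    x_train[pos_indices, 0] = 1.0  # plant the signal at the first timestep (the line in question)
    y_train[pos_indices, 0] = 1.0  # the label matches the planted signal
    return x_train, y_train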
I've updated the README to clarify it: https://github.com/philipperemy/keras-tcn/blob/master/README.md
Thanks for reporting :)
Describe the bug: The first example in the documentation gives away the answer by inserting the result into the x_train dataset.
I got perfect prediction results after only a few training epochs on my first run, though.
x_train[pos_indices, 0] = 1.0
Removing this line creates a training curve that makes sense.
Following the example as written: (screenshot of the training curve)
After removing the offending line: (screenshot of the training curve)
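For reference, here is a hedged sketch of how one might reproduce both runs, assuming the usual keras-tcn Sequential usage (from tcn import TCN) and the generator sketched above; the PLANT_SIGNAL flag and all hyperparameters here are illustrative, not the README's exact values.

import numpy as np
from tensorflow.keras.layers import Dense
from tensorflow.keras.models import Sequential
from tcn import TCN

PLANT_SIGNAL = True  # set to False to mimic removing x_train[pos_indices, 0] = 1.0
size, timesteps = 1000, 64
pos_indices = np.random.choice(size, size=size // 2, replace=False)
x_train = np.zeros(shape=(size, timesteps, 1))
y_train = np.zeros(shape=(size, 1))
if PLANT_SIGNAL:
    x_train[pos_indices, 0] = 1.0  # signal the positive class at timestep 0
y_train[pos_indices, 0] = 1.0

model = Sequential([
    TCN(input_shape=(timesteps, 1)),  # TCN layer with default hyperparameters
    Dense(1, activation='sigmoid'),
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
# With PLANT_SIGNAL=True the model can learn to look back to timestep 0;
# with PLANT_SIGNAL=False the labels are independent of the inputs, so
# accuracy should hover near chance (~0.5).
model.fit(x_train, y_train, epochs=10, validation_split=0.2)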