oneday88 / deepTCN


How do you implement Causality in Temporal Convolutional Network (TCN)? #7

Open reachsumit opened 1 year ago

reachsumit commented 1 year ago

Thanks for sharing this work. I noticed that in ResidualTCN the data isn't being masked (https://github.com/oneday88/deepTCN/blob/master/traffic/MxnetModels/pointModels.py#L11-L18), whereas masking is commonly done in other TCN implementations to ensure that there can be no leakage from the future into the past. For example:
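Here is a minimal sketch of the usual "pad on the left, then chop off the extra steps" trick, written against MXNet Gluon since this repo is MXNet-based (my own illustration, not code from any particular repository):

```python
from mxnet import nd
from mxnet.gluon import nn

class CausalConv1D(nn.Block):
    """Dilated 1-D convolution that can only see the past.

    Conv1D pads (kernel_size - 1) * dilation zeros on both sides; the extra
    trailing outputs (which would peek into the future) are then cut off, so
    the output at time t depends only on inputs at times <= t.
    """
    def __init__(self, channels, kernel_size, dilation=1, **kwargs):
        super().__init__(**kwargs)
        self.cut = (kernel_size - 1) * dilation
        with self.name_scope():
            self.conv = nn.Conv1D(channels, kernel_size,
                                  dilation=dilation, padding=self.cut)

    def forward(self, x):
        # x has layout (batch, channels, time)
        out = self.conv(x)
        return out[:, :, :-self.cut] if self.cut > 0 else out

net = CausalConv1D(channels=8, kernel_size=3, dilation=2)
net.initialize()
x = nd.random.normal(shape=(4, 1, 168))  # e.g. 168 past time steps
print(net(x).shape)                      # (4, 8, 168): same length, no future leakage
```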

How does your code ensure causality without masking? Is it done in data preprocessing instead?

oneday88 commented 1 year ago

Hi, it is a great question! You are right, the data would be better masked.

I have just double-checked the data preprocessing code, and it seems fine. For example, for the traffic dataset (https://github.com/oneday88/deepTCN/blob/master/traffic/trafficModelPrepare.py), given the time series of one chosen station, a rolling-window approach with a moving step of 24 and a total window length of 192 (input = 168, output = 24) is applied to construct the training and testing datasets. This yields 7 testing sequences, and the testing data do not appear in the training data, so there is no data leakage. However, if someone changes the moving step, e.g. to 12, there is a risk of leaking 12 data points if the data is not masked.
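To make the window construction concrete, here is a rough NumPy sketch of the rolling split described above (the names are made up for illustration; the actual preprocessing lives in trafficModelPrepare.py):

```python
import numpy as np

def rolling_windows(series, input_len=168, output_len=24, step=24):
    """Cut one station's series into (input, output) windows with the given moving step."""
    total = input_len + output_len
    windows = []
    for start in range(0, len(series) - total + 1, step):
        x = series[start:start + input_len]          # encoder history
        y = series[start + input_len:start + total]  # forecast target
        windows.append((x, y))
    return windows

series = np.arange(168 + 24 * 7)              # toy series: 336 points
print(len(rolling_windows(series, step=24)))  # 7 windows
print(len(rolling_windows(series, step=12)))  # 13 windows

# With step=24 the target segments of consecutive windows tile the series with
# no overlap, so cutting train/test along window boundaries keeps every test
# target out of the training windows.  With step=12 the targets of adjacent
# windows overlap by 12 points, which is exactly the 12-point leakage risk
# mentioned above if nothing is masked.
```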

In our original approach, we simply split the data into training and testing parts and apply the trained model to predict all 7 testing series at the same time, which reduces this risk. Sometimes users may also want to utilize the testing data to make better predictions one window at a time, which increases the risk of data leakage.

I will release an updated version of the code when I get some free time.