AslanDing / AutoTCL

AutoTCL and Parametric Augmentation for Time Series Contrastive Learning(ICLR2024)

About the augmentation function and the introduced random noise. #2

Open yy823 opened 2 months ago

yy823 commented 2 months ago

Hi! I'm very interested in your work, but I'm a little confused reading the code. Where are the augmentation function η and the random noise Δv implemented? Correspondingly, where is the H(V*) term in the augmentation network's loss function? I would be very grateful for a reply!

AslanDing commented 2 months ago

Hi! Regarding the first question: $\eta$ is not explicitly implemented in the code because we use random perturbation as the noise, which is added to the features of the first layer. In our paper, you can find this in Section 3.3, in the line above Eq. 3; we followed TS2Vec for this part. As for $H(V^*)$, it is the first term of $L_{pri}$ in Eq. 8.
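For intuition, the TS2Vec-style trick described above (no explicit $\eta(\cdot)$; random noise injected into the first-layer feature map) can be sketched as follows. This is a hypothetical illustration, assuming Gaussian noise; the function name and `noise_std` are my own, not the repository's actual code:

```python
import numpy as np

def perturb_first_layer_features(features, noise_std=0.1, rng=None):
    """Sketch of TS2Vec-style perturbation: instead of an explicit
    augmentation function, random noise is added to the encoder's
    first-layer feature map. `noise_std` is an assumed hyperparameter."""
    rng = np.random.default_rng(rng)
    noise = rng.normal(0.0, noise_std, size=features.shape)
    return features + noise

# toy feature map: (batch, time, channels)
x = np.zeros((2, 8, 4))
y = perturb_first_layer_features(x, noise_std=0.1, rng=0)
print(y.shape)  # perturbation preserves the feature-map shape
```

The key design point is that the perturbation acts in feature space, not on the raw series, so the learned augmentation network stays differentiable end to end.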

yy823 commented 2 months ago

> Hi! Regarding the first question: $\eta$ is not explicitly implemented in the code because we use random perturbation as the noise, which is added to the features of the first layer. In our paper, you can find this in Section 3.3, in the line above Eq. 3; we followed TS2Vec for this part. As for $H(V^*)$, it is the first term of $L_{pri}$ in Eq. 8.

Thank you! I understand the augmentation function now, but where is the loss $H(V^*)$ calculated in the code? In the augmentation loss `aloss = vx_distance + reg_weight * reg_loss + regular_weight * regular`, is it the term `reg_loss`?

AslanDing commented 2 months ago

Yes, it is in AutoTCL_CoST.py, line 148.

yy823 commented 2 months ago

> Yes, it is in AutoTCL_CoST.py, line 148.

I'm sorry. I didn't notice your reply and I just edited my 2nd comment. Could you check it again? Thank you so much!

AslanDing commented 2 months ago

Yes, `reg_loss` corresponds to $H(V^*)$, and `regular` is the temporal loss.
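To make the mapping concrete, here is a minimal sketch of how an entropy-style `reg_loss` standing in for $H(V^*)$ could combine with the other terms of `aloss`. The Gaussian entropy estimator and all names besides those quoted from the thread are illustrative assumptions, and the sign convention may differ from the actual implementation in AutoTCL_CoST.py:

```python
import numpy as np

def gaussian_entropy_estimate(v_star):
    """Rough differential-entropy estimate of H(V*) under a per-dimension
    Gaussian assumption: H = 0.5 * sum(log(2*pi*e*var)). This is only an
    illustrative stand-in for the actual reg_loss computation."""
    var = v_star.var(axis=0) + 1e-8  # small epsilon avoids log(0)
    return 0.5 * np.sum(np.log(2 * np.pi * np.e * var))

def augmentation_loss(vx_distance, reg_loss, regular,
                      reg_weight=1.0, regular_weight=1.0):
    # Mirrors the expression quoted in the thread:
    # aloss = vx_distance + reg_weight * reg_loss + regular_weight * regular
    return vx_distance + reg_weight * reg_loss + regular_weight * regular

# toy augmented views V*: (samples, feature_dims)
v_star = np.random.default_rng(0).normal(size=(100, 4))
h = gaussian_entropy_estimate(v_star)
```

The weighted sum keeps the informativeness term ($H(V^*)$, via `reg_loss`) and the temporal regularizer (`regular`) tunable independently of the distance term.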