yabufarha / ms-tcn


Question about the dilated residual layer #33

Open JustinYuu opened 3 years ago

JustinYuu commented 3 years ago

Hello, the dilated residual layer in Fig. 2 includes a 1×1 convolution after the ReLU activation. However, I cannot find an explanation of the role of this 1×1 convolution. Is it there to introduce more parameters and improve the expressiveness of the TCN?
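For reference, a 1×1 convolution over a (channels, frames) feature map is just a per-frame linear mixing of channels; unlike the dilated 3×1 convolution, it has no temporal receptive field. A minimal numpy sketch of the ReLU → 1×1 conv step from Fig. 2 (function and variable names are hypothetical, not from the repo):

```python
import numpy as np

def conv1x1(x, w, b):
    # x: (C_in, T) feature map, w: (C_out, C_in), b: (C_out,)
    # A 1x1 temporal convolution applies the same linear map
    # independently at every time step: it mixes channels only.
    return w @ x + b[:, None]

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 10))    # 64 channels, 10 frames
w = rng.standard_normal((64, 64))    # channel-mixing weights
b = np.zeros(64)

y = conv1x1(np.maximum(x, 0.0), w, b)  # ReLU, then 1x1 conv
assert y.shape == (64, 10)             # temporal length unchanged
```

Since it operates per frame, its effect is to recombine the features produced by the dilated convolution (and to let the layer map back to the channel width expected by the residual connection), rather than to enlarge the temporal context.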