Closed · d-01 closed this issue 2 years ago
Sorry for bothering, it was my mistake.
While testing, the ReLU activation should be disabled (`activation=None`). The output will be:
```
k=2, dilations=(1, 2), receptive_field=7
0. input: 9 8 7 6 5 4 3 2 1; output: 12.6189
1. input: 9 8 7 6 5 4 3 2 0; output: 13.6868 *
2. input: 9 8 7 6 5 4 3 0 1; output: 12.3591 *
3. input: 9 8 7 6 5 4 0 2 1; output: 15.2741 *
4. input: 9 8 7 6 5 0 3 2 1; output: 5.9221 *
5. input: 9 8 7 6 0 4 3 2 1; output: 22.6492 *
6. input: 9 8 7 0 5 4 3 2 1; output: -8.8503 *
7. input: 9 8 0 6 5 4 3 2 1; output: 14.6726 *
8. input: 9 0 7 6 5 4 3 2 1; output: 12.6189
9. input: 0 8 7 6 5 4 3 2 1; output: 12.6189
```

An asterisk marks outputs that differ from the unmodified baseline in line 0, i.e. positions whose zeroing affects the output.
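The probe behind that listing can be reproduced without any deep-learning framework. Below is a minimal NumPy sketch, assuming the block structure implied by `k=2, dilations=(1, 2), receptive_field=7`: two causal convolutions per dilation level (as in a TCN residual block), with random weights standing in for trained ones. Zeroing one input position at a time and comparing against the baseline shows which positions reach the final output:

```python
import numpy as np

rng = np.random.default_rng(0)

def causal_dilated_conv(x, w, d):
    """Causal dilated conv with kernel size 2: y[t] = w[0]*x[t-d] + w[1]*x[t],
    with zero left-padding so the output keeps the input length."""
    pad = np.concatenate([np.zeros(d), x])
    return w[0] * pad[:len(x)] + w[1] * pad[d:d + len(x)]

# Assumption: two convolutions per dilation level, dilations (1, 2).
# Receptive field = 1 + (1 + 1 + 2 + 2) = 7.
dilations = [1, 1, 2, 2]
weights = [rng.normal(size=2) for _ in dilations]

def tcn_last_output(x):
    h = np.asarray(x, dtype=float)
    for w, d in zip(weights, dilations):
        h = causal_dilated_conv(h, w, d)
    return h[-1]  # last time step only

base = [9, 8, 7, 6, 5, 4, 3, 2, 1]
ref = tcn_last_output(base)
for i in range(len(base)):
    probe = list(base)
    probe[i] = 0
    changed = tcn_last_output(probe) != ref
    print(i, "affects output" if changed else "no effect")
```

With these dilations only the last 7 positions (indices 2..8) can influence the final time step, matching the asterisk pattern in the listing: zeroing the first two positions leaves the output unchanged.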
I realized it after implementing a TCN myself and getting the same "bug" ¯\_(ツ)_/¯.
A description of what the bug is
There are two problems with the receptive field (as reported by the `receptive_field` attribute):

A snippet
The test code demonstrates which positions in the input sequence affect the output.
Test code (Google Colab):
Actual result
Output (with comments):
Expected result
Every element of the input sequence covered by the receptive field affects the output.
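For reference, the value in the listing follows from the usual receptive-field formula for a stack of causal dilated convolutions. This is a sketch; the counts of stacks and of convolutions per residual block are my assumptions, not taken from the implementation under test:

```python
def receptive_field(kernel_size, dilations, stacks=1, convs_per_block=2):
    # Each causal conv with dilation d extends the receptive field by (k - 1) * d.
    # Assumption: `convs_per_block` convolutions per dilation level, repeated `stacks` times.
    return 1 + stacks * convs_per_block * (kernel_size - 1) * sum(dilations)

print(receptive_field(2, (1, 2)))  # k=2, dilations (1, 2) -> 7
```

This reproduces `receptive_field=7` from the output above: 1 + 2 * (2 - 1) * (1 + 2) = 7, so exactly the last 7 input positions should affect the output.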
Dependencies