Closed — laoreja closed this issue 8 years ago
Why is there a ReLU layer after the fc2 layer in the 1_F neural network? Traditionally, there is no ReLU layer after the final fully connected layer.
Yes, you are right, it's not necessary. It's kind of strange to add that ReLU layer :joy: I added it only because my targets y (the landmark positions) are all positive.
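A minimal sketch in plain Python of the reasoning above (the network name and values here are hypothetical, not from the actual 1_F code): when all ground-truth landmark coordinates are nonnegative, a final ReLU leaves correct predictions untouched and only clamps (invalid) negative outputs to zero.

```python
def relu(xs):
    # Element-wise ReLU: clamp negative values to zero.
    return [max(0.0, x) for x in xs]

# Hypothetical raw outputs of the final fully connected layer (fc2),
# interpreted as predicted landmark coordinates in pixels.
fc2_out = [12.3, 45.0, -0.7, 88.1]

clamped = relu(fc2_out)
# Positive predictions pass through unchanged; only the negative
# coordinate (which can never match a positive target) is zeroed.
print(clamped)  # [12.3, 45.0, 0.0, 88.1]
```

So the ReLU is harmless for this regression target, but, as noted above, it is also unnecessary, since the loss already pushes outputs toward the positive targets.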