Closed: jdenholm closed this issue 5 months ago
Looking at PR #14 they were never there in the first place...?
If you can provide the line for the layer I'll append it to the nets in the notebooks.
Should it just be a ReLU layer?
I assume you mean Exercise 1, as the other examples use TorchTools and MobileNet?
They were in the original taught version, but maybe they never made it into the first version of the worked solutions?
So a ReLU layer at the end of the net in Exercise 1?
For some reason the non-linear activation functions have been removed from the models in the solutions. The nets may still train without them, but a stack of linear layers with no activations collapses to a single linear map, which is very irregular and, since this is for educational purposes, should be corrected.
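For reference, a minimal sketch of what restoring the activations might look like, assuming a plain PyTorch MLP. The actual Exercise 1 architecture isn't shown in this thread, so the layer sizes here are made up; the point is that each hidden `Linear` layer gets a `ReLU` after it, while the output layer is left as raw logits.

```python
import torch
from torch import nn

# Hypothetical MLP standing in for the Exercise 1 net (sizes invented).
# Each hidden Linear layer is followed by a ReLU; without these, the
# stack of Linear layers would reduce to a single affine transform.
model = nn.Sequential(
    nn.Linear(16, 32),
    nn.ReLU(),          # activation after the first hidden layer
    nn.Linear(32, 32),
    nn.ReLU(),          # activation after the second hidden layer
    nn.Linear(32, 10),  # no activation on the output (logits)
)

batch = torch.randn(4, 16)
print(model(batch).shape)  # torch.Size([4, 10])
```

If the notebooks build the nets with `nn.Sequential` like this, appending the missing `nn.ReLU()` lines between the existing layers should be a one-line-per-layer fix.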