Open fedeloper opened 3 years ago
Very nice! That was good thinking.
I agree it's great to know there's someone else who thinks like you. This can be a curse in disguise though -- since they'll also steal your ideas :) On the other hand, if they do it instead of yourself, then you don't have to do it and can concentrate on other cool things.
I hope this is not the case ahaha. Half of my ideas have already been developed by another scientist; of the other half, all but one are bad ideas. The remaining one could possibly be a good idea. I hope my project idea is that one.
Thinking about today's lesson and the explanation of the Dropout technique: there are some pros, such as keeping overfitting under control via shared nodes, and some cons, such as slower convergence. I wondered whether, instead of switching a node off completely, we could multiply it by a random factor (instead of 0 as in normal Dropout) and so keep the pros while avoiding the cons. After some searching I found this:
Fast dropout training:

![image](https://user-images.githubusercontent.com/11047470/116604590-9d71c200-a92e-11eb-9d9f-76a1a2294d4c.png)
That's exactly what I was thinking of, and as a side effect it also computes faster than normal Dropout! It's great to know that someone else in the world thinks like you!
I recommend reading the paper; it's very interesting.
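To make the idea above concrete, here is a minimal NumPy sketch contrasting standard Bernoulli dropout (hard 0/1 mask) with a multiplicative-noise variant in the spirit of fast/Gaussian dropout, where each activation is scaled by a random factor with mean 1 instead of being zeroed. The choice of Gaussian noise with variance p/(1-p) is my own illustration (it matches the variance Bernoulli dropout induces), not code from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(5)  # example activations of one layer
p = 0.5                     # dropout probability

# Standard (Bernoulli) dropout: zero each unit with probability p,
# then rescale the survivors so the expected activation is unchanged.
mask = rng.random(x.shape) >= p
standard = x * mask / (1.0 - p)

# Multiplicative-noise variant: multiply each unit by a random factor
# with mean 1 instead of a hard 0/1 mask. Gaussian noise with variance
# p / (1 - p) matches the variance that Bernoulli dropout induces,
# but no unit is ever switched off completely.
sigma = np.sqrt(p / (1.0 - p))
noisy = x * rng.normal(loc=1.0, scale=sigma, size=x.shape)
```

Both versions leave the expected activation equal to `x`; the difference is that the Gaussian variant injects smooth noise rather than abruptly removing units, which is the intuition behind the faster, smoother convergence claimed above.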