erodola / DLAI-s2-2021

Teaching material for the course of Deep Learning and Applied AI, 2nd semester 2021, Sapienza University of Rome

Gaussian DropOut #20

Open · fedeloper opened this issue 3 years ago

fedeloper commented 3 years ago

Thinking about today's lesson and the explanation of the dropout technique: there are some pros, such as keeping overfitting under control via the implicit ensemble of shared-weight sub-networks, and some cons, such as slower convergence. I wondered: if, instead of switching a node off completely, we multiply its output by a random factor (instead of 0 as in standard dropout), could we keep the pros and avoid the cons? After some searching I found this:

*Fast dropout training* (Wang & Manning, ICML 2013)

That's exactly what I was thinking about, and as a side effect it is also computationally faster than standard dropout! It's great to know that someone else in the world thinks like you!

I recommend reading the paper; it's very interesting. A small sketch of the idea follows below.
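For concreteness, here is a minimal PyTorch sketch of the multiplicative-noise idea. The `GaussianDropout` module and its interface are my own illustration, not code from the paper; the noise variance `alpha = p / (1 - p)` is the choice that matches the variance of standard Bernoulli dropout (as in Srivastava et al., 2014):

```python
import torch
import torch.nn as nn


class GaussianDropout(nn.Module):
    """Multiplicative Gaussian-noise dropout (illustrative sketch).

    Instead of zeroing each activation with probability p, multiply it by
    noise drawn from N(1, alpha) with alpha = p / (1 - p), matching the
    variance of standard Bernoulli dropout.
    """

    def __init__(self, p: float = 0.5):
        super().__init__()
        assert 0.0 <= p < 1.0, "p must be in [0, 1)"
        self.alpha = p / (1.0 - p)  # variance of the multiplicative noise

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The noise has mean 1, so the expected output is already x and,
        # unlike inverted Bernoulli dropout, no rescaling is needed at eval.
        if not self.training or self.alpha == 0.0:
            return x
        noise = 1.0 + self.alpha ** 0.5 * torch.randn_like(x)
        return x * noise


# Quick check: noisy in training mode, identity in eval mode.
layer = GaussianDropout(p=0.5)
x = torch.randn(4, 8)
layer.train()
print((layer(x) - x).abs().mean())  # > 0: noise applied
layer.eval()
print((layer(x) - x).abs().mean())  # 0: identity at inference
```

One design note: because every node stays active (just perturbed), the gradient still flows through all units at every step, which is one intuition for why convergence can be less affected than with hard zeroing.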

erodola commented 3 years ago

Very nice! That was good thinking.

I agree it's great to know there's someone else who thinks like you. This can be a curse in disguise though -- since they'll also steal your ideas :) On the other hand, if they do it instead of you, then you don't have to, and you can concentrate on other cool things.

fedeloper commented 3 years ago

I hope that's not the case ahaha. Half of one's ideas have already been developed by some other scientist, and of the other half, all but one are bad ideas. The remaining one could possibly be a good idea. I hope my project idea is that one.