uchicago-computation-workshop / nicolas_masse

Repository for Nicolas Masse's presentation at the CSS Workshop (1/13/2019)

Intentional forgetting, and hardware for the study of artificial neural networks #12

Open SiyuanPengMike opened 5 years ago

SiyuanPengMike commented 5 years ago

Thanks for this interesting topic!

While learning about these cool methods for alleviating catastrophic forgetting, I have a question: what if we discover that some earlier training data was actually wrong, and we want to update the task skill by forgetting the old, incorrect examples and learning new, correct ones? We clearly benefit from the fact that the skills for these tasks are not independent and share attributes and weights; however, is there an easy way to update this mixture by removing wrong data that has already been incorporated?
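To make my question concrete, here is a toy sketch of the naive approach I have in mind (everything below is made up for illustration, not from the paper): train on data containing one mislabeled example, then correct that label and simply continue training from the current weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.5, steps=200):
    """Plain gradient descent on the logistic loss, starting from weights w."""
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))      # sigmoid predictions
        w = w - lr * X.T @ (p - y) / len(y)   # gradient step on mean logistic loss
    return w

# Two clusters; sample 0 is accidentally given the wrong label.
X = np.vstack([rng.normal(-2, 1, (20, 2)), rng.normal(2, 1, (20, 2))])
y = np.array([1.0] + [0.0] * 19 + [1.0] * 20)   # first label is wrong

w = train(np.zeros(2), X, y)

# Naive "unlearning": fix the bad label and continue training from the
# current weights instead of retraining from scratch.
y_fixed = y.copy()
y_fixed[0] = 0.0
w_fixed = train(w, X, y_fixed)
```

My worry is that when tasks share weights (as in the methods from the talk), this kind of continued fine-tuning might disturb the other tasks' skills, which is why I am asking whether there is a principled way to do it.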

Also, we all know that the study of artificial neural networks requires GPUs with large memory capacity. Could you give us a brief explanation of why GPUs outperform CPUs in this particular kind of work?
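My rough understanding (please correct me if this is wrong) is that training is dominated by dense matrix multiplications, and each entry of the output is an independent dot product, so thousands of GPU cores can compute them in parallel while a CPU has only a handful of cores. A small NumPy sketch of the operation I mean (the layer sizes are arbitrary):

```python
import numpy as np

# One dense layer's forward pass is a single large matrix multiply.
# Sizes below are invented for illustration.
batch, n_in, n_out = 256, 1024, 1024
X = np.random.randn(batch, n_in).astype(np.float32)   # input activations
W = np.random.randn(n_in, n_out).astype(np.float32)   # layer weights

# Each of the batch * n_out output entries is an independent dot
# product of length n_in -- exactly the embarrassingly parallel
# workload that GPU hardware is built for.
H = np.maximum(X @ W, 0.0)   # linear layer followed by ReLU

print(f"{2 * batch * n_in * n_out / 1e9:.2f} GFLOPs for one layer")
```

Is that the main reason, or does memory bandwidth also play a large role here?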

Thanks again for your great paper!