greenelab / pancancer-evaluation

Evaluating genome-wide prediction of driver mutations using pan-cancer data
BSD 3-Clause "New" or "Revised" License

Neural network parameter exploration experiments #77

Closed: jjc2718 closed this 1 year ago

jjc2718 commented 1 year ago

Following up on #76, since early stopping didn't really seem to work as a form of regularization, we wanted to explore some different ways to regularize neural networks that we could use to study generalization.

We tried varying the hidden layer size, the amount of weight decay, and the amount of dropout. For the most part, the neural network results look similar to the LASSO results when there's a large amount of regularization, but the neural networks don't overfit in the same way the LASSO models did when there's a small amount of regularization. Here are the curves for KRAS (we also looked at EGFR, and the results are similar):

[Figures: KRAS performance curves while varying hidden layer size, weight decay, and dropout]
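For context, here's a minimal PyTorch sketch of how these three knobs enter a model of this kind. This is illustrative only, not the experiment code: the class name, feature count, and hyperparameter values are placeholders.

```python
# Minimal sketch (not the repository's code) of the three regularization
# knobs described above: hidden layer size, dropout, and weight decay.
import torch
import torch.nn as nn

class SimpleMLP(nn.Module):
    """One-hidden-layer classifier; hidden_size and dropout are network-level knobs."""
    def __init__(self, n_features, hidden_size=64, dropout=0.5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, hidden_size),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden_size, 1),  # single logit for mutated vs. not mutated
        )

    def forward(self, x):
        return self.net(x)

# Weight decay is applied through the optimizer; larger values mean
# stronger L2-style shrinkage of the weights.
model = SimpleMLP(n_features=8000, hidden_size=64, dropout=0.5)  # feature count is a placeholder
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
criterion = nn.BCEWithLogitsLoss()
```

Sweeping `hidden_size`, `dropout`, and `weight_decay` over a grid then gives regularization paths analogous to the LASSO parameter sweep in #76.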

review-notebook-app[bot] commented 1 year ago

Check out this pull request on ReviewNB

See visual diffs & provide feedback on Jupyter Notebooks.

