mparker2 / mparker_phd_thesis

PhD Thesis

Ims chapter 3 #2

Closed · IanSudbery closed this issue 6 years ago

IanSudbery commented 6 years ago

Comments for chapter 3.

Again, the language needs almost nothing doing to it, at least not that I can see. What is mostly missing is an introduction to neural nets, particularly convolutional ones.

What does it mean for a net to be "recurrent convolutional"? Why did you choose a recurrent convolutional net? Why did you choose the particular architecture you did?

mparker2 commented 6 years ago

Hi Ian,

Thank you for the review. I will add a technical introduction to neural networks and convolutional/recurrent layers. I am also going to add some background on how these types of models have previously been used in sequence-based predictions (transcription factor binding sites etc.), at Karim's suggestion.

What, if any, changes do you think I should be making to the text in order to try and make this chapter into a paper? I was mainly thinking of culling a few bits from the results (the stuff about G-register, loop length and the overlap between G4 sequences in mm10 and hg19).

mparker2 commented 6 years ago

I think my previous comment got lost somewhere amongst the automatic commit messages.

I think I've addressed most of the major comments. I have added some info on the use of ML in biology (though it's far from comprehensive) to the intro, and an explanation of how CNNs/RNNs work to the results. I think I have also made it clearer where I got the architecture from.
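Roughly, the idea behind a "recurrent convolutional" model is that convolutional layers scan the one-hot encoded sequence for short motifs (similar in spirit to position weight matrices), and a recurrent layer then models longer-range dependencies between those motifs. A minimal Keras-style sketch of that kind of architecture (illustrative layer sizes only, not the hyperparameters actually used in the chapter) would be something like:

```python
# Minimal sketch of a "recurrent convolutional" network for classifying
# fixed-length, one-hot encoded DNA sequences. Layer sizes and
# hyperparameters are illustrative assumptions, not the thesis model.
from tensorflow.keras import layers, models


def build_recurrent_cnn(seq_len=200, n_bases=4):
    model = models.Sequential([
        # Convolution scans the sequence for short motifs.
        layers.Conv1D(32, kernel_size=8, activation='relu',
                      input_shape=(seq_len, n_bases)),
        layers.MaxPooling1D(pool_size=4),
        layers.Dropout(0.25),
        # Bidirectional recurrent layer captures longer-range dependencies
        # between the motifs detected by the convolution.
        layers.Bidirectional(layers.GRU(16)),
        # Single sigmoid output for a binary label
        # (e.g. G4-forming vs non-G4 sequence).
        layers.Dense(1, activation='sigmoid'),
    ])
    model.compile(optimizer='adam', loss='binary_crossentropy',
                  metrics=['accuracy'])
    return model


model = build_recurrent_cnn()
model.summary()
```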

I'll merge what I've got into the master branch now but I'll leave this branch open in case you have any more comments.

What, if any, changes do you think I should be making to the text in order to try and make this chapter into a paper? I was mainly thinking of culling a few bits from the results (the stuff about G-register, loop length and the overlap between G4 sequences in mm10 and hg19).

Thanks,
Matt