Collecting different machine learning algorithms here for future revision.
Maximum likelihood: find the Θ that maximizes P(X|Θ), i.e. Θ̂ = argmax_Θ P(X|Θ).
Taking the log, i.e. maximizing log P(X|Θ) instead, is a useful way to get rid of the exponential component for things like the Gaussian distribution, and it gives the same Θ̂.
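A minimal sketch of what the log trick buys you, assuming i.i.d. Gaussian data (the sample and variable names below are made up for illustration; for the Gaussian the maximizer also has a closed form):

```python
import numpy as np

# Hypothetical data; in practice X would be your observed sample.
rng = np.random.default_rng(0)
X = rng.normal(loc=2.0, scale=1.5, size=1000)

def gaussian_log_likelihood(X, mu, sigma):
    """log P(X | mu, sigma) for i.i.d. Gaussian data.
    Taking the log turns the product of exponentials into a simple sum."""
    n = len(X)
    return (-0.5 * n * np.log(2 * np.pi * sigma**2)
            - np.sum((X - mu) ** 2) / (2 * sigma**2))

# For the Gaussian the MLE is the sample mean and the (biased, ddof=0)
# sample standard deviation.
mu_hat = X.mean()
sigma_hat = X.std()

print(mu_hat, sigma_hat, gaussian_log_likelihood(X, mu_hat, sigma_hat))
```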
I learnt this from Carl's machine learning course in 2018-2019, and now in 2020 I just want to get a deeper understanding. It is easy to forget things if you don't use them, but I still need to find a way to recap them quickly. (I reckon I should use Emacs to edit these notes now, but I feel the Emacs editor is not really well suited to this job?)
paper to read - from Zhihu
I was once really familiar with these two estimators, but I need to do some revision now.
MME (method of moments estimation)
MLE (maximum likelihood estimation)
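A small sketch contrasting the two, assuming Uniform(0, θ) data, a classic case where the estimators actually disagree (the sample and names below are illustrative):

```python
import numpy as np

# Hypothetical example: estimate theta for Uniform(0, theta) data.
rng = np.random.default_rng(0)
theta_true = 5.0
X = rng.uniform(0, theta_true, size=200)

# MME: match the first moment. E[X] = theta / 2, so theta_hat = 2 * sample mean.
theta_mme = 2 * X.mean()

# MLE: the likelihood is (1/theta)^n for theta >= max(X), so it is
# maximized at the sample maximum.
theta_mle = X.max()

print(f"MME estimate: {theta_mme:.3f}, MLE estimate: {theta_mle:.3f}")
```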
I learned about test statistics from Patrick's anomaly detection course [notes]. At the beginning, I had no idea what they were. Thanks to a course from Taiwan (劉惠美, 數理統計學 / Mathematical Statistics), which can be found here. In the first lecture, the instructor said that statistics has two core concepts: point estimation and hypothesis testing.
rejection region
example: testing whether two samples have different variances (see the sketch below)
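A hedged sketch of such a test with an explicit rejection region, assuming a two-sided F-test on two hypothetical normal samples (not taken from either course's materials):

```python
import numpy as np
from scipy import stats

# Hypothetical samples; the question is whether their variances differ.
rng = np.random.default_rng(0)
x = rng.normal(0, 1.0, size=30)
y = rng.normal(0, 1.5, size=40)

# Test statistic: ratio of sample variances, F-distributed under H0 (equal variances).
F = x.var(ddof=1) / y.var(ddof=1)
df1, df2 = len(x) - 1, len(y) - 1

# Two-sided rejection region at level alpha: reject H0 if F falls outside
# the central (1 - alpha) region of the F(df1, df2) distribution.
alpha = 0.05
lower = stats.f.ppf(alpha / 2, df1, df2)
upper = stats.f.ppf(1 - alpha / 2, df1, df2)

print(f"F = {F:.3f}, rejection region: F < {lower:.3f} or F > {upper:.3f}")
```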
This section will cover things I don't really understand yet; hopefully I can add as much as possible and eventually fit them into the regular sections.
When I did my interview at Oxford University, they asked me some questions that I could not answer (which means that I still have too much to learn!).
TODO: learn & update the VC dimension part (due by: 25th Aug)
TODO: add things