# Machine Learning
## Contents
- [Preface](#preparation)
- [Course List](#curriculum)
- [Recommended Learning Route](#learning_route)
- [Basic Mathematics (Beginner)](#math_basic)
- [Programming Skills](#programming_basic)
- [Machine Learning Courses (Beginner)](#machine_learning_basic)
- [Intermediate Mathematics](#math_median)
- [Machine Learning Courses (Intermediate)](#machine_learning_median)
- [Recommended Books](#booklists)
- [Specialized Topics in Machine Learning](#special_learning)
- [Acknowledgements](#many_thanks)
## Preface
For each course we expect you to read and understand all of the Notes, the Slides, and the papers the instructor strongly recommends, and to complete all of the assigned exercises. The recommended books are optional; where a book must be read in full, we say so explicitly.
## Course List

Course | Institution | Textbook | Notes and other materials
:-- | :--: | :--: | :--:
[Single Variable Calculus](http://open.163.com/movie/2006/8/M/L/M6GLI5A07_M6GLJH1ML.html) | MIT | [Calculus with Analytic Geometry](https://www.amazon.com/exec/obidos/ASIN/0070576424/ref=nosim/mitopencourse-20) | [Link](https://ocw.mit.edu/courses/mathematics/18-01-single-variable-calculus-fall-2006/)
[Multivariable Calculus](http://open.163.com/special/opencourse/multivariable.html) | MIT | [Multivariable Calculus](https://www.amazon.com/exec/obidos/ASIN/0130339679/ref=nosim/mitopencourse-20) | [Link](https://ocw.mit.edu/courses/mathematics/18-02-multivariable-calculus-fall-2007/)
[Linear Algebra](http://open.163.com/special/opencourse/daishu.html) | MIT | [Introduction to Linear Algebra](http://math.mit.edu/~gs/linearalgebra/) | [Link](https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/study-materials/)
[Introduction to Statistics](http://open.163.com/movie/2011/6/6/0/M82IC6GQU_M83J9IK60.html) | Khan Academy | N/A | N/A
Introduction to Probability: [Link 1](http://mooc.guokr.com/course/461/%E6%A9%9F%E7%8E%87/), [Link 2](https://www.youtube.com/watch?v=GwSEguqJj6U&index=1&list=PLtvno3VRDR_jMAJcNY1n4pnP5kXtPOmVk) | NTU | N/A | N/A
[Probability and Statistics](https://www.youtube.com/watch?v=j9WZyLZCBzs&list=PLQ3khvAsNhargDx0dG1cQXOrA2u3JsFKc) | MIT | [Introduction to Probability](https://www.amazon.com/exec/obidos/ASIN/188652923X/ref=nosim/mitopencourse-20) | [Link](https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-041-probabilistic-systems-analysis-and-applied-probability-fall-2010/tutorials/)
Matrix Theory | N/A | [矩阵论 (Matrix Theory)](https://www.amazon.cn/%E7%9F%A9%E9%98%B5%E8%AE%BA-%E6%88%B4%E5%8D%8E/dp/B00116BRO0/ref=sr_1_1?s=books&ie=UTF8&qid=1478614198&sr=1-1&keywords=%E6%88%B4%E5%8D%8E%EF%BC%8C+%E7%9F%A9%E9%98%B5%E8%AE%BA) | N/A
[Convex Optimization I](https://lagunita.stanford.edu/courses/Engineering/CVX101/Winter2014/about) | Stanford | [Convex Optimization](http://www.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf) | [Link](http://stanford.edu/class/ee364a/index.html)
[Convex Optimization II](https://www.youtube.com/watch?v=U3lJAObbMFI&list=PL3940DD956CDF0622&index=20) | Stanford | N/A | [Link](http://stanford.edu/class/ee364b/)
[Introduction to Statistical Learning](https://lagunita.stanford.edu/courses/HumanitiesSciences/StatLearning/Winter2016/about) | Stanford | [An Introduction to Statistical Learning](http://www-bcf.usc.edu/~gareth/ISL/) | [Link](https://lagunita.stanford.edu/courses/HumanitiesSciences/StatLearning/Winter2016/about)
[Machine Learning Foundations](https://www.coursera.org/instructor/htlin) | NTU | [Learning from Data](https://www.amazon.com/gp/product/1600490069) | [Link](https://www.csie.ntu.edu.tw/~htlin/course/mlfound16fall/)
[Machine Learning Techniques](https://www.coursera.org/instructor/htlin) | NTU | N/A | [Link](https://www.csie.ntu.edu.tw/~htlin/course/ml15fall/)
[Machine Learning](https://www.youtube.com/watch?v=mbyG85GZ0PI&index=1&list=PLD63A284B7615313A) | Caltech | [Learning from Data](https://www.amazon.com/gp/product/1600490069) | [Link](http://work.caltech.edu/lectures.html)
[Machine Learning (MATLAB)](http://open.163.com/movie/2008/1/M/C/M6SGF6VB4_M6SGHFBMC.html) | Stanford | N/A | [Link](http://cs229.stanford.edu/materials.html)
Python Programming | N/A | N/A | N/A
MATLAB Programming | N/A | N/A | N/A
## Recommended Learning Route
### Basic Mathematics (Beginner)

Course | Institution | Textbook | Notes and other materials
:-- | :--: | :--: | :--:
[Single Variable Calculus](http://open.163.com/movie/2006/8/M/L/M6GLI5A07_M6GLJH1ML.html) | MIT | [Calculus with Analytic Geometry](https://www.amazon.com/exec/obidos/ASIN/0070576424/ref=nosim/mitopencourse-20) | [Link](https://ocw.mit.edu/courses/mathematics/18-01-single-variable-calculus-fall-2006/)
[Multivariable Calculus](http://open.163.com/special/opencourse/multivariable.html) | MIT | [Multivariable Calculus](https://www.amazon.com/exec/obidos/ASIN/0130339679/ref=nosim/mitopencourse-20) | [Link](https://ocw.mit.edu/courses/mathematics/18-02-multivariable-calculus-fall-2007/)
[Linear Algebra](http://open.163.com/special/opencourse/daishu.html) | MIT | [Introduction to Linear Algebra](http://math.mit.edu/~gs/linearalgebra/) | [Link](https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/study-materials/)
[Introduction to Statistics](http://open.163.com/movie/2011/6/6/0/M82IC6GQU_M83J9IK60.html) | Khan Academy | N/A | N/A
Introduction to Probability: [Link 1](http://mooc.guokr.com/course/461/%E6%A9%9F%E7%8E%87/), [Link 2](https://www.youtube.com/watch?v=GwSEguqJj6U&index=1&list=PLtvno3VRDR_jMAJcNY1n4pnP5kXtPOmVk) | NTU | N/A | N/A
[Probability and Statistics](https://www.youtube.com/watch?v=j9WZyLZCBzs&list=PLQ3khvAsNhargDx0dG1cQXOrA2u3JsFKc) | MIT | [Introduction to Probability](https://www.amazon.com/exec/obidos/ASIN/188652923X/ref=nosim/mitopencourse-20) | [Link](https://ocw.mit.edu/courses/electrical-engineering-and-computer-science/6-041-probabilistic-systems-analysis-and-applied-probability-fall-2010/tutorials/)
### Programming Skills
Since the core of machine learning lies in its mathematical principles and algorithmic ideas, programming mainly serves to help you complete the homework and implement your own ideas. We therefore only give a recommended reference tutorial for each language; it is enough to become comfortable with a few commonly used modules, i.e. to work through the linked material. The recommended books are classics but are not required. A minimal sketch of the kind of NumPy fluency this implies is shown after the table.

Course | Reference tutorial | Recommended books
:-- | :--: | :--:
Python Programming | [Link](http://cs231n.github.io/python-numpy-tutorial/) | N/A
MATLAB Programming | N/A | N/A
R Programming | N/A | N/A
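
To make "a few commonly used modules" concrete, here is a minimal, hedged sketch of the kind of NumPy usage the Python/NumPy tutorial linked above covers (array creation, broadcasting, vectorized arithmetic). The data and the small least-squares fit are purely illustrative and are not part of any course assignment.

```python
import numpy as np

# Illustrative only: NumPy idioms of the kind covered in the CS231n
# Python/NumPy tutorial -- array creation, broadcasting, vectorized math.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))                 # 100 samples, 3 features (made-up data)
true_w = np.array([1.5, -2.0, 0.5])           # hypothetical ground-truth weights
y = X @ true_w + 0.1 * rng.normal(size=100)   # targets with a little noise

# Broadcasting: standardize each feature column without explicit loops.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)

# Vectorized least-squares fit via the normal equations.
w_hat = np.linalg.solve(X_std.T @ X_std, X_std.T @ y)
print("estimated weights:", w_hat)
```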
### Machine Learning Courses (Beginner)

Course | Institution | Textbook | Notes and other materials
:-- | :--: | :--: | :--:
[Introduction to Statistical Learning](https://lagunita.stanford.edu/courses/HumanitiesSciences/StatLearning/Winter2016/about) | Stanford | [An Introduction to Statistical Learning](http://www-bcf.usc.edu/~gareth/ISL/) | [Link](https://lagunita.stanford.edu/courses/HumanitiesSciences/StatLearning/Winter2016/about)
[Introduction to Machine Learning](https://www.coursera.org/learn/machine-learning) | Coursera | N/A | [Link](https://www.coursera.org/learn/machine-learning)
### Intermediate Mathematics

Course | Institution | Textbook | Notes and other materials
:-- | :--: | :--: | :--:
Matrix Theory | N/A | [矩阵论 (Matrix Theory)](https://www.amazon.cn/%E7%9F%A9%E9%98%B5%E8%AE%BA-%E6%88%B4%E5%8D%8E/dp/B00116BRO0/ref=sr_1_1?s=books&ie=UTF8&qid=1478614198&sr=1-1&keywords=%E6%88%B4%E5%8D%8E%EF%BC%8C+%E7%9F%A9%E9%98%B5%E8%AE%BA) | N/A
[Convex Optimization I](https://lagunita.stanford.edu/courses/Engineering/CVX101/Winter2014/about) | Stanford | [Convex Optimization](http://www.stanford.edu/~boyd/cvxbook/bv_cvxbook.pdf) | [Link](http://stanford.edu/class/ee364a/index.html)
[Convex Optimization II](https://www.youtube.com/watch?v=U3lJAObbMFI&list=PL3940DD956CDF0622&index=20) | Stanford | N/A | [Link](http://stanford.edu/class/ee364b/)

The following survey is required reading:
- [Convex Optimization: Algorithms and Complexity](https://link.zhihu.com/?target=http%3A//arxiv.org/abs/1405.4980)
### Machine Learning Courses (Intermediate)
The NTU and Caltech courses below are taught by the two co-authors of *Learning from Data*, so you only need to complete one of the two tracks. Note: if you choose the NTU track, **both NTU courses, "Machine Learning Foundations" and "Machine Learning Techniques", must be completed**.

Course | Institution | Textbook | Notes and other materials
:-- | :--: | :--: | :--:
[Machine Learning Foundations](https://www.coursera.org/instructor/htlin) | NTU | [Learning from Data](https://www.amazon.com/gp/product/1600490069) | [Link](https://www.csie.ntu.edu.tw/~htlin/course/mlfound16fall/)
[Machine Learning Techniques](https://www.coursera.org/instructor/htlin) | NTU | N/A | [Link](https://www.csie.ntu.edu.tw/~htlin/course/ml15fall/)
[Machine Learning](http://open.163.com/movie/2008/1/M/C/M6SGF6VB4_M6SGHFBMC.html) | Stanford | N/A | [Link](http://cs229.stanford.edu/materials.html)
[Machine Learning](https://www.youtube.com/watch?v=mbyG85GZ0PI&index=1&list=PLD63A284B7615313A) | Caltech | [Learning from Data](https://www.amazon.com/gp/product/1600490069) | [Link](http://work.caltech.edu/lectures.html)
## Recommended Books
The books below are widely regarded as excellent machine learning texts. We recommend **reading at least one of the moderate-difficulty books carefully, ideally two**; the harder books are not required and can be savored whenever you have spare capacity.

Title | Difficulty
:-- | :--:
[统计学习方法 (Statistical Learning Methods)](https://www.amazon.cn/%E7%BB%9F%E8%AE%A1%E5%AD%A6%E4%B9%A0%E6%96%B9%E6%B3%95-%E6%9D%8E%E8%88%AA/dp/B007TSFMTA) | Moderate
[An Introduction to Statistical Learning](http://www-bcf.usc.edu/~gareth/ISL/) | Moderate
[Machine Learning](https://www.amazon.com/gp/product/0071154671?ie=UTF8&tag=jefork-20&linkCode=as2&camp=1789&creative=9325&creativeASIN=0071154671) | Moderate
[Learning from Data](https://www.amazon.com/gp/product/1600490069) | Moderate, [companion lectures](https://work.caltech.edu/telecourse.html)
[Pattern Recognition and Machine Learning](https://www.amazon.com/Pattern-Recognition-Learning-Information-Statistics/dp/0387310738/ref=pd_sim_14_1?ie=UTF8&dpID=61f0EXfMRvL&dpSrc=sims&preST=_AC_UL160_SR118%2C160_&refRID=119X50P5F0DFA339S9DR) | Harder (Bayesian-leaning), [companion lectures](http://cs.brown.edu/courses/csci1420/lectures.html)
[The Elements of Statistical Learning](https://www.amazon.com/The-Elements-Statistical-Learning-Prediction/dp/0387848576/ref=pd_sim_14_2?ie=UTF8&dpID=41LeU3HcBdL&dpSrc=sims&preST=_AC_UL160_SR103%2C160_&refRID=119X50P5F0DFA339S9DR) | Harder
[Understanding Machine Learning: From Theory to Algorithms](http://www.cs.huji.ac.il/~shais/UnderstandingMachineLearning/understanding-machine-learning-theory-algorithms.pdf) | Harder
[Machine Learning: A Probabilistic Perspective](https://www.amazon.com/Machine-Learning-Probabilistic-Perspective-Computation/dp/0262018020) | Harder
## Specialized Topics in Machine Learning
If you have completed all of the courses above, congratulations: you now have a very solid foundation in machine learning and can move fairly smoothly into deeper study of one of the specialized areas below. Not every area has corresponding courses, books, or other learning material, so we only list resources for the areas we know of; the list is far from exhaustive, and many areas remain to be filled in (contributions are welcome). If one of these areas suits you, keep at it! If you are unsure, browse the top conferences and journals listed below to find a topic that interests you.
### Resources for Selected Areas
- [Deep Learning](https://github.com/JustFollowUs/Deep-Learning)
- [Probabilistic Graphical Models](https://github.com/JustFollowUs/Probabilistic-graphical-models)
- [Reinforcement Learning](https://github.com/JustFollowUs/Reinforcement-Learning)
- [Hashing](http://cs.nju.edu.cn/lwj/L2H.html)
- [Theoretical Machine Learning](https://github.com/JustFollowUs/Theoretical-Machine-Learning/)
- Others (to be added)
### Conferences and Journals
- [NIPS](https://nips.cc/)
- [ICML](http://icml.cc/)
- [AAAI](http://www.aaai.org/)
- [IJCAI](http://www.ijcai.org/)
- [KDD](http://www.kdd.org/)
- [ICDM](http://www.cs.uvm.edu/~icdm/)
- [COLT](http://www.learningtheory.org/)
- Others (to be added)
## Acknowledgements
We thank Dr. Yang Yang of the LAMDA Group at Nanjing University for his advice and for sharing materials.