Outline of machine learning

The following outline is provided as an overview of and topical guide to machine learning.


Machine learning – subfield of computer science that evolved from the study of pattern recognition and computational learning theory in artificial intelligence.[1] In 1959, Arthur Samuel defined machine learning as a "field of study that gives computers the ability to learn without being explicitly programmed".[2] Machine learning explores the study and construction of algorithms that can learn from and make predictions on data.[3] Such algorithms operate by building a model from an example training set of input observations in order to make data-driven predictions or decisions expressed as outputs, rather than following strictly static program instructions.

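The distinction between "learning from data" and "following static program instructions" can be made concrete with a minimal sketch: a 1-nearest-neighbour classifier whose model is nothing more than its training examples. The data and labels below are invented purely for illustration.

```python
# A minimal sketch of learning from data: predictions come from
# stored example observations, not from hand-written rules.

def predict(training_set, x):
    """Return the label of the training point closest to x."""
    nearest = min(training_set, key=lambda pair: abs(pair[0] - x))
    return nearest[1]

# Example training set: (input, label) pairs.
examples = [(1.0, "small"), (2.0, "small"), (8.0, "large"), (9.0, "large")]

print(predict(examples, 1.5))  # -> small
print(predict(examples, 8.5))  # -> large
```

Changing the training set changes the predictions without touching the code, which is the sense in which the behaviour is "learned" rather than explicitly programmed.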

1 What type of thing is machine learning?

2 Branches of machine learning

2.1 Subfields of machine learning

2.2 Cross-disciplinary fields involving machine learning

3 Applications of machine learning

4 Machine learning hardware

5 Machine learning tools

5.1 Machine learning frameworks

Proprietary machine learning frameworks:
- Amazon Machine Learning
- Microsoft Azure Machine Learning Studio
- DistBelief – replaced by TensorFlow

Open source machine learning frameworks:
- Apache Singa
- Caffe
- H2O
- PyTorch
- mlpack
- TensorFlow
- Torch
- CNTK
- Accord.Net

5.2 Machine learning libraries

5.3 Machine learning algorithms

6 Machine learning methods

6.1 Dimensionality reduction

6.2 Ensemble learning

6.3 Meta learning

6.4 Reinforcement learning

6.5 Supervised learning

Supervised learning:
- AODE
- Artificial neural network
- Association rule learning algorithms
- Apriori algorithm
- Eclat algorithm
- Case-based reasoning
- Gaussian process regression
- Gene expression programming
- Group method of data handling (GMDH)
- Inductive logic programming
- Instance-based learning
- Lazy learning
- Learning automata
- Learning vector quantization
- Logistic model tree
- Minimum message length (decision trees, decision graphs, etc.)
- Nearest neighbor algorithm
- Analogical modeling
- Probably approximately correct (PAC) learning
- Ripple-down rules, a knowledge acquisition methodology
- Symbolic machine learning algorithms
- Support vector machines
- Random forests
- Ensembles of classifiers
- Bootstrap aggregating (bagging)
- Boosting (meta-algorithm)
- Ordinal classification
- Information fuzzy networks (IFN)
- Conditional random field
- ANOVA
- Quadratic classifiers
- k-nearest neighbor
- Boosting
- SPRINT
- Bayesian networks
- Naive Bayes
- Hidden Markov models
- Hierarchical hidden Markov model

Bayesian statistics:
- Bayesian knowledge base
- Naive Bayes
- Gaussian naive Bayes
- Multinomial naive Bayes
- Averaged one-dependence estimators (AODE)
- Bayesian belief network (BBN)
- Bayesian network (BN)

Decision tree algorithms:
- Decision tree
- Classification and regression tree (CART)
- Iterative Dichotomiser 3 (ID3)
- C4.5 algorithm
- C5.0 algorithm
- Chi-squared automatic interaction detection (CHAID)
- Decision stump
- Conditional decision tree
- ID3 algorithm
- Random forest
- SLIQ

Linear classifiers:
- Fisher's linear discriminant
- Linear regression
- Logistic regression
- Multinomial logistic regression
- Naive Bayes classifier
- Perceptron
- Support vector machine
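One of the simplest algorithms in the list above, the perceptron, illustrates what supervised learning means in practice: weights are nudged toward every misclassified labelled example until the classifier fits the training data. The toy AND dataset and hyperparameters below are invented for illustration; this is a sketch, not a production implementation.

```python
# A minimal perceptron: learn a linear decision rule from
# labelled (input, target) pairs.

def train_perceptron(data, epochs=20, lr=0.1):
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in data:
            pred = 1 if w0 * x0 + w1 * x1 + b > 0 else 0
            error = target - pred          # 0 when the example is correct
            w0 += lr * error * x0          # move weights toward the target
            w1 += lr * error * x1
            b += lr * error
    return w0, w1, b

# Toy training set: the logical AND function, which is linearly separable.
AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w0, w1, b = train_perceptron(AND)
predictions = [1 if w0 * x0 + w1 * x1 + b > 0 else 0 for (x0, x1), _ in AND]
print(predictions)  # -> [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees the loop reaches zero training error; on non-separable data (e.g. XOR) it would cycle forever, which motivates the more powerful methods listed above.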

6.6 Unsupervised learning

Unsupervised learning:
- Expectation–maximization algorithm
- Vector quantization
- Generative topographic map
- Information bottleneck method

Artificial neural networks:
- Feedforward neural network
- Extreme learning machine
- Convolutional neural network
- Recurrent neural network
- Long short-term memory (LSTM)
- Logic learning machine
- Self-organizing map

Association rule learning:
- Apriori algorithm
- Eclat algorithm
- FP-growth algorithm

Hierarchical clustering:
- Single-linkage clustering
- Conceptual clustering

Cluster analysis:
- BIRCH
- DBSCAN
- Expectation–maximization (EM)
- Fuzzy clustering
- Hierarchical clustering
- K-means clustering
- K-medians
- Mean-shift
- OPTICS algorithm

Anomaly detection:
- k-nearest neighbors classification (k-NN)
- Local outlier factor
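K-means clustering, listed above, is a compact example of the unsupervised setting: no labels are given, and structure is discovered from the inputs alone by alternating an assignment step and a centroid-update step. The 1-D points below are invented for illustration, and the initialisation is deliberately naive; this is a sketch of the idea, not a robust implementation.

```python
# A minimal 1-D k-means sketch: assign each point to its nearest
# centroid, then move each centroid to the mean of its cluster.

def kmeans_1d(points, k=2, iterations=10):
    centroids = list(points[:k])           # naive initialisation
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:                   # assignment step
            idx = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[idx].append(p)
        for i, c in enumerate(clusters):   # update step
            if c:                          # skip empty clusters
                centroids[i] = sum(c) / len(c)
    return centroids

data = [1.0, 1.2, 0.8, 8.0, 8.2, 7.8]
print(sorted(kmeans_1d(data)))  # centroids settle near 1.0 and 8.0
```

Real implementations add smarter initialisation (e.g. k-means++) and a convergence test instead of a fixed iteration count, but the two alternating steps are the whole algorithm.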

6.7 Semi-supervised learning

6.8 Deep learning

6.9 Other machine learning methods and problems

7 Machine learning research

8 History of machine learning

9 Machine learning projects

10 Machine learning organizations

10.1 Machine learning conferences and workshops

11 Machine learning publications

11.1 Books on machine learning

11.2 Machine learning journals

12 Persons influential in machine learning

13 See also

13.1 Other

14 Further reading

- Trevor Hastie, Robert Tibshirani and Jerome H. Friedman (2001). The Elements of Statistical Learning. Springer. ISBN 0-387-95284-5.
- Pedro Domingos (September 2015). The Master Algorithm. Basic Books. ISBN 978-0-465-06570-7.
- Mehryar Mohri, Afshin Rostamizadeh, Ameet Talwalkar (2012). Foundations of Machine Learning. The MIT Press. ISBN 978-0-262-01825-8.
- Ian H. Witten and Eibe Frank (2011). Data Mining: Practical Machine Learning Tools and Techniques. Morgan Kaufmann, 664 pp. ISBN 978-0-12-374856-0.
- David J. C. MacKay (2003). Information Theory, Inference, and Learning Algorithms. Cambridge: Cambridge University Press. ISBN 0-521-64298-1.
- Richard O. Duda, Peter E. Hart, David G. Stork (2001). Pattern Classification (2nd edition). Wiley, New York. ISBN 0-471-05669-3.
- Christopher Bishop (1995). Neural Networks for Pattern Recognition. Oxford University Press. ISBN 0-19-853864-2.
- Vladimir Vapnik (1998). Statistical Learning Theory. Wiley-Interscience. ISBN 0-471-03003-1.
- Ray Solomonoff (1957). "An Inductive Inference Machine". IRE Convention Record, Section on Information Theory, Part 2, pp. 56–62.
- Ray Solomonoff, "An Inductive Inference Machine", a privately circulated report from the 1956 Dartmouth Summer Research Conference on AI.

15 References

1. http://www.britannica.com/EBchecked/topic/1116194/machine-learning – This tertiary source reuses information from other sources but does not name them.
2. Phil Simon (March 18, 2013). Too Big to Ignore: The Business Case for Big Data. Wiley. p. 89. ISBN 978-1-118-63817-0.
3. Ron Kohavi; Foster Provost (1998). "Glossary of terms". Machine Learning. 30: 271–274.
4. http://www.learningtheory.org/
5. Settles, Burr (2010). "Active Learning Literature Survey" (PDF). Computer Sciences Technical Report 1648, University of Wisconsin–Madison. Retrieved 2014-11-18.
6. Rubens, Neil; Elahi, Mehdi; Sugiyama, Masashi; Kaplan, Dain (2016). "Active Learning in Recommender Systems". In Ricci, Francesco; Rokach, Lior; Shapira, Bracha. Recommender Systems Handbook (2nd ed.). Springer US. doi:10.1007/978-1-4899-7637-6. ISBN 978-1-4899-7637-6.
7. https://en.wikipedia.org/wiki/Generative_adversarial_network#cite_note-GANs-1

- Data Science: Data to Insights, from MIT (machine learning).
- Machine Learning – popular online course by Andrew Ng, at Coursera. It uses GNU Octave. The course is a free version of Stanford University's course taught by Ng, available at see.stanford.edu/Course/CS229.
- mloss – an academic database of open-source machine learning software.