INTRODUCTION
TO
MACHINE LEARNING
AN EARLY DRAFT OF A PROPOSED
TEXTBOOK
Nils J. Nilsson
Robotics Laboratory
Department of Computer Science
Stanford University
Stanford, CA 94305
e-mail: nilsson@cs.stanford.edu
December 4, 1996
Copyright © 1997 Nils J. Nilsson
This material may not be copied, reproduced, or distributed without the
written permission of the copyright holder.
www.aibbt.com 艾伯特 - 专注人工智能

Contents
1 Preliminaries 1
  1.1 Introduction 1
    1.1.1 What is Machine Learning? 1
    1.1.2 Wellsprings of Machine Learning 3
    1.1.3 Varieties of Machine Learning 5
  1.2 Learning Input-Output Functions 6
    1.2.1 Types of Learning 6
    1.2.2 Input Vectors 8
    1.2.3 Outputs 9
    1.2.4 Training Regimes 9
    1.2.5 Noise 10
    1.2.6 Performance Evaluation 10
  1.3 Learning Requires Bias 10
  1.4 Sample Applications 13
  1.5 Sources 14
  1.6 Bibliographical and Historical Remarks 15
2 Boolean Functions 17
  2.1 Representation 17
    2.1.1 Boolean Algebra 17
    2.1.2 Diagrammatic Representations 18
  2.2 Classes of Boolean Functions 19
    2.2.1 Terms and Clauses 19
    2.2.2 DNF Functions 20

    2.2.3 CNF Functions 24
    2.2.4 Decision Lists 25
    2.2.5 Symmetric and Voting Functions 26
    2.2.6 Linearly Separable Functions 26
  2.3 Summary 27
  2.4 Bibliographical and Historical Remarks 28
3 Using Version Spaces for Learning 29
  3.1 Version Spaces and Mistake Bounds 29
  3.2 Version Graphs 31
  3.3 Learning as Search of a Version Space 34
  3.4 The Candidate Elimination Method 35
  3.5 Bibliographical and Historical Remarks 37
4 Neural Networks 39
  4.1 Threshold Logic Units 39
    4.1.1 Definitions and Geometry 39
    4.1.2 Special Cases of Linearly Separable Functions 41
    4.1.3 Error-Correction Training of a TLU 42
    4.1.4 Weight Space 45
    4.1.5 The Widrow-Hoff Procedure 46
    4.1.6 Training a TLU on Non-Linearly-Separable Training Sets 49
  4.2 Linear Machines 50
  4.3 Networks of TLUs 51
    4.3.1 Motivation and Examples 51
    4.3.2 Madalines 54
    4.3.3 Piecewise Linear Machines 56
    4.3.4 Cascade Networks 57
  4.4 Training Feedforward Networks by Backpropagation 58
    4.4.1 Notation 58
    4.4.2 The Backpropagation Method 60
    4.4.3 Computing Weight Changes in the Final Layer 62
    4.4.4 Computing Changes to the Weights in Intermediate Layers 64

    4.4.5 Variations on Backprop 66
    4.4.6 An Application: Steering a Van 67
  4.5 Synergies Between Neural Network and Knowledge-Based Methods 68
  4.6 Bibliographical and Historical Remarks 68
5 Statistical Learning 69
  5.1 Using Statistical Decision Theory 69
    5.1.1 Background and General Method 69
    5.1.2 Gaussian (or Normal) Distributions 71
    5.1.3 Conditionally Independent Binary Components 75
  5.2 Learning Belief Networks 77
  5.3 Nearest-Neighbor Methods 77
  5.4 Bibliographical and Historical Remarks 79
6 Decision Trees 81
  6.1 Definitions 81
  6.2 Supervised Learning of Univariate Decision Trees 83
    6.2.1 Selecting the Type of Test 83
    6.2.2 Using Uncertainty Reduction to Select Tests 84
    6.2.3 Non-Binary Attributes 88
  6.3 Networks Equivalent to Decision Trees 88
  6.4 Overfitting and Evaluation 89
    6.4.1 Overfitting 89
    6.4.2 Validation Methods 90
    6.4.3 Avoiding Overfitting in Decision Trees 91
    6.4.4 Minimum-Description Length Methods 92
    6.4.5 Noise in Data 93
  6.5 The Problem of Replicated Subtrees 94
  6.6 The Problem of Missing Attributes 96
  6.7 Comparisons 96
  6.8 Bibliographical and Historical Remarks 96

7 Inductive Logic Programming 97
  7.1 Notation and Definitions 99
  7.2 A Generic ILP Algorithm 100
  7.3 An Example 103
  7.4 Inducing Recursive Programs 107
  7.5 Choosing Literals to Add 110
  7.6 Relationships Between ILP and Decision Tree Induction 111
  7.7 Bibliographical and Historical Remarks 114
8 Computational Learning Theory 117
  8.1 Notation and Assumptions for PAC Learning Theory 117
  8.2 PAC Learning 119
    8.2.1 The Fundamental Theorem 119
    8.2.2 Examples 121
    8.2.3 Some Properly PAC-Learnable Classes 122
  8.3 The Vapnik-Chervonenkis Dimension 124
    8.3.1 Linear Dichotomies 124
    8.3.2 Capacity 126
    8.3.3 A More General Capacity Result 127
    8.3.4 Some Facts and Speculations About the VC Dimension 129
  8.4 VC Dimension and PAC Learning 129
  8.5 Bibliographical and Historical Remarks 130
9 Unsupervised Learning 131
  9.1 What is Unsupervised Learning? 131
  9.2 Clustering Methods 133
    9.2.1 A Method Based on Euclidean Distance 133
    9.2.2 A Method Based on Probabilities 136
  9.3 Hierarchical Clustering Methods 138
    9.3.1 A Method Based on Euclidean Distance 138
    9.3.2 A Method Based on Probabilities 138
  9.4 Bibliographical and Historical Remarks 143