This book introduces the topic of Deep Learning and neural networks by adopting exactly the opposite position, for the reasons mentioned above. The field of Deep Learning and neural networks can make major advances if there is a continuous exchange between neuroscience and machine learning. For this reason, the book also presents some interesting facts from neuroscience and does not restrict itself to purely technical neuron models.

Introduction to Deep Learning
Prof. Dr. Jürgen Brauer
August 2018

Introduction to Deep Learning
1st Edition, August 2018.
Copyright © 2018 by Prof. Dr. Jürgen Brauer.
All rights reserved. This book or any portion thereof
may not be reproduced or used in any manner whatsoever
without the express written permission of the publisher
except for the use of brief quotations in a book review.
Prof. Dr. Jürgen Brauer
University of Applied Sciences Kempten.
Bahnhofstr. 61
87435 Kempten (Allgäu), Germany
www.juergenbrauer.org

Contents
1 What is Deep Learning? 5
1.1 “How are they called? Neutrons?” . . . . . . . . . . . . . . . . . . . . 5
1.2 Convolutional Neural Networks drive the boom . . . . . . . . . . . . . 6
1.3 Deep Learning without neurons . . . . . . . . . . . . . . . . . . . . . . 14
1.4 Neuroscience as a treasure for machine learning . . . . . . . . . . . . . 14
1.5 About this book . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
2 Deep Learning: An agile field 20
2.1 Exponential growth of interest . . . . . . . . . . . . . . . . . . . . . . 20
2.2 Acquisition of DL startups . . . . . . . . . . . . . . . . . . . . . . . . . 23
2.3 Hardware for DL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
2.4 Software for DL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
3 The biological role model: The Neuron 32
3.1 Your brain - A fascinating computing device . . . . . . . . . . . . . . . 32
3.2 Structure of a neuron . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
3.3 Signal processing by action potentials . . . . . . . . . . . . . . . . . . 37
3.4 Synapses . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
3.5 Neuronal plasticity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
3.6 Spike-Timing Dependent Plasticity (STDP) . . . . . . . . . . . . . . . 42
4 The many faces of a neuron 46
4.1 What is the function of a biological neuron? . . . . . . . . . . . . . . . 46
4.2 Neurons as spatial feature or evidence detectors . . . . . . . . . . . . . 47
4.3 Neurons as temporal coincidence detectors . . . . . . . . . . . . . . . . 51
4.4 Perceptron neuron model . . . . . . . . . . . . . . . . . . . . . . . . . 52
4.5 Neurons as filters . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
4.6 Other neuron models . . . . . . . . . . . . . . . . . . . . . . . . . . . . 60
4.7 Neural Coding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61

5 The Perceptron 64
5.1 The Perceptron neuro-computer . . . . . . . . . . . . . . . . . . . . . . 64
5.2 Perceptron learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 66
5.3 Perceptron in Python . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
5.4 Limitations of the Perceptron . . . . . . . . . . . . . . . . . . . . . . . 76
6 Self-Organizing Maps 82
6.1 The SOM neural network model . . . . . . . . . . . . . . . . . . . . . 82
6.2 A SOM in Python . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
6.3 SOM and the Cortex . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
7 Multi Layer Perceptrons 107
7.1 The goal . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
7.2 Basic idea is gradient descent . . . . . . . . . . . . . . . . . . . . . . . 108
7.3 Splitting the weight change formula into three parts . . . . . . . . . . 110
7.4 Computing the first part . . . . . . . . . . . . . . . . . . . . . . . . . . 111
7.5 Computing the second part . . . . . . . . . . . . . . . . . . . . . . . . 112
7.6 Computing the third part . . . . . . . . . . . . . . . . . . . . . . . . . 112
7.7 Backpropagation pseudo code . . . . . . . . . . . . . . . . . . . . . . . 116
7.8 MLP in Python . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
7.9 Visualization of decision boundaries . . . . . . . . . . . . . . . . . . . 133
7.10 The need for non-linear transfer functions . . . . . . . . . . . . . . . . 137
8 TensorFlow 140
8.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
8.2 Training a linear model with TensorFlow . . . . . . . . . . . . . . . . . 149
8.3 A MLP with TensorFlow . . . . . . . . . . . . . . . . . . . . . . . . . . 151
9 Convolutional Neural Networks 159
9.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 159
9.2 Some history about the CNN model . . . . . . . . . . . . . . . . . . . 163
9.3 Convolutional and pooling layers in TensorFlow . . . . . . . . . . . . . 166
9.4 Parameters to be defined for a convolution layer . . . . . . . . . . . . 172
9.5 How to compute the dimension of an output tensor . . . . . . . . . . . 177
9.6 Parameters to be defined for a pooling layer . . . . . . . . . . . . . . . 178
9.7 A CNN in TensorFlow . . . . . . . . . . . . . . . . . . . . . . . . . . . 178
10 Deep Learning Tricks 194
10.1 Fighting against vanishing gradients . . . . . . . . . . . . . . . . . . . 194
10.2 Momentum optimization . . . . . . . . . . . . . . . . . . . . . . . . . . 196
10.3 Nesterov Momentum Optimization . . . . . . . . . . . . . . . . . . . . 199

10.4 AdaGrad . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
10.5 RMSProp . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
10.6 Adam . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 202
10.7 Comparison of optimizers . . . . . . . . . . . . . . . . . . . . . . . . . 203
10.8 Batch normalization . . . . . . . . . . . . . . . . . . . . . . . . . . . . 206
11 Beyond Deep Learning 209
11.1 Principle of attention . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
11.2 Principle of lifelong learning . . . . . . . . . . . . . . . . . . . . . . . . 210
11.3 Principle of incremental learning . . . . . . . . . . . . . . . . . . . . . 211
11.4 Principle of embodiment . . . . . . . . . . . . . . . . . . . . . . . . . . 211
11.5 Principle of prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . 212
11.6 Cognitive architectures . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
12 Exercises 216
12.1 Ex. 1 - Preparing to work with Python . . . . . . . . . . . . . . . . . 216
12.2 Ex. 2 - Python syntax . . . . . . . . . . . . . . . . . . . . . . . . . . . 220
12.3 Ex. 3 - Understanding convolutions . . . . . . . . . . . . . . . . . . . . 223
12.4 Ex. 4 - NumPy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
12.5 Ex. 5 - Perceptron . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
12.6 Ex. 6 - Speech Recognition with a SOM . . . . . . . . . . . . . . . . . 233
12.7 Ex. 7 - MLP with feedforward step . . . . . . . . . . . . . . . . . . . 234
12.8 Ex. 8 - Backpropagation . . . . . . . . . . . . . . . . . . . . . . . . . . 235
12.9 Ex. 9 - A MLP with TensorFlow . . . . . . . . . . . . . . . . . . . . . 236
12.10 Ex. 10 - CNN Experiments . . . . . . . . . . . . . . . . . . . . . . . 237
12.11 Ex. 11 - CNN for word recognition using Keras . . . . . . . . . . . . 238
12.12 Ex. 12 - Vanishing gradients problem . . . . . . . . . . . . . . . . . . 239
12.13 Ex. 13 - Batch normalization in TensorFlow . . . . . . . . . . . . . . 240