MIT 6.S094: Deep Learning for Self-Driving Cars
Lecture 4: Computer Vision

Lex Fridman
lex.mit.edu
January 2018
https://selfdrivingcars.mit.edu

For the full updated list of references visit:
https://selfdrivingcars.mit.edu/references

Computer Vision is Deep Learning
References: [81]
• Supervised Learning
• Unsupervised Learning
• Semi-Supervised Learning
• Reinforcement Learning

Images are Numbers
References: [89]
• Regression: The output variable takes continuous values.
• Classification: The output variable takes class labels.
  • Underneath, it may still produce continuous values, such as the
    probability of belonging to a particular class (see the sketch below).
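
A minimal NumPy sketch (not from the lecture slides) of the "images are numbers" idea: a grayscale image is just a 2-D array of pixel intensities, and a classifier maps those numbers to continuous scores and class probabilities. The 4x4 image, the random weights W and b, and the three classes are all made-up placeholders, not a trained model.

    import numpy as np

    # A grayscale "image" is just a grid of numbers (pixel intensities, 0-255).
    image = np.array([[  0,  30,  60,  90],
                      [ 30,  60,  90, 120],
                      [ 60,  90, 120, 150],
                      [ 90, 120, 150, 180]], dtype=np.float32)

    x = image.flatten() / 255.0          # flatten to a feature vector, scale to [0, 1]

    # Hypothetical linear classifier over 3 made-up classes (weights are random,
    # i.e. untrained; a real model would learn W and b from labeled data).
    rng = np.random.default_rng(0)
    num_classes = 3
    W = rng.normal(scale=0.1, size=(num_classes, x.size))
    b = np.zeros(num_classes)

    logits = W @ x + b                   # continuous scores for each class

    def softmax(z):
        z = z - z.max()                  # subtract max for numerical stability
        e = np.exp(z)
        return e / e.sum()

    probs = softmax(logits)              # class probabilities, sum to 1
    print("class probabilities:", probs)
    print("predicted class:", int(np.argmax(probs)))

The classification output is a discrete label (the argmax), but underneath the model still produces continuous values (logits and probabilities), which is the point of the sub-bullet above.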

Computer Vision with Deep Learning:
Our intuition about what’s “hard” is flawed (in complicated ways)
References: [6, 7, 11, 68]
“Encoded in the large, highly evolved sensory and motor portions of the human brain is a billion years of experience about the nature of
the world and how to survive in it.… Abstract thought, though, is a new trick, perhaps less than 100 thousand years old. We have not yet
mastered it. It is not all that intrinsically difficult; it just seems so when we do it.”
- Hans Moravec, Mind Children (1988)
Visual perception: 540,000,000 years of data
Bipedal movement: 230,000,000 years of data
Abstract thought: 100,000 years of data
Example: Prediction: Dog → + distortion → Prediction: Ostrich
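
The Dog-to-Ostrich flip above is the adversarial-example result cited on this slide. As a loose illustration only, here is a fast-gradient-sign-style sketch in NumPy using a tiny logistic-regression "classifier" whose input gradient can be written by hand; the image x, the weights w and b, the label y, and the step size eps are all invented, and a real attack would use the gradient of a trained deep network instead.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    rng = np.random.default_rng(0)

    # Toy flattened "image" and a made-up linear classifier (w, b).
    x = rng.uniform(0.0, 1.0, size=64)       # 8x8 image, pixel values in [0, 1]
    w = rng.normal(scale=0.5, size=64)
    b = 0.0
    y = 1.0                                  # assume the true label is class 1

    p = sigmoid(w @ x + b)                   # model confidence for class 1

    # Gradient of the cross-entropy loss with respect to the *input* pixels.
    # For logistic regression this is (p - y) * w, which we can write analytically.
    grad_x = (p - y) * w

    # Fast-gradient-sign-style distortion: a small step in the direction that
    # increases the loss, clipped back to the valid pixel range.
    eps = 0.1
    x_adv = np.clip(x + eps * np.sign(grad_x), 0.0, 1.0)

    p_adv = sigmoid(w @ x_adv + b)
    print(f"confidence before distortion: {p:.3f}")
    print(f"confidence after distortion:  {p_adv:.3f}")
    print(f"max pixel change: {np.max(np.abs(x_adv - x)):.2f}")

The distortion is bounded per pixel by eps, so the perturbed input looks almost unchanged while the classifier's confidence can shift sharply, which is the flavor of the Dog/Ostrich example.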

Neuron: Biological Inspiration for Computation
References: [18, 143]
• Neuron: computational building block for the brain
• (Artificial) Neuron: computational building block for the “neural network”
  (see the sketch below)
Differences (among others):
• Parameters: Human brains have ~10,000,000
times more synapses than artificial neural networks.
• Topology: Human brains have no “layers”.
Topology is complicated.
• Async: The human brain works
asynchronously, ANNs work synchronously.
• Learning algorithm: ANNs use gradient
descent for learning. Human brains use … (we
don’t know)
• Processing speed: Single biological neurons
are slow, while standard neurons in ANNs are
fast.
• Power consumption: Biological neural
networks use very little power compared to
artificial networks.
• Stages: Biological networks usually don't stop
/ start learning. ANNs have different fitting
(train) and prediction (evaluate) phases.
Similarity (among others):
• Distributed computation on a large scale.
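
As referenced in the bullets above, here is a minimal NumPy sketch of a single artificial neuron: a weighted sum of its inputs plus a bias, passed through a nonlinearity. The inputs, weights, and bias are arbitrary illustrative numbers; in a real network they are the parameters that gradient descent learns, as noted in the differences list.

    import numpy as np

    def relu(z):
        """A common nonlinearity (activation function) in modern networks."""
        return np.maximum(0.0, z)

    def artificial_neuron(inputs, weights, bias, activation=relu):
        """One 'neuron': weighted sum of inputs plus a bias, then a nonlinearity."""
        return activation(np.dot(weights, inputs) + bias)

    # Illustrative values only; real networks learn weights and bias from data.
    x = np.array([0.5, -1.2, 3.0])     # incoming signals (e.g. pixel features)
    w = np.array([0.8,  0.1, 0.4])     # synapse-like weights
    b = 0.2                            # bias term

    print(artificial_neuron(x, w, b))  # a single scalar output ("firing strength")

Stacking many such neurons into layers, and many layers into a network, gives the “neural network” that the slide contrasts with its biological inspiration.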