Deep Learning: Convolutional Neural Networks in Python

"Udemy 的深度学习课程——Python中的卷积神经网络" 这门课程是针对深度学习在Python中的应用,特别是卷积神经网络(Convolutional Neural Networks, CNN)的专项训练。由Lazy Programmer Inc.创建,最近更新于2017年5月。课程内容涵盖英语,适合对计算机视觉、数据科学以及机器学习有兴趣的人士。 在本课程中,你将学到以下知识点: 1. **理解卷积**:卷积是CNN的基础,它在图像处理、音频效果和分类中都有重要应用。 2. **卷积在音频效果中的应用**:如何利用卷积实现简单的回声效果。 3. **卷积在图像效果中的应用**:如何通过编程实现高斯模糊和边缘检测。 4. **卷积在图像分类中的作用**:了解卷积如何提升图像分类的准确性。 5. **卷积神经网络(CNN)的架构**:理解并能解释CNN的结构,包括卷积层、池化层、全连接层等。 6. **在Theano中实现CNN**:学习如何使用Theano框架构建CNN模型。 7. **在TensorFlow中实现CNN**:掌握使用TensorFlow进行CNN编程的方法。 8. **计算机视觉挑战**:通过街景房屋数字识别(SVHN)数据集,实践更大、多角度彩色图像的分类任务。 9. **生物学启发**:讨论CNN与动物视觉皮层的相似性。 10. **实验和可视化**:通过实验和可视化来直观理解模型内部的工作原理。 课程要求学员具备以下基础知识: - 安装Python、Numpy、Scipy、Matplotlib、Scikit Learn、Theano和TensorFlow。 - 了解深度学习基础,包括反向传播,以及从Deep Learning in Python part 1学习的内容。 - 熟悉Theano和TensorFlow的神经网络实现,如在Deep Learning part 2中所学。 课程提供所有必要的免费材料,包括代码示例,可在GitHub上找到(/lazyprogrammer/machine_learning_examples 的 cnn_class 目录)。确保定期拉取最新版本的代码。 此课程强调“理解和构建”,而不仅仅是“使用”。学习目标是通过实验和观察,而不是死记硬背,深入理解机器学习模型的工作机制。适合希望深入理解卷积神经网络而非仅仅掌握API使用方法的学员。 请注意,本课程要求学员具有一定的预备知识,包括微积分、线性代数等。
https://www.udemy.com/deep-learning-recurrent-neural-networks-in-python/

Deep Learning: Recurrent Neural Networks in Python
GRU, LSTM, + more modern deep learning, machine learning, and data science for sequences
Created by Lazy Programmer Inc. Last updated 5/2017. English.

What Will I Learn?
- Understand the simple recurrent unit (Elman unit)
- Understand the GRU (gated recurrent unit)
- Understand the LSTM (long short-term memory unit)
- Write various recurrent networks in Theano
- Understand backpropagation through time
- Understand how to mitigate the vanishing gradient problem
- Solve the XOR and parity problems using a recurrent neural network
- Use recurrent neural networks for language modeling
- Use RNNs for generating text, like poetry
- Visualize word embeddings and look for patterns in word vector representations

Requirements
- Calculus
- Linear algebra
- Python, Numpy, Matplotlib
- Write a neural network in Theano
- Understand backpropagation
- Probability (conditional and joint distributions)
- Write a neural network in Tensorflow

Description
Like the course I just released on Hidden Markov Models, Recurrent Neural Networks are all about learning sequences – but whereas Markov Models are limited by the Markov assumption, Recurrent Neural Networks are not – and as a result, they are more expressive and more powerful than anything we've seen on tasks that we haven't made progress on in decades.

So what's going to be in this course, and how will it build on the previous neural network courses and Hidden Markov Models?

In the first section of the course we are going to add the concept of time to our neural networks. I'll introduce you to the Simple Recurrent Unit, also known as the Elman unit (a minimal sketch of its forward pass appears after this listing). We are going to revisit the XOR problem, but we're going to extend it so that it becomes the parity problem – you'll see that regular feedforward neural networks will have trouble solving this problem, but recurrent networks will work because the key is to treat the input as a sequence.

In the next section of the course, we are going to revisit one of the most popular applications of recurrent neural networks – language modeling. You saw when we studied Markov Models that we could do things like generate poetry, and it didn't look too bad. We could even discriminate between 2 different poets just from the sequence of parts-of-speech tags they used. In this course, we are going to extend our language model so that it no longer makes the Markov assumption.

Another popular application of neural networks for language is word vectors or word embeddings. The most common technique for this is called Word2Vec, but I'll show you how recurrent neural networks can also be used for creating word vectors.

In the section after, we'll look at the very popular LSTM, or long short-term memory unit, and the more modern and efficient GRU, or gated recurrent unit, which has been proven to yield comparable performance. We'll apply these to some more practical problems, such as learning a language model from Wikipedia data and visualizing the word embeddings we get as a result.

All of the materials required for this course can be downloaded and installed for FREE. We will do most of our work in Numpy, Matplotlib, and Theano. I am always available to answer your questions and help you along your data science journey.

This course focuses on "how to build and understand", not just "how to use". Anyone can learn to use an API in 15 minutes after reading some documentation.
It's not about "remembering facts", it's about "seeing for yourself" via experimentation. It will teach you how to visualize what's happening in the model internally. If you want more than just a superficial look at machine learning models, this course is for you. See you in class!

NOTES:
All the code for this course can be downloaded from my github: /lazyprogrammer/machine_learning_examples, in the directory: rnn_class. Make sure you always "git pull" so you have the latest version!

HARD PREREQUISITES / KNOWLEDGE YOU ARE ASSUMED TO HAVE:
- calculus
- linear algebra
- probability (conditional and joint distributions)
- Python coding: if/else, loops, lists, dicts, sets
- Numpy coding: matrix and vector operations, loading a CSV file
- Deep learning: backpropagation, XOR problem
- Can write a neural network in Theano and Tensorflow

TIPS (for getting through the course):
- Watch it at 2x.
- Take handwritten notes. This will drastically increase your ability to retain the information.
- Write down the equations. If you don't, I guarantee it will just look like gibberish.
- Ask lots of questions on the discussion board. The more the better!
- Realize that most exercises will take you days or weeks to complete.
- Write code yourself, don't just sit there and look at my code.

USEFUL COURSE ORDERING:
(The Numpy Stack in Python)
Linear Regression in Python
Logistic Regression in Python
(Supervised Machine Learning in Python)
(Bayesian Machine Learning in Python: A/B Testing)
Deep Learning in Python
Practical Deep Learning in Theano and TensorFlow
(Supervised Machine Learning in Python 2: Ensemble Methods)
Convolutional Neural Networks in Python
(Easy NLP)
(Cluster Analysis and Unsupervised Machine Learning)
Unsupervised Deep Learning
(Hidden Markov Models)
Recurrent Neural Networks in Python
Artificial Intelligence: Reinforcement Learning in Python
Natural Language Processing with Deep Learning in Python

Who is the target audience?
- If you want to level up with deep learning, take this course.
- If you are a student or professional who wants to apply deep learning to time series or sequence data, take this course.
- If you want to learn about word embeddings and language modeling, take this course.
- If you want to improve the performance you got with Hidden Markov Models, take this course.
- If you're interested in the techniques that led to new developments in machine translation, take this course.
- If you have no idea about deep learning, don't take this course, take the prerequisites.
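As a rough illustration of the Simple Recurrent Unit (Elman unit) mentioned above, the sketch below runs a forward pass over one sequence in plain Numpy. It is not the course's code; the dimensions, initialization, and the tanh/softmax choices are assumptions made for the example. Training such a unit with backpropagation through time is what the course itself covers in Theano and TensorFlow.

```python
import numpy as np

# Elman unit: the hidden state h_t depends on the current input x_t
# and the previous hidden state h_{t-1}.
D, M, K = 3, 5, 2          # input, hidden, and output dimensions (assumed)
rng = np.random.default_rng(0)
Wx = rng.normal(scale=0.1, size=(D, M))   # input-to-hidden weights
Wh = rng.normal(scale=0.1, size=(M, M))   # hidden-to-hidden (recurrent) weights
bh = np.zeros(M)
Wo = rng.normal(scale=0.1, size=(M, K))   # hidden-to-output weights
bo = np.zeros(K)

def forward(X):
    """Run one sequence X of shape (T, D) through the recurrent unit."""
    h = np.zeros(M)                        # initial hidden state h_0
    outputs = []
    for x_t in X:                          # step through the sequence in time
        h = np.tanh(x_t @ Wx + h @ Wh + bh)    # Elman recurrence
        y = np.exp(h @ Wo + bo)
        outputs.append(y / y.sum())            # softmax over K classes
    return np.array(outputs)

# Example: a random sequence of length T = 7.
X = rng.normal(size=(7, D))
print(forward(X).shape)    # (7, K): one prediction per time step
```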