
Introduction to RNNs
Arun Mallya
Best viewed with Computer Modern fonts installed

Outline
• Why Recurrent Neural Networks (RNNs)?
• The Vanilla RNN unit
• The RNN forward pass
• Backpropagation refresher
• The RNN backward pass
• Issues with the Vanilla RNN
• The Long Short-Term Memory (LSTM) unit
• The LSTM Forward & Backward pass
• LSTM variants and tips
– Peephole LSTM
– GRU

Motivation
• Not all problems can be converted into one with fixed-length inputs and outputs

• Problems such as Speech Recognition or Time-series Prediction require a system to store and use context information
– Simple case: Output YES if the number of 1s is even, else NO
1000010101 – YES, 100011 – NO, …

• It is hard or impossible to choose a fixed context window
– There can always be a new sample longer than anything seen before
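The parity task above can be made concrete with a short sketch (the function name and weights of nothing here come from the slides; this is just an illustration): a single carried bit of state solves the task for inputs of any length, which is exactly what a fixed context window cannot do.

```python
def parity(bits: str) -> str:
    """Output YES if the bit string contains an even number of 1s, else NO."""
    state = 0  # one bit of carried state: 0 = even number of 1s seen so far
    for b in bits:
        state ^= int(b)  # flip the state on every '1'
    return "NO" if state else "YES"

print(parity("1000010101"))  # YES (four 1s)
print(parity("100011"))      # NO  (three 1s)
```

The loop-carried `state` variable plays the role an RNN's hidden state will play: a summary of the entire past input, updated one step at a time.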

Recurrent Neural Networks (RNNs)
• Recurrent Neural Networks take the previous output or hidden state as an input.
The composite input at time t thus carries historical information about inputs at times t′ < t
• RNNs are useful because their intermediate values (state) can store information about past inputs for a duration that is not fixed a priori
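A minimal sketch of this recurrence, using scalar weights for readability (the weight values and function name are illustrative assumptions, not from the slides): each hidden state h_t = tanh(W_xh·x_t + W_hh·h_{t−1} + b) mixes the current input with the previous state, so information from every earlier input can persist.

```python
import math

def rnn_forward(xs, w_xh=0.5, w_hh=0.8, b=0.0):
    """Run a scalar vanilla RNN over a sequence; return the hidden state at each step."""
    h = 0.0  # initial hidden state h_0
    states = []
    for x in xs:
        # The recurrence: the new state depends on the input AND the old state,
        # so past inputs influence h for an unbounded number of steps.
        h = math.tanh(w_xh * x + w_hh * h + b)
        states.append(h)
    return states

states = rnn_forward([1.0, 0.0, 1.0])
```

Because h_{t−1} feeds back into h_t, the state at step 3 still depends on the input at step 1; nothing about the architecture fixes how far back that influence reaches.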

Sample Feed-forward Network
[Figure: a feed-forward network at t = 1, mapping input x1 through hidden unit h1 to output y1]
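For contrast with the recurrent case, the feed-forward network in the figure can be sketched as below (weights and nonlinearities are illustrative assumptions): it maps x1 to y1 through h1 with no carried state, so every call sees only the current input.

```python
import math

def feedforward(x, w=0.5, v=1.5):
    """One feed-forward step: h1 = tanh(w * x1), y1 = tanh(v * h1)."""
    h = math.tanh(w * x)  # hidden activation h1
    y = math.tanh(v * h)  # output y1
    return y

# No memory: the same input always yields the same output,
# regardless of what was fed in earlier.
y = feedforward(1.0)
```

This is the limitation the recurrent connection removes: here, nothing links the computation at t = 1 to any other time step.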