The Gated Recurrent Unit
Date: 2024-05-25 15:16:48
The gated recurrent unit (GRU) is a type of recurrent neural network (RNN) introduced in 2014 by Cho et al. It is a variant of the traditional RNN that uses gating mechanisms to control the flow of information through the network. The gates regulate how much information is carried from one time step to the next, allowing the network to selectively remember or forget previous inputs. This mechanism helps mitigate the vanishing gradient problem common in traditional RNNs, where the gradient signal becomes too small to effectively update the network weights over long sequences.
The GRU has two gates: the reset gate and the update gate. The reset gate determines how much of the previous hidden state is used when computing the candidate hidden state, while the update gate controls the interpolation between the previous hidden state and that candidate, i.e., how much of the old state is kept versus overwritten by new information. Both gates are computed from the current input and the previous hidden state via trainable parameters that are updated during training.
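The gate computations above can be sketched as a single GRU time step in NumPy. This is a minimal illustration, not a library implementation: the parameter names (`W_r`, `U_r`, etc.) and the dictionary layout are assumptions for this sketch, and the state update follows the convention from Cho et al. (2014), where the update gate z weights the previous hidden state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h_prev, params):
    """One GRU time step. `params` holds the trainable weights and
    biases for the reset (r), update (z), and candidate (h) paths;
    the names and layout here are illustrative, not a standard API."""
    W_r, U_r, b_r = params["r"]
    W_z, U_z, b_z = params["z"]
    W_h, U_h, b_h = params["h"]

    r = sigmoid(W_r @ x + U_r @ h_prev + b_r)              # reset gate
    z = sigmoid(W_z @ x + U_z @ h_prev + b_z)              # update gate
    h_tilde = np.tanh(W_h @ x + U_h @ (r * h_prev) + b_h)  # candidate state
    # update gate interpolates between the old state and the candidate
    return z * h_prev + (1.0 - z) * h_tilde

# toy dimensions: 3-dim input, 4-dim hidden state
rng = np.random.default_rng(0)
d_in, d_h = 3, 4
params = {k: (rng.standard_normal((d_h, d_in)),   # W: input weights
              rng.standard_normal((d_h, d_h)),    # U: recurrent weights
              np.zeros(d_h))                      # b: bias
          for k in ("r", "z", "h")}
h = gru_step(rng.standard_normal(d_in), np.zeros(d_h), params)
```

Because both gates pass through a sigmoid and the candidate through a tanh, every component of the new hidden state stays bounded, which is part of what keeps gradients better behaved than in a plain RNN.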
Compared to traditional RNNs, GRUs have been shown to perform better on tasks such as speech recognition and machine translation. They are also more computationally efficient than other gated RNN variants such as the long short-term memory (LSTM) network, since they use fewer gates and therefore fewer parameters.
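The parameter savings can be made concrete with a quick count. Each gate (or candidate) in a gated RNN needs an input weight matrix, a recurrent weight matrix, and a bias; a GRU has three such sets (reset, update, candidate) while an LSTM has four (input, forget, output, cell). The dimensions below are arbitrary example values.

```python
def gated_rnn_params(d_in, d_h, n_sets):
    """Parameter count for a gated RNN cell: each gate/candidate set has
    an input matrix (d_h x d_in), a recurrent matrix (d_h x d_h),
    and a bias vector (d_h)."""
    return n_sets * (d_h * d_in + d_h * d_h + d_h)

d_in, d_h = 256, 512          # example sizes
gru  = gated_rnn_params(d_in, d_h, 3)  # reset, update, candidate
lstm = gated_rnn_params(d_in, d_h, 4)  # input, forget, output, cell

print(gru / lstm)  # → 0.75: a GRU cell has 3/4 the parameters of an LSTM cell
```

This fixed 3:4 ratio holds for any input and hidden size, which is why the GRU's speed advantage over the LSTM is consistent across model scales.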