PyTorch Implementations of LSTM and GRU Networks
Date: 2024-02-05 13:04:09
Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) networks are two variants of the recurrent neural network (RNN) designed for sequential data. Both mitigate the vanishing-gradient problem of plain RNNs through gating mechanisms; the GRU uses fewer gates and parameters than the LSTM. In PyTorch, both are available in the torch.nn module.
Below are example PyTorch implementations of LSTM and GRU models:
1. LSTM in PyTorch:
```python
import torch
import torch.nn as nn

# Define the LSTM model
class LSTMModel(nn.Module):
    def __init__(self, input_size, hidden_size, num_layers, output_size):
        super(LSTMModel, self).__init__()
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        # batch_first=True: input shape is (batch, seq_len, input_size)
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # Initialize the hidden state and cell state with zeros
        h0 = torch.zeros(self.num_layers, x.size(0), self.hidden_size).to(x.device)
        c0 = torch.zeros(self.num_layers, x.size(0), self.hidden_size).to(x.device)
        out, _ = self.lstm(x, (h0, c0))
        # Feed only the last time step's output to the linear layer
        out = self.fc(out[:, -1, :])
        return out

# Instantiate the LSTM model
input_size = 10
hidden_size = 20
num_layers = 2
output_size = 1
lstm_model = LSTMModel(input_size, hidden_size, num_layers, output_size)
```
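To clarify the tensor shapes the forward pass relies on (in particular why `out[:, -1, :]` selects the last time step), here is a minimal sketch that runs `nn.LSTM` directly on a dummy batch; the batch size of 32 and sequence length of 5 are arbitrary example values:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=2, batch_first=True)
x = torch.randn(32, 5, 10)   # dummy batch: (batch, seq_len, input_size)
out, (hn, cn) = lstm(x)

# out: hidden state of the top layer at every time step
print(out.shape)  # torch.Size([32, 5, 20])
# hn, cn: final hidden and cell state for each of the 2 layers
print(hn.shape)   # torch.Size([2, 32, 20])
```

So `out[:, -1, :]` has shape `(batch, hidden_size)`, which is exactly what the final `nn.Linear` layer expects.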
2. GRU in PyTorch:
```python
import torch
import torch.nn as nn

# Define the GRU model
class GRUModel(nn.Module):
    def __init__(self, input_size, hidden_size, num_layers, output_size):
        super(GRUModel, self).__init__()
        self.hidden_size = hidden_size
        self.num_layers = num_layers
        # batch_first=True: input shape is (batch, seq_len, input_size)
        self.gru = nn.GRU(input_size, hidden_size, num_layers, batch_first=True)
        self.fc = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        # Initialize the hidden state with zeros (the GRU has no cell state)
        h0 = torch.zeros(self.num_layers, x.size(0), self.hidden_size).to(x.device)
        out, _ = self.gru(x, h0)
        # Feed only the last time step's output to the linear layer
        out = self.fc(out[:, -1, :])
        return out

# Instantiate the GRU model
input_size = 10
hidden_size = 20
num_layers = 2
output_size = 1
gru_model = GRUModel(input_size, hidden_size, num_layers, output_size)
```
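One detail worth noting: passing an explicit zero `h0` as in the forward passes above is optional, because PyTorch defaults the initial hidden state to zeros when it is omitted. A minimal sketch with a small dummy batch (arbitrary example sizes) to verify both calls agree:

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
gru = nn.GRU(input_size=10, hidden_size=20, num_layers=2, batch_first=True)
x = torch.randn(4, 5, 10)   # dummy batch: (batch, seq_len, input_size)

out_default, _ = gru(x)     # h0 omitted: PyTorch uses zeros internally
h0 = torch.zeros(2, 4, 20)  # (num_layers, batch, hidden_size)
out_explicit, _ = gru(x, h0)

print(torch.allclose(out_default, out_explicit))  # True
```

The explicit form is still useful when you want to carry the hidden state across calls, e.g. for stateful processing of long sequences split into chunks.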