PyTorch LSTM Multivariate Multi-Step Time Series Forecasting
To perform multivariate multi-step time series forecasting with PyTorch, you can use an LSTM model. Here is a simple example:
```python
import torch
import torch.nn as nn
import numpy as np

# Define the LSTM model
class LSTM(nn.Module):
    def __init__(self, input_size, hidden_size, n_steps_out, n_features):
        super(LSTM, self).__init__()
        self.hidden_size = hidden_size
        self.n_steps_out = n_steps_out
        self.n_features = n_features
        # batch_first=True so inputs are (batch, seq_len, n_features)
        self.lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
        # Map the last hidden state to all predicted steps at once
        self.fc = nn.Linear(hidden_size, n_steps_out * n_features)

    def forward(self, x):
        lstm_out, _ = self.lstm(x)         # (batch, seq_len, hidden_size)
        out = self.fc(lstm_out[:, -1, :])  # (batch, n_steps_out * n_features)
        return out.view(-1, self.n_steps_out, self.n_features)

# Build sliding-window input/output pairs
def generate_data(data, n_steps_in, n_steps_out):
    X, y = [], []
    for i in range(len(data) - n_steps_in - n_steps_out + 1):
        X.append(data[i:(i + n_steps_in), :])
        y.append(data[(i + n_steps_in):(i + n_steps_in + n_steps_out), :])
    return np.array(X), np.array(y)

# Prepare data: 100 time steps of 3 variables
data = np.random.randn(100, 3)
n_steps_in, n_steps_out = 10, 5
X, y = generate_data(data, n_steps_in, n_steps_out)
X = torch.from_numpy(X).float()   # (n_samples, n_steps_in, 3)
y = torch.from_numpy(y).float()   # (n_samples, n_steps_out, 3)

# Define the model, loss, and optimizer
model = LSTM(input_size=X.shape[-1], hidden_size=64,
             n_steps_out=n_steps_out, n_features=y.shape[-1])
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

# Train the model (full-batch for simplicity)
num_epochs = 1000
for epoch in range(num_epochs):
    optimizer.zero_grad()
    output = model(X)
    loss = criterion(output, y)
    loss.backward()
    optimizer.step()
    if (epoch + 1) % 100 == 0:
        print('Epoch [{}/{}], Loss: {:.4f}'.format(epoch + 1, num_epochs, loss.item()))

# Predict the next n_steps_out steps from the last input window
with torch.no_grad():
    x_test = torch.from_numpy(data[-n_steps_in:, :]).float().unsqueeze(0)
    preds = model(x_test)   # (1, n_steps_out, 3)
    print(preds)
```
In this example, we generate a random series with three variables and use the `generate_data` function to turn it into sliding windows of input and output sequences. The `LSTM` class maps the last hidden state to all n_steps_out × n_features predicted values at once (the direct multi-step strategy), and we train it with mean squared error loss and the Adam optimizer. Finally, we use the trained model to predict the steps that follow the last input window.
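On real data, the raw series is usually scaled before training and the predictions are mapped back afterwards, since variables with very different ranges make training harder. A minimal sketch using scikit-learn's `MinMaxScaler` (an assumption; any scaler works), applied before windowing:

```python
from sklearn.preprocessing import MinMaxScaler
import numpy as np

data = np.random.randn(100, 3)            # same toy series as above
scaler = MinMaxScaler()
data_scaled = scaler.fit_transform(data)  # scale each variable to [0, 1]

# ...build windows from data_scaled and train exactly as in the script above...

# Afterwards, map predictions back to the original units. preds_np stands in
# for model(x_test).squeeze(0).numpy(), with shape (n_steps_out, n_features).
preds_np = np.random.rand(5, 3)           # placeholder for real model output
preds_original = scaler.inverse_transform(preds_np)
print(preds_original.shape)               # (5, 3)
```

In a real pipeline the scaler should be fit on the training split only, so test-set statistics do not leak into training.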
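The script above uses the direct strategy, predicting all five steps in one shot. A common alternative is recursive forecasting: train the model for a single output step, then feed each prediction back into the input window. A minimal sketch, assuming a (hypothetical) variant of the model above trained with `n_steps_out=1`:

```python
import torch

def recursive_forecast(model, window, n_steps_out):
    """Predict n_steps_out steps by feeding each prediction back in.

    Assumes `model` returns (1, 1, n_features) per call and `window` holds
    the last n_steps_in observations with shape (1, n_steps_in, n_features).
    """
    preds = []
    with torch.no_grad():
        for _ in range(n_steps_out):
            step = model(window)           # (1, 1, n_features)
            preds.append(step.squeeze(1))  # (1, n_features)
            # Slide the window: drop the oldest step, append the prediction
            window = torch.cat([window[:, 1:, :], step], dim=1)
    return torch.stack(preds, dim=1)       # (1, n_steps_out, n_features)

# Usage (with the model above retrained for one-step output):
#   window = torch.from_numpy(data[-n_steps_in:, :]).float().unsqueeze(0)
#   preds = recursive_forecast(model, window, n_steps_out=5)
```

Recursive forecasting needs only a one-step model but accumulates error over the horizon, while the direct approach above avoids error feedback at the cost of a larger output layer.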