```
def train(dataloader, model, loss_fn, optimizer):
    size = len(dataloader.dataset)
    model.train()
    for batch, (X, y) in enumerate(dataloader):
        X, y = X.to(device), y.to(device)
        y = y.long()

        # Compute prediction error
        pred = model(X)
        loss = loss_fn(pred, y)

        # Backpropagation
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch % 50 == 0:
            loss, current = loss.item(), (batch + 1) * len(X)
            print(f"loss: {loss:>7f} [{current:>5d}/{size:>5d}]")
    return Losslist

def test(dataloader, model, loss_fn):
    size = len(dataloader.dataset)
    num_batches = len(dataloader)
    model.eval()
    test_loss, correct = 0, 0
    with torch.no_grad():
        for X, y in dataloader:
            X, y = X.to(device), y.to(device)
            y = y.long()
            pred = model(X)
            test_loss += loss_fn(pred, y).item()
            correct += (pred.argmax(1) == y).type(torch.float).sum().item()
    test_loss /= num_batches
    correct /= size
    print(f"Test Error: \n Accuracy: {(100*correct):>0.1f}%, Avg loss: {test_loss:>8f} \n")
    return LossList, acclist
```
How should this code be modified?
If you want this code to record the loss and accuracy for each epoch, you need to make the following modifications:
1. Define two lists, LossList and AccList, in the train and test functions to record each epoch's loss and accuracy. They can be defined at the start of each function, for example:
```
def train(dataloader, model, loss_fn, optimizer):
    LossList = []
    ...

def test(dataloader, model, loss_fn):
    LossList = []
    AccList = []
    ...
```
2. In the train loop, where the current loss and the number of samples processed are printed every 50 batches, also append the loss to LossList. Note that loss has already been converted to a Python float by loss.item() at this point, so the list stores plain numbers rather than tensors attached to the autograd graph. For example:
```
if batch % 50 == 0:
    loss, current = loss.item(), (batch + 1) * len(X)
    print(f"loss: {loss:>7f} [{current:>5d}/{size:>5d}]")
    LossList.append(loss)
```
3. In the test function, accumulate each batch's loss and correct-prediction count inside the loop, then append the averaged loss and accuracy to the corresponding lists once the loop finishes, for example:
```
for X, y in dataloader:
    ...
    test_loss += loss_fn(pred, y).item()
    correct += (pred.argmax(1) == y).type(torch.float).sum().item()
test_loss /= num_batches
correct /= size
LossList.append(test_loss)
AccList.append(correct)
```
4. Finally, add LossList and AccList to the return values of the train and test functions so they can be used for later analysis and visualization, for example:
```
return LossList
...
return LossList, AccList
```
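Putting the four changes together, the complete functions might look like the sketch below. This assumes, as in the original snippet, that torch is imported and device is defined elsewhere:
```
def train(dataloader, model, loss_fn, optimizer):
    size = len(dataloader.dataset)
    model.train()
    LossList = []  # losses logged every 50 batches within this epoch
    for batch, (X, y) in enumerate(dataloader):
        X, y = X.to(device), y.to(device)
        y = y.long()

        # Compute prediction error
        pred = model(X)
        loss = loss_fn(pred, y)

        # Backpropagation
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

        if batch % 50 == 0:
            loss, current = loss.item(), (batch + 1) * len(X)
            print(f"loss: {loss:>7f} [{current:>5d}/{size:>5d}]")
            LossList.append(loss)
    return LossList

def test(dataloader, model, loss_fn):
    size = len(dataloader.dataset)
    num_batches = len(dataloader)
    model.eval()
    test_loss, correct = 0, 0
    LossList, AccList = [], []
    with torch.no_grad():
        for X, y in dataloader:
            X, y = X.to(device), y.to(device)
            y = y.long()
            pred = model(X)
            test_loss += loss_fn(pred, y).item()
            correct += (pred.argmax(1) == y).type(torch.float).sum().item()
    # Record the averaged loss and accuracy for this epoch
    test_loss /= num_batches
    correct /= size
    print(f"Test Error: \n Accuracy: {(100*correct):>0.1f}%, Avg loss: {test_loss:>8f} \n")
    LossList.append(test_loss)
    AccList.append(correct)
    return LossList, AccList
```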
With these changes, the train and test functions record the losses and accuracy for each epoch in LossList and AccList. Note that because the lists are re-created on every call, each call returns only that epoch's values; collect the returned lists at the end of each epoch and then use them for visualization or other analysis.
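As a minimal usage sketch (assuming model, loss_fn, optimizer, train_dataloader, and test_dataloader are already constructed, and that matplotlib is available for plotting), the returned lists could be collected across epochs like this:
```
epochs = 10
train_losses, test_losses, test_accs = [], [], []
for epoch in range(epochs):
    print(f"Epoch {epoch + 1}\n-------------------------------")
    # Each call returns that epoch's recorded values; accumulate them here
    train_losses.extend(train(train_dataloader, model, loss_fn, optimizer))
    epoch_loss, epoch_acc = test(test_dataloader, model, loss_fn)
    test_losses.extend(epoch_loss)
    test_accs.extend(epoch_acc)

# Plot the recorded test accuracy per epoch
import matplotlib.pyplot as plt
plt.plot(test_accs)
plt.xlabel("epoch")
plt.ylabel("accuracy")
plt.show()
```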