Saving the training loss in PyTorch: a code demonstration
Sure! Here's some example code that computes the average loss for each training epoch and appends it to a text file:
```
import torch
from torch.utils.data import DataLoader
from torch.nn.utils import clip_grad_norm_
from my_dataset import MyDataset
from my_model import MyModel

# Initialize dataset and data loader
dataset = MyDataset()
dataloader = DataLoader(dataset, batch_size=64, shuffle=True)

# Initialize model and optimizer
model = MyModel()
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)

# Training loop
num_epochs = 10
for epoch in range(num_epochs):
    epoch_loss = 0.0
    # Iterate over batches
    for batch in dataloader:
        # Zero gradients
        optimizer.zero_grad()
        # Forward pass
        inputs, targets = batch
        outputs = model(inputs)
        loss = torch.nn.functional.cross_entropy(outputs, targets)
        # Backward pass, with gradient clipping
        loss.backward()
        clip_grad_norm_(model.parameters(), max_norm=1)
        # Update parameters
        optimizer.step()
        # Accumulate the batch loss into the running epoch total
        epoch_loss += loss.item()
    # Print and save the average loss for this epoch
    avg_loss = epoch_loss / len(dataloader)
    print(f"Epoch {epoch}: Average Loss = {avg_loss}")
    with open("loss.txt", "a") as f:
        f.write(f"{avg_loss}\n")
```
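Since each epoch's average loss ends up on its own line in `loss.txt`, you can read the file back later to inspect the training curve. Here is a minimal sketch, assuming matplotlib is installed (the output filename `loss_curve.png` is just a placeholder):

```
import matplotlib.pyplot as plt

# Read one average loss per line, as written by the training loop above
with open("loss.txt") as f:
    losses = [float(line) for line in f if line.strip()]

plt.plot(range(len(losses)), losses, marker="o")
plt.xlabel("Epoch")
plt.ylabel("Average loss")
plt.title("Training loss per epoch")
plt.savefig("loss_curve.png")
```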
The training loop assumes that you have already defined the `MyDataset` and `MyModel` classes (which handle loading your data and defining your model architecture, respectively); a minimal sketch of both follows below. Note that opening `loss.txt` in append mode (`"a"`) creates the file automatically if it does not already exist, so no setup is required; it also means losses from earlier runs accumulate in the same file unless you delete it first.
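For completeness, here is one way `my_dataset.py` and `my_model.py` could be stubbed out so the example runs end to end. The random data, feature size, and class count are placeholder assumptions for illustration, not part of the original answer:

```
import torch
from torch import nn
from torch.utils.data import Dataset

class MyDataset(Dataset):
    """Toy dataset: random features with integer class labels (placeholder data)."""
    def __init__(self, num_samples=1000, num_features=20, num_classes=5):
        self.x = torch.randn(num_samples, num_features)
        self.y = torch.randint(0, num_classes, (num_samples,))

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

class MyModel(nn.Module):
    """Small MLP classifier; the output size matches the number of classes,
    as required by cross_entropy in the training loop."""
    def __init__(self, num_features=20, num_classes=5):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(num_features, 64),
            nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x):
        return self.net(x)
```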