'ExponentialLR' object has no attribute 'zero_grad'
Date: 2023-09-27 22:05:58 · Views: 164
The error message you are seeing suggests that you are trying to call the zero_grad() method on an instance of the ExponentialLR class, which does not have this method. zero_grad() is a method defined in the torch.optim.Optimizer class, used to reset the gradients of all parameters to zero before backpropagation. Make sure you are calling zero_grad() on an instance of the Optimizer class, not the ExponentialLR class. Here is an example of how to use zero_grad() properly:
import torch
import torch.optim as optim

# model and loss are assumed to be defined elsewhere
optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

# set gradients to zero before computing new ones
optimizer.zero_grad()
# perform a forward pass, compute the loss, then backpropagate
loss.backward()
# update parameters
optimizer.step()
# update learning rate
scheduler.step()
In this example, zero_grad() is called on the optimizer instance, not the scheduler instance. Note that zero_grad() must run before loss.backward(), otherwise the gradients you just computed would be erased.
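To make the pattern above concrete, here is a minimal runnable sketch of a full training loop. The toy linear model, random data, and MSE loss are assumptions added for illustration, not part of the original answer; the optimizer/scheduler split is the point being demonstrated.

```python
import torch
import torch.nn as nn
import torch.optim as optim

# toy model and data, assumed for illustration only
model = nn.Linear(4, 1)
loss_fn = nn.MSELoss()
x = torch.randn(8, 4)
y = torch.randn(8, 1)

optimizer = optim.SGD(model.parameters(), lr=0.1)
scheduler = optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.9)

for epoch in range(3):
    optimizer.zero_grad()        # clear gradients on the *optimizer*, not the scheduler
    loss = loss_fn(model(x), y)  # forward pass and loss
    loss.backward()              # backpropagation
    optimizer.step()             # parameter update
    scheduler.step()             # per-epoch learning-rate decay

# after 3 epochs the learning rate has decayed to 0.1 * 0.9**3
print(scheduler.get_last_lr()[0])
```

Calling scheduler.step() once per epoch (after optimizer.step()) is the usual convention; the scheduler only adjusts the learning rate and never touches gradients, which is why it has no zero_grad() method.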