torch.no_grad
torch.no_grad is a context manager that disables gradient calculation. It is used during inference or evaluation, when gradients are not needed. With gradient tracking disabled, PyTorch does not build the computation graph or store intermediate activations for backpropagation, which reduces memory consumption and speeds up computation.
Example usage:
```
with torch.no_grad():
    outputs = model(inputs)
```
In this example, the forward pass of the model is computed without tracking gradients. This is useful for inference where we only need the output of the model and don't need to compute gradients for backpropagation.
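To make the effect concrete, here is a minimal, self-contained sketch. The small nn.Linear model and random inputs are illustrative assumptions, not part of the original example; it shows that tensors produced inside torch.no_grad() do not track gradients, while the same forward pass outside the context does:
```
import torch
import torch.nn as nn

# Toy model and batch, purely for illustration.
model = nn.Linear(4, 2)
inputs = torch.randn(3, 4)

# Inside no_grad, no computation graph is built for the forward pass.
with torch.no_grad():
    outputs = model(inputs)
print(outputs.requires_grad)  # False: calling outputs.backward() would raise an error

# Outside no_grad, the same forward pass tracks gradients as usual.
outputs = model(inputs)
print(outputs.requires_grad)  # True
```
For evaluation loops it is common to combine this with model.eval(), which switches layers such as dropout and batch norm into inference mode; no_grad itself only controls gradient tracking.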