How to adjust the learning rate in PyTorch?
In PyTorch, there are several ways to adjust the learning rate:
- Automatically adjust the learning rate with a scheduler from the torch.optim.lr_scheduler module. Several scheduling strategies are available, such as StepLR, ReduceLROnPlateau, and CosineAnnealingLR. Call the scheduler's step method at the end of each epoch (or each batch, depending on the scheduler) to update the learning rate. The StepLR example below decays the rate by a factor of gamma every step_size epochs; a ReduceLROnPlateau sketch follows it.
import torch.optim as optim
from torch.optim.lr_scheduler import StepLR

optimizer = optim.SGD(model.parameters(), lr=0.1)
# Multiply the learning rate by gamma=0.1 every step_size=30 epochs
scheduler = StepLR(optimizer, step_size=30, gamma=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    # Update the learning rate once per epoch
    scheduler.step()
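The scheduler strategies mentioned above differ in when they change the rate; ReduceLROnPlateau, for example, lowers it only when a monitored metric stops improving rather than on a fixed epoch schedule. A minimal sketch, assuming a validation loss val_loss is computed each epoch (the model and the loss computation are placeholders):

from torch.optim.lr_scheduler import ReduceLROnPlateau

optimizer = optim.SGD(model.parameters(), lr=0.1)
# Halve the learning rate if the monitored metric has not improved for 5 epochs
scheduler = ReduceLROnPlateau(optimizer, mode='min', factor=0.5, patience=5)

for epoch in range(num_epochs):
    # Train the model and compute a validation loss (placeholder)
    val_loss = ...
    # Unlike StepLR, ReduceLROnPlateau takes the monitored metric in step()
    scheduler.step(val_loss)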
- Manually adjust the learning rate. You can change the learning rate yourself during training as needed, for example by setting a new value at a specific epoch or when some condition is met; a small helper that wraps this pattern is sketched after the example.
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    # Drop the learning rate to 0.01 at epoch 30
    if epoch == 30:
        for param_group in optimizer.param_groups:
            param_group['lr'] = 0.01
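The manual pattern above is often wrapped in a small helper function so the adjustment logic stays out of the training loop. A minimal sketch; the name adjust_learning_rate and the epoch-to-rate mapping are illustrative choices, not part of the PyTorch API:

def adjust_learning_rate(optimizer, epoch):
    # Illustrative step schedule: 0.1 until epoch 30, 0.01 until epoch 60, then 0.001
    if epoch < 30:
        lr = 0.1
    elif epoch < 60:
        lr = 0.01
    else:
        lr = 0.001
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr

for epoch in range(num_epochs):
    adjust_learning_rate(optimizer, epoch)
    # Train the model
    ...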
- Adjust the learning rate by modifying optimizer.param_groups from the torch.optim module. Each entry in optimizer.param_groups is a dict whose 'lr' key holds the current learning rate for that parameter group, so updating that key changes the rate used on subsequent optimizer steps. The example below scales the rate by 0.1 every 10 epochs; a sketch for reading back the current rate follows it.
optimizer = optim.SGD(model.parameters(), lr=0.1)

for epoch in range(num_epochs):
    # Train the model
    ...
    # Scale the learning rate by 0.1 every 10 epochs (skip epoch 0)
    if epoch > 0 and epoch % 10 == 0:
        for param_group in optimizer.param_groups:
            param_group['lr'] *= 0.1
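Whichever method is used, it can be useful to read back the learning rate actually in effect, for example for logging. A minimal sketch, assuming the optimizer and a StepLR-style scheduler from the examples above:

# The current learning rate of each parameter group is stored under 'lr'
current_lr = optimizer.param_groups[0]['lr']
print(f"current learning rate: {current_lr}")

# Schedulers such as StepLR also expose get_last_lr(), which returns the most
# recently computed learning rate for each parameter group
print(scheduler.get_last_lr())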
The above are several common ways to adjust the learning rate; which one to use depends on the training setup and how the learning rate should change over the course of training a neural network.