PyTorch Learning Rate Schedulers
PyTorch provides several learning rate schedulers in `torch.optim.lr_scheduler`; the most common ones are summarized below, with short usage sketches after the list.
- StepLR: Multiplies the learning rate by `gamma` every `step_size` epochs.
- MultiStepLR: Multiplies the learning rate by `gamma` when the epoch count reaches each milestone in a user-provided list.
- ExponentialLR: Multiplies the learning rate by `gamma` every epoch, so the rate decays exponentially.
- CosineAnnealingLR: Anneals the learning rate along a cosine curve from the initial value down to `eta_min` over `T_max` epochs.
- ReduceLROnPlateau: Reduces the learning rate when a monitored metric (e.g., validation loss) stops improving.
- LambdaLR: Sets the learning rate to the initial value multiplied by a user-defined function of the epoch.
- CyclicLR: Cycles the learning rate between a lower and an upper bound (`base_lr` and `max_lr`).
- OneCycleLR: Follows the one-cycle policy: the learning rate rises to `max_lr` and then anneals back down over a single training run, which can speed up convergence.
- CosineAnnealingWarmRestarts: Applies cosine annealing but periodically restarts the schedule from the initial learning rate ("warm restarts").
- MultiplicativeLR: Multiplies the learning rate by a factor returned by a user-defined function at each epoch.
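
Most of these schedulers share the same calling pattern: construct the scheduler around an optimizer and call `scheduler.step()` once per epoch, after `optimizer.step()`. Below is a minimal sketch of that pattern using `StepLR`; the model, data, and hyperparameter values are illustrative placeholders, not recommendations.

```python
import torch
from torch import nn, optim

# Toy model and random data, just to make the sketch runnable.
model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# StepLR: multiply the LR by gamma every step_size epochs.
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=30, gamma=0.1)
# These alternatives use the same per-epoch calling pattern:
# optim.lr_scheduler.MultiStepLR(optimizer, milestones=[30, 80], gamma=0.1)
# optim.lr_scheduler.ExponentialLR(optimizer, gamma=0.95)
# optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=100)

criterion = nn.MSELoss()
x, y = torch.randn(64, 10), torch.randn(64, 1)

for epoch in range(100):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    scheduler.step()  # advance the schedule once per epoch, after optimizer.step()
```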
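
Two schedulers deviate from this pattern: `ReduceLROnPlateau` expects the monitored metric as an argument to `step()`, and `OneCycleLR` is stepped after every batch rather than every epoch, so it must know the total number of steps up front. A sketch of both, again with placeholder model and data:

```python
import torch
from torch import nn, optim

model = nn.Linear(10, 1)
criterion = nn.MSELoss()
x, y = torch.randn(64, 10), torch.randn(64, 1)

# ReduceLROnPlateau: step() takes the monitored metric, typically a
# validation loss, once per epoch.
optimizer = optim.SGD(model.parameters(), lr=0.1)
plateau = optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=10)
for epoch in range(5):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    plateau.step(loss.item())  # pass the metric being monitored

# OneCycleLR: step() is called after every batch; total steps
# (epochs * steps_per_epoch) must be declared at construction time.
optimizer = optim.SGD(model.parameters(), lr=0.01)
one_cycle = optim.lr_scheduler.OneCycleLR(
    optimizer, max_lr=0.1, epochs=5, steps_per_epoch=1)
for epoch in range(5):
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    one_cycle.step()  # one batch per "epoch" in this toy loop
```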
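
Finally, `LambdaLR` and `MultiplicativeLR` are driven by a user-supplied function of the epoch index. The difference is what the function's return value is applied to: `LambdaLR` multiplies the *initial* learning rate, while `MultiplicativeLR` multiplies the *current* one. The lambdas below are illustrative examples, not recommended values.

```python
import torch
from torch import nn, optim

model = nn.Linear(10, 1)
optimizer = optim.SGD(model.parameters(), lr=0.1)

# LambdaLR: lr = initial_lr * lr_lambda(epoch).
scheduler = optim.lr_scheduler.LambdaLR(
    optimizer, lr_lambda=lambda epoch: 0.95 ** epoch)
# MultiplicativeLR: lr = previous_lr * lr_lambda(epoch); the constant
# factor below reproduces the same exponential decay.
# scheduler = optim.lr_scheduler.MultiplicativeLR(
#     optimizer, lr_lambda=lambda epoch: 0.95)

for epoch in range(10):
    optimizer.step()   # training step omitted in this sketch
    scheduler.step()
    print(epoch, scheduler.get_last_lr())  # inspect the scheduled LR
```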