Learning rate schedules API
All schedules subclass LearningRateSchedule and can be passed directly as the learning_rate argument of any Keras optimizer.

LearningRateSchedule: base class for serializable learning rate schedules
ExponentialDecay: multiplies the learning rate by a fixed decay rate once per decay period
PiecewiseConstantDecay: holds the learning rate constant within user-defined step boundaries
PolynomialDecay: decays from an initial to an end learning rate along a polynomial curve
InverseTimeDecay: divides the learning rate by a factor that grows linearly with the step count
CosineDecay: follows a cosine curve from the initial rate down to a floor, with optional linear warmup
CosineDecayRestarts: cosine decay with periodic warm restarts back toward the initial rate
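To make the behavior of two of the schedules above concrete, here is a minimal pure-Python sketch of the decay formulas as documented for ExponentialDecay and CosineDecay (without warmup); the default values chosen for the parameters are illustrative, not Keras defaults:

```python
import math

def exponential_decay(step, initial_lr=0.1, decay_steps=1000,
                      decay_rate=0.96, staircase=False):
    """lr(step) = initial_lr * decay_rate ** (step / decay_steps)."""
    exponent = step / decay_steps
    if staircase:
        # With staircase=True the rate drops in discrete jumps
        # instead of decaying continuously.
        exponent = math.floor(exponent)
    return initial_lr * decay_rate ** exponent

def cosine_decay(step, initial_lr=0.1, decay_steps=1000, alpha=0.0):
    """Cosine curve from initial_lr down to alpha * initial_lr."""
    t = min(step, decay_steps) / decay_steps
    cosine = 0.5 * (1.0 + math.cos(math.pi * t))
    return initial_lr * ((1.0 - alpha) * cosine + alpha)

print(exponential_decay(0))     # initial rate, no decay yet
print(exponential_decay(1000))  # ≈ 0.096 after one full decay period
print(cosine_decay(1000))       # 0.0: fully decayed when alpha=0
```

In actual training you would not call the formula yourself; you would construct a schedule object and hand it to an optimizer, e.g. `keras.optimizers.SGD(learning_rate=keras.optimizers.schedules.ExponentialDecay(0.1, decay_steps=1000, decay_rate=0.96))`, and the optimizer evaluates it at each step.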