Optimizers - Keras
https://keras.io/api/optimizers

You can pass a learning rate schedule to an optimizer in place of a fixed learning rate:

```python
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-2,
    decay_steps=10000,
    decay_rate=0.9,
)
optimizer = keras.optimizers.SGD(learning_rate=lr_schedule)
```

Check out the learning rate schedule API documentation for a list of available schedules.
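As a minimal sketch of how the schedule behaves end to end (the model, data, and variable names below are illustrative, not from this page), a `LearningRateSchedule` is callable on the training step index and the resulting optimizer can be passed to `model.compile`:

```python
import numpy as np
import keras  # assumes standalone Keras; with tf.keras use `from tensorflow import keras`

# Exponential decay: at step s the learning rate is
# initial_learning_rate * decay_rate ** (s / decay_steps).
lr_schedule = keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-2,
    decay_steps=10000,
    decay_rate=0.9,
)

# The schedule object is callable on a step index.
print(float(lr_schedule(0)))      # 0.01
print(float(lr_schedule(10000)))  # 0.009

# Illustrative model; the optimizer carries the schedule into training,
# so the learning rate decays automatically as steps accumulate.
model = keras.Sequential([keras.layers.Dense(1)])
model.compile(
    optimizer=keras.optimizers.SGD(learning_rate=lr_schedule),
    loss="mse",
)
model.fit(np.random.rand(32, 4), np.random.rand(32, 1), epochs=1, verbose=0)
```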