tfa.optimizers.AdamW | TensorFlow Addons
This optimizer can also be instantiated as

```python
extend_with_decoupled_weight_decay(tf.keras.optimizers.Adam,
                                   weight_decay=weight_decay)
```

Note: when applying a decay to the learning rate, be sure to manually apply the decay to the `weight_decay` as well. For example:

```python
step = tf.Variable(0, trainable=False)
schedule = tf.optimizers.schedules ...
```
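The example above is cut off mid-line after `tf.optimizers.schedules`. As a minimal sketch of how it might continue, the snippet below drives both the learning rate and the weight decay from the same schedule, assuming a `PiecewiseConstantDecay` schedule with illustrative boundary and value constants (the schedule choice and numbers are assumptions, not from the original snippet):

```python
import tensorflow as tf
import tensorflow_addons as tfa

# Step counter the schedule reads from.
step = tf.Variable(0, trainable=False)

# Assumed schedule and constants for illustration: lr/wd drop by 10x
# at steps 10000 and 15000.
schedule = tf.optimizers.schedules.PiecewiseConstantDecay(
    [10000, 15000], [1e-0, 1e-1, 1e-2])

# Both hyperparameters are passed as callables so the decayed values
# are re-read each time the optimizer uses them, keeping the weight
# decay coupled to the learning-rate decay as the note above advises.
lr = lambda: 1e-1 * schedule(step)
wd = lambda: 1e-4 * schedule(step)

optimizer = tfa.optimizers.AdamW(learning_rate=lr, weight_decay=wd)
```

Passing callables rather than fixed floats means that as `step` advances during training, both hyperparameters decay in lockstep; a constant `weight_decay` alongside a scheduled learning rate would break the coupling the note warns about.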