Description
We currently have several experimental learning rate schedules in TensorFlow/Keras:
https://www.tensorflow.org/api_docs/python/tf/keras/experimental/CosineDecay
tf.keras.experimental.CosineDecay
tf.keras.experimental.CosineDecayRestarts
tf.keras.experimental.LinearCosineDecay
tf.keras.experimental.NoisyLinearCosineDecay
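For context, these are ordinary `LearningRateSchedule` objects that plug directly into a Keras optimizer. A minimal usage sketch (assuming TF 2.x and the `tf.keras.experimental` path linked above) looks roughly like:

```python
import tensorflow as tf

# Cosine decay from 0.1 toward zero over 10,000 steps
# (signature: initial_learning_rate, decay_steps, alpha=0.0).
schedule = tf.keras.experimental.CosineDecay(
    initial_learning_rate=0.1, decay_steps=10_000)

# Schedules can be passed directly as the learning rate of any Keras optimizer.
optimizer = tf.keras.optimizers.SGD(learning_rate=schedule)
```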
These have been marked experimental for several releases but have seen little to no usage (zero for LinearCosineDecay/NoisyLinearCosineDecay, and only a handful of usages for CosineDecay/CosineDecayRestarts).
So, we're thinking of deprecating these in the next TF release, and dropping them entirely in the release after that.
Would TensorFlow Addons be interested in packaging learning rate schedules that have some users, but not quite enough to justify keeping them in TensorFlow core?
In this case that would be CosineDecay and CosineDecayRestarts. Since I've found zero users of LinearCosineDecay/NoisyLinearCosineDecay, it's not clear that they should be included in Addons either. A rough sketch of what a ported schedule could look like is below.
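If Addons did pick these up, one possible shape (just a sketch, not a proposal for a specific Addons API or namespace) would be a plain subclass of `tf.keras.optimizers.schedules.LearningRateSchedule` implementing the same cosine formula:

```python
import math
import tensorflow as tf


class CosineDecay(tf.keras.optimizers.schedules.LearningRateSchedule):
    """Sketch of the cosine decay schedule as it might live in Addons."""

    def __init__(self, initial_learning_rate, decay_steps, alpha=0.0, name=None):
        super().__init__()
        self.initial_learning_rate = initial_learning_rate
        self.decay_steps = decay_steps
        self.alpha = alpha  # floor as a fraction of the initial learning rate
        self.name = name

    def __call__(self, step):
        with tf.name_scope(self.name or "CosineDecay"):
            initial_lr = tf.convert_to_tensor(self.initial_learning_rate)
            dtype = initial_lr.dtype
            decay_steps = tf.cast(self.decay_steps, dtype)
            # Fraction of the decay completed, clamped to [0, 1].
            completed = tf.minimum(tf.cast(step, dtype), decay_steps) / decay_steps
            cosine = 0.5 * (1.0 + tf.cos(tf.constant(math.pi, dtype=dtype) * completed))
            decayed = (1.0 - self.alpha) * cosine + self.alpha
            return initial_lr * decayed

    def get_config(self):
        return {
            "initial_learning_rate": self.initial_learning_rate,
            "decay_steps": self.decay_steps,
            "alpha": self.alpha,
            "name": self.name,
        }
```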