Packaging learning rate schedules with addons? #1781
Comments
cc @tensorflow/sig-addons-maintainers
That seems like a good fit for Addons. We already have some learning rate schedulers. We would just need a codeowner, though.
How was this usage identified? Model Garden? GitHub third_party repos?
A combination of a GitHub search and a search through our internal codebases. GitHub produces enough search results that my estimates could be wrong, but for NoisyLinearCosineDecay and LinearCosineDecay I seemed to find only forks of TensorFlow. However, GitHub does show code results for CosineDecay that aren't just forks of TensorFlow core.
I think you also need to search for something like
TensorFlow Addons is transitioning to a minimal maintenance and release mode. New features will not be added to this repository. For more information, please see our public messaging on this decision. Please consider sending feature requests / contributions to other repositories in the TF community with similar charters to TFA.
We have several experimental learning rate schedules in TensorFlow/Keras currently:
https://www.tensorflow.org/api_docs/python/tf/keras/experimental/CosineDecay
tf.keras.experimental.CosineDecay
tf.keras.experimental.CosineDecayRestarts
tf.keras.experimental.LinearCosineDecay
tf.keras.experimental.NoisyLinearCosineDecay
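For context on what these schedules compute: below is a minimal sketch of the cosine decay formula as described in the CosineDecay docs. The function name and signature here are illustrative, not the actual TF API; the real class is constructed once and called with a step.

```python
import math

def cosine_decay(initial_lr, step, decay_steps, alpha=0.0):
    """Cosine-annealed learning rate, a sketch of the formula the
    CosineDecay schedule documents: the rate follows half a cosine
    wave from initial_lr down to alpha * initial_lr over decay_steps."""
    step = min(step, decay_steps)  # hold at the floor once decay finishes
    cosine = 0.5 * (1.0 + math.cos(math.pi * step / decay_steps))
    decayed = (1.0 - alpha) * cosine + alpha
    return initial_lr * decayed

# Full rate at step 0, half the rate at the midpoint,
# and alpha * initial_lr (here 0) once decay_steps is reached.
print(cosine_decay(0.1, 0, 1000))
print(cosine_decay(0.1, 500, 1000))
print(cosine_decay(0.1, 1000, 1000))
```

CosineDecayRestarts repeats this curve in cycles (restarting the cosine from the top), which is what makes it useful for warm-restart training regimes.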
These have been marked experimental for several releases, but have seen little to no usage (zero for LinearCosineDecay/NoisyLinearCosineDecay, and only a handful of usages for CosineDecay/CosineDecayRestarts).
So, we're thinking of deprecating these in the next TF release, and dropping them entirely in the release after that.
Would TensorFlow Addons be interested in packaging learning rate schedules that have some users, but not quite enough to include as part of TensorFlow core?
E.g. CosineDecay / cosine decay with restarts in this case. As I've found zero users of LinearCosineDecay/NoisyLinearCosineDecay, it's not clear that they should be included in Addons either.