Description
Describe the feature and the current behavior/state.
Callbacks such as `EarlyStopping` and `ReduceLROnPlateau` monitor a specified metric (usually the validation loss) to trigger their functionality. However, the metric value is often quite noisy across training epochs, especially with small datasets or with models that have stochastic layers (such as models with dropout, or variational autoencoders). The resulting outliers can cause `EarlyStopping` and `ReduceLROnPlateau` to trigger prematurely, even with a large patience. This is, in fact, why TensorBoard already implements smoothing of the displayed metrics.
My feature request is a metric wrapper that can smooth/filter any other metric, so that `EarlyStopping` and `ReduceLROnPlateau` can monitor the smoothed metric instead. A possible implementation can be found in this Stack Overflow answer. What do you think about this?
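To make the idea concrete, here is a minimal sketch of the kind of wrapper I have in mind, using an exponential moving average (EMA) similar to TensorBoard's smoothing slider. The `MetricSmoother` name, the `alpha` parameter, and the `smoothed_val_loss` key are all illustrative, not part of any existing API:

```python
class MetricSmoother:
    """Exponential-moving-average smoother for a noisy scalar metric.

    Hypothetical sketch: alpha is the weight of the newest observation
    (0 < alpha <= 1); smaller alpha means heavier smoothing.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self._ema = None  # running EMA, initialised on first update

    def update(self, value):
        if self._ema is None:
            # Seed the EMA with the first observed value.
            self._ema = value
        else:
            # Standard EMA update: blend new value with running average.
            self._ema = self.alpha * value + (1 - self.alpha) * self._ema
        return self._ema


# A noisy validation-loss trace with an outlier spike at epoch 3.
noisy = [1.00, 0.90, 2.50, 0.80, 0.70]
smoother = MetricSmoother(alpha=0.3)
smoothed = [smoother.update(v) for v in noisy]
# The smoothed trace damps the 2.50 spike, so a patience-based callback
# monitoring it would not see a large spurious jump.
```

In Keras this could be exposed as an auxiliary callback that, in `on_epoch_end`, writes something like `logs["smoothed_val_loss"] = smoother.update(logs["val_loss"])`, so that `EarlyStopping(monitor="smoothed_val_loss")` works unchanged. One caveat: the smoothing callback would have to run before `EarlyStopping`/`ReduceLROnPlateau` in the callbacks list so the key exists when they read `logs`.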
Relevant information
- Are you willing to contribute it (yes/no): yes, but I am new to contributing so I will need some help
If you wish to contribute, then read the requirements for new contributions in CONTRIBUTING.md
- Are you willing to maintain it going forward? (yes/no): yes
- Is there a relevant academic paper? (if so, where): no
- Does the relevant academic paper exceed 50 citations? (yes/no):
- Is there already an implementation in another framework? (if so, where): no
- Was it part of tf.contrib? (if so, where): no
Which API type would this fall under (layer, metric, optimizer, etc.)
Metric, although it needs an auxiliary callback.
Who will benefit with this feature?
Anyone using the `EarlyStopping` or `ReduceLROnPlateau` callbacks who wants to avoid premature or false triggers.
Any other info.