Composing scheduler #8768
Unanswered
csilvaab asked this question in code help: CV
Replies: 0 comments
Hi,
Since this is my first post, I would like to start by thanking you for making pytorch-lightning available to the public. It is a nice and well-engineered piece of software.
I have been looking for a way to compose two schedulers (one for the learning rate and one for the weight decay; I am using AdamW), but I cannot find one. The workaround I implemented is to return the optimizer and my custom ExponentialLR scheduler from configure_optimizers, and to step my custom ReduceLROnPlateau manually in the validation_epoch_end hook.
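To make it concrete, here is a simplified sketch of what I mean, using the stock torch schedulers as stand-ins for my custom ones (the model, hyperparameters, and the `val_loss` metric are just placeholders):

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl
from torch.optim.lr_scheduler import ExponentialLR, ReduceLROnPlateau


class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return nn.functional.mse_loss(self.layer(x), y)

    def validation_step(self, batch, batch_idx):
        x, y = batch
        return {"val_loss": nn.functional.mse_loss(self.layer(x), y)}

    def configure_optimizers(self):
        optimizer = torch.optim.AdamW(self.parameters(), lr=1e-3, weight_decay=1e-2)
        # Handed to Lightning and stepped automatically
        # (stand-in for my custom ExponentialLR).
        lr_scheduler = ExponentialLR(optimizer, gamma=0.95)
        # Kept on the module and stepped by hand in validation_epoch_end
        # (stand-in for my custom ReduceLROnPlateau that adjusts the weight decay).
        self.plateau_scheduler = ReduceLROnPlateau(optimizer, mode="min", patience=5)
        return [optimizer], [lr_scheduler]

    def validation_epoch_end(self, outputs):
        # Step the second scheduler manually on the aggregated validation metric.
        val_loss = torch.stack([o["val_loss"] for o in outputs]).mean()
        self.plateau_scheduler.step(val_loss)
```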
It is working, except that I have started to randomly get the error "RuntimeError: CUDA error: an illegal memory access was encountered". I am not sure whether I am breaking any Pytorch-Lightning abstraction that could cause such an error. Is there an easier, "official" way to do what I want?