Training does not decrease error given a custom loss function #9323
Unanswered
ajsanjoaquin asked this question in code help: CV
Replies: 0 comments
Hello,
I implemented the loss function from Section 2.1 of "Right for the Right Reasons" (Ross et al., 2017).
Training updates normally in vanilla PyTorch, but the loss does not decrease when I implement the same setup in PyTorch Lightning. Specifically, I pass a zero matrix as the annotation mask (the A term in the equation), which makes the custom loss equivalent to plain cross-entropy.
Here is a notebook that reproduces the error and shows that the same setup works in vanilla PyTorch. I'm hoping I just missed a step somewhere and that someone can point it out.
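For context, the loss from Section 2.1 of the paper is cross-entropy plus a "right reasons" penalty on input gradients masked by the annotation matrix A. Below is a minimal sketch of that loss in plain PyTorch; the function name `rrr_loss` and the argument names are my own, not from the notebook. With a zero mask the penalty term vanishes and the loss reduces to cross-entropy, as described above.

```python
import torch
import torch.nn.functional as F

def rrr_loss(model, x, y, A, lam=1.0):
    """Cross-entropy plus the Ross et al. (2017) input-gradient penalty.

    A is a binary mask marking input regions the model should NOT rely on;
    an all-zero A makes this equivalent to plain cross-entropy.
    """
    # The input must track gradients so we can differentiate w.r.t. it.
    x = x.clone().requires_grad_(True)
    logits = model(x)
    ce = F.cross_entropy(logits, y)

    # Gradient of the summed log-probabilities w.r.t. the input.
    log_probs = F.log_softmax(logits, dim=1)
    grads, = torch.autograd.grad(log_probs.sum(), x, create_graph=True)

    # Penalize explanation gradients in the masked (irrelevant) regions.
    reason = ((A * grads) ** 2).sum()
    return ce + lam * reason
```

Note the `create_graph=True` in `torch.autograd.grad`: it keeps the penalty term differentiable so it contributes to the parameter update in the subsequent `backward()` call.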
Thank you!