loss not decreasing in AutoModelForQuestionAnswering (Transformers) #8829
Unanswered
mmgxa asked this question in code help: NLP / ASR / TTS
Replies: 2 comments 5 replies
-
How does your
-
Dear @mmgxa, mind sharing a reproducible script with the trainer arguments you are using, and make sure to update to the latest PyTorch Lightning version? Best,
-
I am training my model on the SQuAD dataset. I get the same problem with either the `distilbert-base-uncased` or the `bert-base-uncased` checkpoint: the loss does not decrease.

In my `forward` method, I have this line. This should work for backpropagation, right?

Also, it gives the following warning. Is this affecting the loss/backprop?
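For context on the backprop question: when `start_positions` and `end_positions` are passed, `AutoModelForQuestionAnswering` computes its loss as the mean of two cross-entropies over the start and end logits. A toy sketch of that computation (the encoder output, head, and hyperparameters here are made up for illustration, not taken from the thread) shows the loss does carry a gradient and decreases under optimization:

```python
import torch
import torch.nn as nn

# Toy stand-ins: random "encoder" hidden states instead of a real BERT,
# and a fresh linear QA head producing start/end logits per token.
torch.manual_seed(0)
batch, seq_len, hidden = 4, 32, 16

encoder_out = torch.randn(batch, seq_len, hidden)  # pretend BERT output
qa_head = nn.Linear(hidden, 2)                     # 2 = start / end logit
start_positions = torch.randint(0, seq_len, (batch,))
end_positions = torch.randint(0, seq_len, (batch,))

opt = torch.optim.SGD(qa_head.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

losses = []
for _ in range(50):
    logits = qa_head(encoder_out)                  # (batch, seq_len, 2)
    start_logits, end_logits = logits.split(1, dim=-1)
    start_logits = start_logits.squeeze(-1)        # (batch, seq_len)
    end_logits = end_logits.squeeze(-1)
    # Same recipe the HF QA heads use: average the two cross-entropies.
    loss = (loss_fn(start_logits, start_positions)
            + loss_fn(end_logits, end_positions)) / 2
    opt.zero_grad()
    loss.backward()                                # loss has a grad_fn, so backprop works
    opt.step()
    losses.append(loss.item())
```

If a loss built this way in your own `forward` still plateaus, the usual suspects are the optimizer not receiving the head's parameters, or the loss being detached before `backward()`.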