Make custom ops differentiable #1314
Conversation
Make custom ops differentiable and replace autograd.Function. Use ops unconditionally. We may consider removing the extension module in a follow-up. The code path is tested by the existing tests for differentiability.
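To illustrate the shape of the change, here is a minimal Python sketch with hypothetical names (my_op, my_op_forward, my_op_backward; the real torchvision wrappers and op signatures differ): the Python-side autograd.Function that manually paired the extension module's forward and backward kernels is replaced by a plain call to the C++ op, which now registers its own autograd formula.

```python
import torch
import torchvision  # loads the compiled extension that registers the custom ops

# Before (sketch): a Python autograd.Function stitched the extension module's
# forward and backward kernels together by hand (hypothetical names).
class _MyOp(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        return torchvision._C.my_op_forward(input)  # hypothetical extension function

    @staticmethod
    def backward(ctx, grad_output):
        (input,) = ctx.saved_tensors
        return torchvision._C.my_op_backward(grad_output, input)  # hypothetical

# After (sketch): the C++ op carries its own backward (via the C++ autograd
# Function), so the Python wrapper is an unconditional call into the op and
# remains scriptable.
def my_op(input):
    return torch.ops.torchvision.my_op(input)  # hypothetical op name
```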
Wow, this is pretty clean, thanks a lot!
Could you also add a test showing that the gradients of a scripted function work as expected?
Removing the extension module in a follow-up sounds good to me.
I'm also ccing @ezyang, as this looks like the first example of how to use the C++ Function for autograd, and it looks great.
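A test along the lines requested above could look like the following sketch, which scripts the public roi_align wrapper and runs gradcheck through it. It assumes the wrapper is scriptable (as it is in torchvision); the tensor shapes, RoI values, and sampling parameters are illustrative, not taken from the test suite.

```python
import torch
from torchvision.ops import roi_align

# Scripting the wrapper routes the call through the registered custom op,
# so this exercises the op's autograd formula under TorchScript.
scripted = torch.jit.script(roi_align)

# gradcheck wants float64 inputs with requires_grad set.
x = torch.rand(1, 1, 10, 10, dtype=torch.float64, requires_grad=True)
# One RoI: (batch_index, x1, y1, x2, y2).
rois = torch.tensor([[0.0, 0.0, 0.0, 9.0, 9.0]], dtype=torch.float64)

def fn(x):
    # output_size=(5, 5), spatial_scale=1.0, sampling_ratio=2
    return scripted(x, rois, (5, 5), 1.0, 2)

assert torch.autograd.gradcheck(fn, (x,))
```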
@t-vi the test failures seem to be related
Which CI failure is that? I'm seeing a lot of …
Here is one example: https://travis-ci.org/pytorch/vision/jobs/582603380, line 954. I've re-run some of the CircleCI failures, which I think were still picking up the patch from before my fix.
@t-vi can you rebase your PR onto current master? This looks pretty good!
Codecov Report
@@            Coverage Diff             @@
##           master    #1314      +/-   ##
==========================================
+ Coverage   65.83%   65.84%    +0.01%
==========================================
  Files          75       75
  Lines        5824     5782       -42
  Branches      886      884        -2
==========================================
- Hits         3834     3807       -27
+ Misses       1725     1710       -15
  Partials      265      265
Continue to review full report at Codecov.
Awesome, thanks a lot!