Questions for normalize_ir() #185
Hi Wei, in this case you are mutating the input itself. If the input were not mutated (as opposed to your example), the def and use of the mutated variable would be contained within the scope of the FX graph, and therefore we could do a graph rewrite to get rid of the mutation. This is what normalize_ir does for the majority (not all) of cases. Handling input mutation, however, requires a little more effort.
Somewhat related to this topic is
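To illustrate the distinction (a minimal sketch; the function names and shapes are illustrative, not taken from the issue):

```python
import torch

# Case 1: input mutation -- the in-place op writes into a graph input, so the
# effect is visible to the caller and cannot simply be rewritten away inside
# the graph. Per the comment above, this case needs extra handling.
def mutates_input(x):
    x.relu_()
    return x

# Case 2: local mutation -- the mutated tensor is created inside the function,
# so both its def and its use are contained in the FX graph, and the in-place
# op can be rewritten to its out-of-place form (relu_ -> relu).
def mutates_local(x):
    y = x + 1
    y.relu_()          # safe to rewrite as: y = torch.relu(y)
    return y
```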
@anijain2305 thanks for your explanation; it helps a lot with my concerns. Mutating the input is an extreme case that we rarely see. I tried a case that does not mutate the input and saw the expected behavior.
So, this is not AOT Autograd per se. In fact, in the case of AOT Autograd, we plan to first get the forward and backward graphs from the usual AOT tracing, and then call functionalization on top of those forward and backward graphs. Therefore, I don't see any reason why we cannot use functionalization for Dynamo-created FX graphs.
More details on the Functionalization and AOT Autograd integration are here: #88
@anijain2305 this PR looks great to me: https://github.com/pytorch/functorch/pull/703/files. I expect it to remove some mutations in the Dynamo-created FX graph, which will help us remove some blockers in a few models.
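As a rough sketch of what that could look like (assuming the transform from that PR is exposed as functorch.experimental.functionalize; the exact import path may differ):

```python
import torch
from functorch import make_fx
from functorch.experimental import functionalize  # assumed import path for the PR above

def f(x):
    y = x + 1
    y.relu_()          # in-place op on an intermediate tensor
    return y

# Tracing the functionalized callable should produce an FX graph that uses the
# out-of-place relu instead of relu_, which is what we want for Dynamo graphs.
gm = make_fx(functionalize(f))(torch.randn(3))
print(gm.graph)
```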
It looks like normalize_ir() has different behavior than I expected.
For example, I was hoping functionalization would replace in-place ops with their out-of-place counterparts, e.g. relu_ with relu.
I ran a simple test program and printed gm.graph, as follows:
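(The original program is not shown here; below is a hypothetical reconstruction of such a test, assuming normalize_ir can be imported from torchdynamo.optimizations.normalize and takes the graph module plus example inputs.)

```python
import torch
import torchdynamo
from torchdynamo.optimizations.normalize import normalize_ir  # assumed location/signature

def my_compiler(gm: torch.fx.GraphModule, example_inputs):
    gm = normalize_ir(gm, example_inputs)  # hoped to rewrite relu_ -> relu
    print(gm.graph)
    return gm.forward

def toy_example(x):
    x.relu_()          # in-place op on the graph input
    return x

with torchdynamo.optimize(my_compiler):
    toy_example(torch.randn(10))
```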
After adding the normalizer call in my_compiler, the resulting gm.graph does not change.
After more debugging, I found that n.meta["is_input_mutation"] is True for the relu node. Here is the code:
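(The original snippet is not reproduced here; the flag can be checked on the compiled graph roughly like this, as an illustrative sketch rather than the normalize_ir source.)

```python
# Inside my_compiler, after normalize_ir has run:
for n in gm.graph.nodes:
    if n.meta.get("is_input_mutation"):
        # Nodes flagged this way mutate a graph input, so the in-place op
        # (relu_ here) is left as-is instead of being rewritten to relu.
        print(n.format_node())
```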
So my question is,