[quant][fx] Only do reference module swapping for floating point fused modules #74231
[quant][fx] Only do reference module swapping for floating point fused modules (#74231)

Summary: Pull Request resolved: #74231. Add a check to make sure the weighted module we swap is actually a float fused module, since a reference fused module (e.g. the reference version of linear - relu) has the same fused type as the floating point linear - relu, while its linear submodule has a different type.

Test Plan: phabricator diff for now; can add a test case once we know exactly what the problem is.

Reviewed By: andrewor14
Differential Revision: D34888290
fbshipit-source-id: a7f53368a7c17f7d1a82afaa50d14d569b4923df
(cherry picked from commit 458dac9)
Stack from ghstack (oldest at bottom):
Summary:
Add a check to make sure the weighted module we swap is actually a float fused module.
A reference fused module (e.g. the reference version of linear - relu) has the same fused
container type as the floating point linear - relu, even though its linear submodule has a
different type, so checking the fused type alone is not enough.
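The failure mode above can be sketched without depending on the real torch.ao classes. The stand-in classes below are assumptions, not the actual PyTorch API: a fused `LinearReLU` container shared by both the float and reference variants, with the weighted submodule distinguishing the two. The point is that an `isinstance` check on the fused container passes for both, so the swap guard must also inspect the submodule's exact type:

```python
# Minimal stand-ins (hypothetical, not the real torch.ao modules):
# a float Linear, a reference Linear subclass, and a fused container
# that is the SAME type for both the float and reference variants.
class Linear: ...
class ReferenceLinear(Linear): ...
class ReLU: ...

class LinearReLU:
    """Fused linear-relu container, indexable like nn.Sequential."""
    def __init__(self, linear, relu):
        self._mods = [linear, relu]
    def __getitem__(self, i):
        return self._mods[i]

def is_float_fused_module(fused):
    # Checking the fused container type alone is not enough: both the
    # float and reference versions are LinearReLU instances. We must
    # also require the weighted submodule to be exactly the float type.
    return isinstance(fused, LinearReLU) and type(fused[0]) is Linear

float_fused = LinearReLU(Linear(), ReLU())
ref_fused = LinearReLU(ReferenceLinear(), ReLU())
assert is_float_fused_module(float_fused) is True
assert is_float_fused_module(ref_fused) is False
```

Note the use of `type(...) is Linear` rather than `isinstance`: an `isinstance` check would also accept the reference subclass, which is exactly the bug this PR guards against.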
Test Plan:
phabricator diff for now; can add a test case once we know exactly what the problem is.
Facebook:
see if this fixes the cogwheel test https://www.internalfb.com/diff/D34778506?dst_version_fbid=303538185178175&transaction_fbid=1332625957206219
Differential Revision: D34888290