make_module: First version #23288
Conversation
Example:

```python
mod = make_module(torch.add)
mod_quant = make_module(torch.add, quantized=True)
a = torch.tensor(1, dtype=torch.float)
qa = torch.quantize_linear(a, 1.0, 127, torch.quint8)
b = torch.tensor(2, dtype=torch.float)
qb = torch.quantize_linear(b, 1.0, 127, torch.quint8)
print("Types\t\t:", type(mod), type(mod_quant))
print("__repr__\t:", mod, mod_quant)
print("forward call\t:", mod(a, b))
print("forward call\t:", mod_quant(qa, qb, 1.0, 0))
```

Prints:
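A minimal sketch of what such a factory could look like, kept torch-free so it is self-contained; the names (`make_module`, the stand-in `Module` base) are illustrative, not the PR's actual implementation:

```python
import operator


class Module:
    """Stand-in for torch.nn.Module so the sketch runs on its own."""

    def __call__(self, *args, **kwargs):
        return self.forward(*args, **kwargs)


def make_module(op, quantized=False):
    """Wrap a functional op in a dynamically created Module subclass
    whose forward() simply dispatches to the op."""
    name = ("Quantized" if quantized else "") + op.__name__.capitalize()

    def forward(self, *args, **kwargs):
        return op(*args, **kwargs)

    def __repr__(self):
        return name + "()"

    # Build the class dynamically; the returned value is an instance.
    cls = type(name, (Module,), {"forward": forward, "__repr__": __repr__})
    return cls()


mod = make_module(operator.add)
print(type(mod).__name__)  # Add
print(mod(1, 2))           # 3
```

Returning an instance rather than the class mirrors the example above; the factory could equally return the class if callers want to instantiate it themselves.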
I can make the returned type to be
Please see suggestions
Differential Revision: [D16455390](https://our.internmc.facebook.com/intern/diff/D16455390)
First draft, will add more comments later.
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
My biggest comment is that we need to figure out how to make this work with TorchScript (since you want the resulting model to be scriptable and runnable without Python). Given that, codegen (or script-to-codegen) might be a more robust approach, provided you have a closed set of modules to cover, which appears to be the case.
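To illustrate the codegen idea suggested above: for a closed set of ops, one can emit module source from a template instead of creating classes dynamically, so the result is plain code a compiler such as TorchScript can consume. This is a hypothetical sketch using stdlib ops; the template, op table, and names are assumptions, not the PR's design:

```python
# Emit one module class per op from a string template, then exec the
# generated source. The generated code contains no dynamic class
# creation, which is what makes it friendly to ahead-of-time compilers.
TEMPLATE = '''\
class {name}Module:
    def forward(self, a, b):
        return {call}(a, b)
'''

# Illustrative closed set of ops (stand-ins for torch functionals).
OPS = {"Add": "operator.add", "Mul": "operator.mul"}


def generate_source():
    return "import operator\n" + "\n".join(
        TEMPLATE.format(name=name, call=call) for name, call in OPS.items()
    )


namespace = {}
exec(generate_source(), namespace)
print(namespace["AddModule"]().forward(2, 3))  # 5
```

In a real pipeline the generated source would be written to a file (or fed to `torch.jit.script`) rather than `exec`'d, but the shape of the approach is the same.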
Please see comments
LGTM, we can change to the base-class solution in future PRs.
@zafartahirov merged this pull request in 9c549df.
Stack from ghstack:
Differential Revision: D16455390