Closed
Description
🚀 Feature
Our end-to-end tests always compare the output of the fuser against PyTorch eager. PyTorch eager executes binary operations one after the other, whereas the fuser generates a single kernel that performs multiple operations as a whole. The FMA instructions used in the fused kernel give better accuracy, but they also cause the result to drift away from the eager result.
We would like a quick flag to disable FMA usage, just to avoid such numerical mismatches in functional tests.
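For context, an FMA computes `a * b + c` with a single rounding step, while eager mode rounds the intermediate product first, so the two results can legitimately differ in the last few bits. Below is a minimal sketch of the kind of functional test such a flag would enable. The environment variable name `PYTORCH_NVFUSER_DISABLE_FMA` is an assumption standing in for whatever kill switch gets added, and the test assumes a CUDA device with the fuser active for scripted functions.

```python
import os
import torch

# Hypothetical kill switch; the exact name/mechanism is an assumption, not an existing flag.
os.environ["PYTORCH_NVFUSER_DISABLE_FMA"] = "1"

def mul_add(a, b, c):
    # a * b + c is the canonical pattern the fuser may contract into an FMA.
    return a * b + c

a, b, c = (torch.randn(1024, device="cuda") for _ in range(3))

eager_out = mul_add(a, b, c)

# Script the function and warm it up so the fuser actually generates a kernel.
scripted = torch.jit.script(mul_add)
for _ in range(3):
    fuser_out = scripted(a, b, c)

# With FMA disabled, the fused kernel applies the same roundings as eager for
# this pattern, so the comparison can use tight (here zero) tolerances.
torch.testing.assert_close(fuser_out, eager_out, rtol=0, atol=0)
```

With FMA left enabled, the same test would need loosened tolerances that mask other potential codegen bugs, which is the motivation for a dedicated flag.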
Assigning myself to this.