Prepare THNN/THCUNN for first class scalars. #10023
Conversation
I previously did some transformations, e.g. _nDimension/_dim -> nDimensionLegacyAll and nDimension -> nDimensionLegacyNoScalars. But those renames didn't touch dim(), which needs to be updated to support scalars. Instead of doing another (ugly) wholesale move, I audited the call sites and updated the cases that could be size 1.
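For reviewers who haven't internalized the legacy naming, here is a minimal, self-contained sketch of what the three dimension conventions mean, assuming my reading of the helpers in aten/src/TH/THTensor.hpp is right. The toy Tensor type and the free functions below are illustrative stand-ins, not the actual TH declarations:

```cpp
// Toy model of the legacy dimension accessors' semantics -- a sketch based
// on my reading of aten/src/TH/THTensor.hpp, not the verbatim PyTorch code.
#include <cassert>
#include <cstdint>
#include <vector>

struct Tensor {
  std::vector<int64_t> sizes;  // empty vector => 0-dim scalar
  int64_t dim() const { return static_cast<int64_t>(sizes.size()); }
  int64_t size(int64_t d) const { return sizes[d]; }
  // "Empty" in the TH sense: zero elements, i.e. some size is 0.
  // A 0-dim scalar has one element and is NOT empty.
  bool is_empty() const {
    for (int64_t s : sizes) if (s == 0) return true;
    return false;
  }
};

// Treats a 0-dim scalar as 1-D; otherwise reports the true dim().
int64_t nDimensionLegacyNoScalars(const Tensor& t) {
  return t.dim() == 0 ? 1 : t.dim();
}

// Old TH convention on top of that: empty tensors report 0 dimensions.
int64_t nDimensionLegacyAll(const Tensor& t) {
  if (t.is_empty()) return 0;
  return t.dim() == 0 ? 1 : t.dim();
}

// Size along d, treating a scalar as a 1-element 1-D tensor.
int64_t sizeLegacyNoScalars(const Tensor& t, int64_t d) {
  return t.dim() == 0 ? 1 : t.size(d);
}

int main() {
  Tensor scalar;     // sizes = {}  -> 0-dim scalar
  Tensor vec1{{1}};  // sizes = {1} -> 1-D with one element
  assert(scalar.dim() == 0 && vec1.dim() == 1);  // dim() tells them apart
  assert(nDimensionLegacyNoScalars(scalar) == 1);
  assert(nDimensionLegacyAll(scalar) == 1);
  assert(sizeLegacyNoScalars(scalar, 0) == 1);
}
```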
gchanan has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
@pytorchbot retest this please.
```diff
@@ -4,17 +4,17 @@
 static bool THNN_(checkInput)(THCTensor* t)
 {
-  return !t->is_empty() && THTensor_nDimensionLegacyAll(t) == 2 && t->size(1) == 3;
+  return !t->is_empty() && t->dim() == 2 && t->size(1) == 3;
 }

 static bool THNN_(checkSize1D)(THCTensor* t, int64_t size0)
 {
-  return !t->is_empty() && THTensor_nDimensionLegacyAll(t) == 1 && t->size(0) == size0;
+  return !t->is_empty() && THTensor_nDimensionLegacyNoScalars(t) == 1 && THTensor_sizeLegacyNoScalars(t, 0) == size0;
 }
```
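One possible reading of why the two hunks differ (my interpretation, not stated in the PR): checkInput requires exactly 2 dimensions, where dim() and the legacy helper agree once empty tensors are excluded, so plain dim() is a behavior-preserving modernization there. checkSize1D, by contrast, must keep accepting a 0-dim scalar wherever a 1-element 1-D tensor used to be accepted, and plain t->size(0) would be out of range for a scalar. Continuing the toy model sketched above:

```cpp
// Assuming the toy helpers above: a 0-dim scalar passes the new
// checkSize1D-style test for size0 == 1, which a plain dim() == 1
// plus size(0) check would reject (size(0) is out of range at 0-dim).
Tensor scalar;  // 0-dim
bool ok = !scalar.is_empty()
          && nDimensionLegacyNoScalars(scalar) == 1
          && sizeLegacyNoScalars(scalar, 0) == 1;  // true
```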
@pytorchbot retest this please.
Summary: I previously did some transformations, e.g. _nDimension/_dim -> nDimensionLegacyAll and nDimension -> nDimensionLegacyNoScalars. But those renames didn't touch dim(), which needs to be updated to support scalars. Instead of doing another (ugly) wholesale move, I audited the call sites and updated the cases that could be size 1. Pull Request resolved: pytorch/pytorch#10023 Differential Revision: D9068996 Pulled By: gchanan fbshipit-source-id: c63820767dd1496e908a5a96c34968482193f2c5