Move dropout and alpha dropout to ATen #10384
Conversation
%65 : Double(1, 1000) = aten::addmm(%64, %60, %61, %21, %21), scope: AlexNet/Sequential[classifier]/Linear[6]
return (%65);
%49 : float = prim::Constant[value=0.5](), scope: AlexNet/Sequential[classifier]/Dropout[0]
%50 : Double(1!, 9216) = aten::dropout(%48, %49, %21), scope: AlexNet/Sequential[classifier]/Dropout[0]
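With dropout in ATen, the traced graph records a single `aten::dropout` node taking the input tensor, the drop probability, and the training flag, instead of an opaque autograd Function. As a rough illustration of the semantics that node carries, here is a plain-Python sketch of inverted dropout (a hypothetical helper, not ATen's kernel): each element is zeroed with probability `p` and survivors are scaled by `1/(1-p)` so the expected value is unchanged.

```python
import random

def dropout_sketch(values, p=0.5, training=True):
    """Inverted dropout on a flat list of floats (illustrative only).

    Zeroes each element with probability p and scales survivors by
    1/(1-p), so E[output] == input in expectation during training.
    """
    if not training or p == 0.0:
        return list(values)
    if p == 1.0:
        return [0.0 for _ in values]
    scale = 1.0 / (1.0 - p)
    return [v * scale if random.random() >= p else 0.0 for v in values]
```

In eval mode (`training=False`) the input passes through untouched, which is why the traced AlexNet graph above only emits `aten::dropout` nodes guarded by the training flag (`%21`).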
auto input_sizes = input.sizes();
AT_CHECK(input.dim() >= 2, "Feature dropout requires at least 2 dimensions in the input");
std::vector<int64_t> sizes;
sizes.reserve(input.dim());
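The C++ snippet above builds the shape of the Bernoulli noise tensor for feature dropout: noise is drawn once per (sample, channel) pair and broadcast over the remaining spatial dimensions, so entire feature maps are dropped together. A small Python sketch of that shape computation (hypothetical helper name; the broadcast-to-1 trailing dims are an assumption based on feature dropout's usual semantics):

```python
def feature_noise_shape(input_sizes):
    """Return the noise shape for feature (channel-wise) dropout.

    Keeps the batch and channel dimensions; every later dimension is
    set to 1 so the noise broadcasts across whole feature maps.
    """
    if len(input_sizes) < 2:
        raise ValueError(
            "Feature dropout requires at least 2 dimensions in the input")
    return list(input_sizes[:2]) + [1] * (len(input_sizes) - 2)
```

For a typical NCHW input of shape `[8, 3, 32, 32]` this yields `[8, 3, 1, 1]`, matching the intent of the `AT_CHECK` above.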
@@ -592,27 +594,47 @@ def adaptive_avg_pool3d(input, output_size):

 # Activation functions
 def dropout(input, p=0.5, training=False, inplace=False):
-    return _functions.dropout.Dropout.apply(input, p, training, inplace)
+    if p < 0 or p > 1:
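The diff replaces the autograd-Function dispatch with a thin Python wrapper that validates `p` before calling into ATen. A sketch of that validation step (the function name and error wording here are illustrative, not necessarily the exact text the PR lands):

```python
def check_dropout_p(p):
    """Reject dropout probabilities outside [0, 1] before dispatch."""
    if p < 0 or p > 1:
        raise ValueError(
            "dropout probability has to be between 0 and 1, "
            "but got {}".format(p))
    return p
```

Doing the check in the Python wrapper keeps the error user-facing and cheap, while the heavy lifting (noise generation and scaling) now lives in the ATen native function.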
apaszke has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: zdevito ezyang
Pull Request resolved: pytorch/pytorch#10384
Reviewed By: ezyang
Differential Revision: D9272583
Pulled By: apaszke
fbshipit-source-id: ed5d37b28ce9ff25800bbaa0daf066cfbf1f9921
@zdevito @ezyang
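The PR title also covers alpha dropout, which differs from standard dropout in that dropped units are set to `alpha' = -scale * alpha` (the SELU saturation value) rather than 0, and an affine transform then restores zero mean and unit variance, as described in Klambauer et al., "Self-Normalizing Neural Networks". A plain-Python sketch of those semantics under that assumption (not ATen's actual kernel):

```python
import random

# SELU constants from the self-normalizing networks paper
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def alpha_dropout_sketch(values, p=0.5, training=True):
    """Alpha dropout on a flat list of floats (illustrative only).

    Dropped units are set to alpha' = -SCALE * ALPHA, then the affine
    transform (a, b) keeps the mean and variance of SELU activations.
    """
    if not 0 <= p < 1:
        raise ValueError("p must be in [0, 1)")
    if not training or p == 0.0:
        return list(values)
    alpha_p = -ALPHA * SCALE
    keep = 1.0 - p
    a = (keep + alpha_p ** 2 * p * keep) ** -0.5
    b = -a * alpha_p * p
    return [a * (v if random.random() < keep else alpha_p) + b
            for v in values]
```

Because the dropped value and the affine correction depend only on `p` and the SELU constants, the whole operation maps naturally onto a single ATen native function, just like plain dropout.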