
[batch-rule] householder_product #322

Conversation

@kshitij12345 (Contributor) commented Dec 8, 2021

Reference: #240

@@ -160,6 +178,7 @@ TORCH_LIBRARY_IMPL(aten, FT_BATCHED_KEY, m) {
   VMAP_SUPPORT("mv", mv_batch_rule);
   VMAP_SUPPORT("mm", mm_batch_rule);
   m.impl("linear", linear_decomp);
+  m.impl("orgqr", orgqr_decomp);
@kshitij12345 (Contributor, Author) commented on the diff:

Without this we get:
RuntimeError: aten::orgqr hit the vmap fallback which is currently disabled

But orgqr is a composite operator 🤔.
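
For context, torch.orgqr is documented as an alias of torch.linalg.householder_product, so the decomposition is essentially a redispatch. A minimal sketch of what orgqr_decomp could look like (the actual body is not shown in this excerpt):

// Sketch only: forward the composite orgqr to the op that has a batch rule.
Tensor orgqr_decomp(const Tensor& input, const Tensor& tau) {
  return at::linalg_householder_product(input, tau);
}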

A reviewer (Contributor) replied:

Maybe you can add it to BatchRulesDecompositions.cpp.
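
If that route is taken, a minimal sketch, assuming the OP_DECOMPOSE macro used elsewhere in BatchRulesDecompositions.cpp and the FT_BATCHED_KEY registration block shown in the diff above:

TORCH_LIBRARY_IMPL(aten, FT_BATCHED_KEY, m) {
  // OP_DECOMPOSE registers the composite implementation directly,
  // so vmap traces through it instead of hitting the fallback.
  OP_DECOMPOSE(orgqr);
}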

@kshitij12345 marked this pull request as ready for review December 8, 2021 14:56
@@ -151,6 +151,20 @@ Tensor linear_decomp(
   return result;
 }

+std::tuple<Tensor, c10::optional<int64_t>>
A reviewer (Contributor) commented on the diff:

Can we use EXISTING_BDIM or EXISTING_BDIM_ALL_BOXED for this?

@kshitij12345 (Contributor, Author) replied:

Unfortunately, neither of them works:

  • EXISTING_BDIM seems to assume that only self has a bdim.
  • With EXISTING_BDIM_ALL_BOXED we get torch.linalg.householder_product: input.shape[-1] must be greater than or equal to tau.shape[-1]

So the rule is written by hand; see the sketch below.
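
For reference, a sketch of a hand-written rule along the lines of what this diff adds; the helpers (moveBatchDimToFront, ensure_has_bdim, get_bdim_size2) are assumed to come from functorch's BatchRulesHelper.h:

std::tuple<Tensor, c10::optional<int64_t>> householder_product_batch_rule(
    const Tensor& input, c10::optional<int64_t> input_bdim,
    const Tensor& tau, c10::optional<int64_t> tau_bdim) {
  // Move any existing batch dims to the front, then materialize a batch
  // dim on whichever operand lacks one so both are batched uniformly.
  auto input_ = moveBatchDimToFront(input, input_bdim);
  auto tau_ = moveBatchDimToFront(tau, tau_bdim);
  auto batch_size = get_bdim_size2(input, input_bdim, tau, tau_bdim);
  input_ = ensure_has_bdim(input_, input_bdim.has_value(), batch_size);
  tau_ = ensure_has_bdim(tau_, tau_bdim.has_value(), batch_size);
  // The batched result carries its batch dim at the front (dim 0).
  return std::make_tuple(at::linalg_householder_product(input_, tau_), 0);
}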

@zou3519 (Contributor) commented Mar 14, 2022

@Chillee @kshitij12345 can we merge this?

@Chillee (Contributor) commented Mar 14, 2022

@zou3519 I'm fine with merging it.

zou3519 added a commit that referenced this pull request Jul 18, 2022
Original code written by @kshitij12345 in #322; this PR is just a rebase onto main
@zou3519 (Contributor) commented Jul 18, 2022

Merged in #972

@zou3519 closed this Jul 18, 2022
zou3519 added a commit to zou3519/pytorch that referenced this pull request Jul 20, 2022
bigfootjon pushed a commit to pytorch/pytorch that referenced this pull request Jul 21, 2022