Vectorize the equalize transformation #3173

Closed · datumbox opened this issue Dec 14, 2020 · 3 comments

datumbox (Contributor) commented Dec 14, 2020

🚀 Feature

The current implementation of the equalize transformation is not vectorized: it processes each channel of each image in the batch with a for loop. This is because the method internally uses torch.histc, which currently cannot compute histograms along a dimension.

```python
def _scale_channel(img_chan):
    hist = torch.histc(img_chan.to(torch.float32), bins=256, min=0, max=255)
```

@fmassa proposed a workaround that trades memory for speed and achieves vectorization. See #3123 (comment) for the overview and context of the proposal.
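For illustration, here is a minimal sketch of the kind of memory-for-speed trade-off described above (not necessarily the exact approach from #3123): comparing every pixel against all 256 bin values materializes a large boolean tensor, but the histograms for every channel of every image in the batch then fall out of a single reduction, with no Python loop. The helper name `_batch_histogram_256` is hypothetical.

```python
import torch

def _batch_histogram_256(imgs: torch.Tensor) -> torch.Tensor:
    # imgs: uint8 tensor of shape (..., H, W); every trailing HxW plane is one channel.
    # Comparing each pixel against all 256 possible values builds a
    # (..., H*W, 256) boolean tensor -- the memory cost -- but replaces the
    # per-channel torch.histc loop with a single sum.
    values = torch.arange(256, device=imgs.device, dtype=imgs.dtype)
    one_hot = imgs.flatten(start_dim=-2).unsqueeze(-1) == values    # (..., H*W, 256)
    return one_hot.sum(dim=-2).to(torch.float32)                    # (..., 256)
```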

To adopt this, a few additional changes need to be made:

  • The rest of _scale_channel() needs to be adapted so that the remaining operations are also vectorized (a rough sketch follows this list).
  • _equalize_single_image() needs to be removed and the stacking operation in equalize() needs to be adapted.
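To make the first bullet concrete, below is a rough, hypothetical sketch of how the remaining operations of _scale_channel() (cumulative histogram, LUT construction, LUT application) could be vectorized across all channels of the batch with gather, building on the `_batch_histogram_256` helper sketched above. It mirrors the per-channel logic of the current scalar code, but it is only an illustration under the assumption of a uint8 input with channels as the trailing HxW planes, not the implementation that was actually proposed.

```python
import torch

def _batch_equalize(imgs: torch.Tensor) -> torch.Tensor:
    # imgs: uint8 tensor of shape (..., H, W); every trailing HxW plane is one channel.
    hist = _batch_histogram_256(imgs)                                  # (..., 256)

    # Count of the last nonzero bin per channel; subtracting it from the total
    # mirrors nonzero_hist[:-1].sum() in the scalar _scale_channel().
    last_idx = 255 - (hist.flip(-1) != 0).to(torch.int64).argmax(dim=-1, keepdim=True)
    last_nonzero = hist.gather(-1, last_idx)                           # (..., 1)
    step = torch.div(hist.sum(dim=-1, keepdim=True) - last_nonzero, 255,
                     rounding_mode="floor")                            # (..., 1)

    # Build one 256-entry lookup table per channel from the cumulative histogram.
    lut = torch.div(hist.cumsum(dim=-1) + torch.div(step, 2, rounding_mode="floor"),
                    step.clamp(min=1), rounding_mode="floor")
    lut = torch.nn.functional.pad(lut, [1, 0])[..., :-1].clamp(0, 255)

    # Apply each channel's LUT with a single gather instead of per-channel indexing.
    flat = imgs.flatten(start_dim=-2).to(torch.int64)                  # (..., H*W)
    out = lut.gather(-1, flat).reshape(imgs.shape).to(torch.uint8)

    # Channels with a degenerate histogram (step == 0) are returned unchanged.
    keep = (step == 0).reshape(*imgs.shape[:-2], 1, 1)
    return torch.where(keep, imgs, out)
```

With something along these lines, _equalize_single_image() and the stacking in equalize() would no longer be needed, since the whole batch is processed at once.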

cc @vfdev-5 @fmassa

avijit9 (Contributor) commented Dec 15, 2020

@datumbox Can I try to solve this one, if nobody is working on it?

datumbox (Contributor, Author) commented:

Go for it @avijit9 !

NicolasHug (Member) commented Feb 12, 2021

According to #3334, vectorizing the histogram computation seems to lead to a slow-down. I'll close the issue.
