Moving normalize, erase to functional_tensor.py #1474


Closed
surgan12 wants to merge 10 commits

Conversation

@surgan12 (Contributor) commented Oct 16, 2019

This is a follow-up to #1466.
Since we have decided to shift towards implementing transforms via PyTorch ops, this PR moves the existing such functions from functional.py to functional_tensor.py to ensure a unified setting.

@surgan12 (Contributor, Author)

cc @fmassa, what are your thoughts on this?

@fmassa (Member) left a comment

Thanks for the PR!

I think the state of functional_tensor.py as it currently stands is temporary.
We should only expose the torchvision.transforms.functional namespace to users; functional_tensor.py is an implementation detail.

What I'm thinking is to move the implementations of the PIL-based transforms into a separate file, like functional_pil.py, and have functional.py dispatch to the different implementations depending on the input type.

A rough implementation would be

def resize(...):
    if isinstance(img, torch.Tensor):
        return resize_tensor(img, ...)
    elif isinstance(img, Image.Image):
        return resize_pil(img, ...)

and have resize_pil be a @torch.jit.unused function, so that it doesn't get scripted.
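For illustration, a minimal sketch of the @torch.jit.unused part (resize_pil here being the hypothetical PIL-backed helper from the sketch above):

import torch
from PIL import Image

@torch.jit.unused
def resize_pil(img, size):
    # torch.jit.script ignores this body; calling it from scripted code
    # raises at runtime instead of failing compilation
    # (size assumed (h, w) here; PIL's Image.resize expects (w, h))
    return img.resize(size[::-1], Image.BILINEAR)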

Does this make sense?

@@ -188,36 +188,6 @@ def to_pil_image(pic, mode=None):
    return Image.fromarray(npimg, mode=mode)


def normalize(tensor, mean, std, inplace=False):
@fmassa (Member)

This is a breaking change, because users of torchvision.transforms.functional.normalize won't find it there anymore.
At the very least, you should add from .functional_tensor import normalize, erase to functional.py.
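For example, a minimal re-export sketch that keeps the old import path working:

# torchvision/transforms/functional.py
# keep normalize and erase importable from their old location
from .functional_tensor import normalize, erase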

@surgan12 (Contributor, Author)

Yeah, got it.

@fmassa (Member)

Given what I mentioned above, I'm not sure if there is a need to move normalize and erase to functional_tensor.py, because we don't have any implementation based on PIL anyway.

Thoughts?

@surgan12 (Contributor, Author)

Yes, but if we want to do something like dispatching from functional -> functional_tensor / functional -> functional_pil, then this might help in ensuring consistency. What do you think?

@surgan12 (Contributor, Author)

@fmassa, let me know whether I am on the same page as you.

@fmassa (Member)

My thinking is that we don't need to move normalize or erase to functional_tensor.py.
Users shouldn't need to worry about the existence of functional_pil or functional_tensor, just functional.
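Roughly, something like this simplified sketch of the current tensor-only normalize could just stay in functional.py as-is:

import torch

def normalize(tensor, mean, std, inplace=False):
    # tensor-only op, so there is no PIL branch to dispatch to
    if not inplace:
        tensor = tensor.clone()
    mean = torch.as_tensor(mean, dtype=torch.float32, device=tensor.device)
    std = torch.as_tensor(std, dtype=torch.float32, device=tensor.device)
    tensor.sub_(mean[:, None, None]).div_(std[:, None, None])
    return tensor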

@surgan12 (Contributor, Author)

Sure, got it! Thanks.

@surgan12 closed this Oct 18, 2019
@fmassa (Member) commented Oct 18, 2019

Thanks for the PR anyway!
