Minor fix antialias arg #6209


Closed · wants to merge 3 commits
Changes from all commits
torchvision/prototype/transforms/functional/_geometry.py (2 changes: 1 addition & 1 deletion)

@@ -45,7 +45,7 @@ def resize_image_tensor(
     size: List[int],
     interpolation: InterpolationMode = InterpolationMode.BILINEAR,
     max_size: Optional[int] = None,
-    antialias: Optional[bool] = None,
+    antialias: bool = False,
 ) -> torch.Tensor:
     num_channels, old_height, old_width = get_dimensions_image_tensor(image)
     new_height, new_width = _compute_output_size((old_height, old_width), size=size, max_size=max_size)
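
For illustration, a minimal usage sketch of the prototype kernel after this change. This is a sketch only: it assumes resize_image_tensor is re-exported from torchvision.prototype.transforms.functional (otherwise it can be imported from the _geometry module shown above), and the tensor shape is arbitrary.

import torch
from torchvision.prototype.transforms.functional import resize_image_tensor

image = torch.rand(3, 128, 96)  # CHW float tensor

# antialias now defaults to a concrete bool instead of None
out_default = resize_image_tensor(image, size=[64, 48])             # antialias=False
out_aa = resize_image_tensor(image, size=[64, 48], antialias=True)  # opt in explicitly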
torchvision/transforms/functional.py (3 changes: 3 additions & 0 deletions)

@@ -471,6 +471,9 @@ def resize(
         pil_interpolation = pil_modes_mapping[interpolation]
         return F_pil.resize(img, size=output_size, interpolation=pil_interpolation)

+    if antialias is None:
+        antialias = False
+
     return F_t.resize(img, size=output_size, interpolation=interpolation.value, antialias=antialias)
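
In effect, the public resize keeps accepting the previous None default and maps it to a concrete bool before delegating to the tensor backend. A minimal sketch of the unchanged public behaviour, assuming a float CHW tensor input:

import torch
from torchvision.transforms.functional import resize, InterpolationMode

img = torch.rand(3, 100, 100)

# All three calls reach F_t.resize with a concrete bool for antialias.
out1 = resize(img, [50, 50])                  # antialias left at its None default
out2 = resize(img, [50, 50], antialias=None)  # explicit None, normalized to False
out3 = resize(img, [50, 50], InterpolationMode.BILINEAR, antialias=True)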
torchvision/transforms/functional_tensor.py (5 changes: 1 addition & 4 deletions)

@@ -430,16 +430,13 @@ def resize(
     img: Tensor,
     size: List[int],
     interpolation: str = "bilinear",
-    antialias: Optional[bool] = None,
+    antialias: bool = False,
 ) -> Tensor:
     _assert_image_tensor(img)

     if isinstance(size, tuple):
         size = list(size)

-    if antialias is None:
-        antialias = False
-
Comment on lines -440 to -442 (the removed handling of antialias=None):
NicolasHug (Member), Jun 28, 2022

I think this is a BC-breaking change for users who explicitly passed antialias=None, or similar patterns.

Considering the relatively low benefit of removing support for None, we might want to keep it to avoid any potential issues?
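
A hypothetical sketch of the kind of caller this concern covers, assuming user code that forwards the old default explicitly to the tensor backend:

import torch
from torchvision.transforms import functional_tensor as F_t

img = torch.rand(3, 64, 64)

# Before this change an explicit None was converted to False inside resize;
# with antialias: bool = False the None would be passed through unchanged,
# so behaviour and type expectations downstream may differ.
out = F_t.resize(img, size=[32, 32], interpolation="bilinear", antialias=None)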

vfdev-5 (Collaborator, Author), Jun 28, 2022

This is a private API that is not intended to be used by users. I'm keeping the public resize the same and only changing the private method.

EDIT:
The main purpose is to drop the None option from resize_image_tensor:

- antialias: Optional[bool] = None,
+ antialias: bool = False,

If we want to keep the private resize API unchanged, I think we can still update resize_image_tensor.
@NicolasHug, please let me know what you think.

Member

This function is importable without a leading underscore as

from torchvision.transforms.functional_tensor import resize

and as a result it is likely that most users will consider it public, as this is a widely used convention throughout the pydata ecosystem.

I think it's fine to modify the prototype, but I'd rather keep the stable area untouched to avoid any potential breakage, especially considering the low benefit and the fact that these transforms will eventually be superseded by the prototype ones.

vfdev-5 (Collaborator, Author)

Yeah, we should have added a leading underscore to them. It is very unfortunate if people today rely on this undocumented API. In PR #3200 we removed "PRIVATE METHOD" from the docstrings.
I previously refactored and removed an arg from the resize method in functional_pil and functional_tensor in #6191.
Should we restore the arg for BC and warn about the new behaviour?

Contributor

Sorry, I just realized this is a public function; thanks @NicolasHug for noticing.
I agree that we shouldn't make BC-breaking changes in this case.

Looking at the update to @vfdev-5's comment, I think resize_image_tensor is under the prototype, which should be okay to change explicitly, but let's keep functional_tensor.resize unchanged.

Member

> I still have them in private space

Sorry, what do you mean? We should treat everything in functional_tensor and functional_pil as public.

datumbox (Contributor), Jun 28, 2022

@NicolasHug this API is not public, and the decision to do this was made, I believe, 1.5 years ago. I am happy to chat more once we are both back.

Cc @fmassa, who could provide more details as he made the decision.

Contributor

I found some useful references. The last version that contained warnings that these are private APIs was 0.8: https://github.com/pytorch/vision/blob/release/0.8.0/torchvision/transforms/functional_tensor.py

After that, the decision was made to remove the documentation warning, as that was the deprecation process followed at the time.

vfdev-5 (Collaborator, Author), Jun 28, 2022

> After that the decision was made to remove the documentation warning

The decision was very quick: #3069 (review) and #3200, and here is your issue #3071 to do that.

Member

> this API is not public

As mentioned in #6191 (comment), there is absolutely nothing that shows that this API is private, unfortunately.

It is strictly the same as what's in transforms.functional. The only difference is the existence of docstrings, which isn't a reasonable boundary marker considering how users discover APIs (through blogs, auto-completion, etc.).

I do understand that this is unfortunate, but we've had various issues in the past where users assumed that some APIs were public when we meant them to be private. A recent example is #5836, where we eventually had to reinstate BC through deprecation. This causes more frustration for our users and more work for us.

We should also note that functional_pil in particular may be used by some users as a compatibility layer between torch and PIL; as you know, HF has shown interest in this recently. They might not be the only ones.

Is there a very strong reason to break BC here? It looks like maintaining BC could be done at very little cost, both in this PR and in #6191.

The unchanged context below the removed lines in functional_tensor.py:

     if antialias and interpolation not in ["bilinear", "bicubic"]:
         raise ValueError("Antialias option is supported for bilinear and bicubic interpolation modes only")

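For reference, the retained check above means that explicitly enabling antialias with an unsupported interpolation mode still raises. A quick sketch, assuming a float CHW tensor:

import torch
from torchvision.transforms.functional_tensor import resize

img = torch.rand(3, 64, 64)

resize(img, size=[32, 32], interpolation="bilinear", antialias=True)  # supported
try:
    resize(img, size=[32, 32], interpolation="nearest", antialias=True)
except ValueError as err:
    print(err)  # antialias is only supported for bilinear and bicubic modes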