
Check Type Annotations for transforms/functional_pil and transforms/functional_tensor #4282


Open
oke-aditya opened this issue Aug 17, 2021 · 4 comments

Comments

@oke-aditya
Contributor

oke-aditya commented Aug 17, 2021

🚀 Feature

As spotted in #4234 (review)
There are some minor cases where type annotations do not match.

We should rectify them and ensure the type annotations match the types the functions actually accept.

P.S. I'm happy to leave this for any new contributor; otherwise I'll complete it in the next week.

cc @vfdev-5 @datumbox @pmeier I think this can be a good first issue for people looking to contribute 😄
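Mismatches like these can be spotted mechanically. Below is a minimal sketch that compares the parameter annotations of a paired PIL/tensor op with `inspect`; the two `adjust_brightness_*` stubs and their annotations are hypothetical stand-ins, not the real torchvision signatures:

```python
import inspect
from typing import List

# Hypothetical stand-ins for a paired op in functional_pil /
# functional_tensor; the real signatures may differ.
def adjust_brightness_pil(img, brightness_factor: float): ...
def adjust_brightness_tensor(img, brightness_factor: int): ...

def annotation_mismatches(f, g) -> List[str]:
    """Return the names of shared parameters whose annotations differ."""
    fa = {p.name: p.annotation for p in inspect.signature(f).parameters.values()}
    ga = {p.name: p.annotation for p in inspect.signature(g).parameters.values()}
    return [name for name in fa.keys() & ga.keys() if fa[name] != ga[name]]

print(annotation_mismatches(adjust_brightness_pil, adjust_brightness_tensor))
```

A checker like this could be run over every paired function in the two modules to enumerate the cases this issue is about.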

@pmeier
Collaborator

pmeier commented Aug 18, 2021

After some more thought, this is probably a torch.jit issue in the sense that Union is not supported. We could try to change the annotation for the tensor transform, but my guess is that the tests will fail.
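The usual workaround when `Union` can't be scripted is to keep the scriptable helpers monomorphic and branch on type only at the public entry point. A minimal sketch of that dispatch pattern, using stand-in classes rather than the real `torch.Tensor` / `PIL.Image.Image` types:

```python
from typing import List

class FakeTensor:
    """Stand-in for torch.Tensor (real class intentionally not imported)."""
    def __init__(self, shape: List[int]) -> None:
        self._shape = shape
    def size(self) -> List[int]:
        return self._shape

class FakeImage:
    """Stand-in for PIL.Image.Image, which exposes .size as a (w, h) tuple."""
    def __init__(self, width: int, height: int) -> None:
        self.size = (width, height)

def _get_size_tensor(img: FakeTensor) -> List[int]:
    # Monomorphic helper: no Union annotation needed for scripting.
    return list(img.size())

def _get_size_pil(img: FakeImage) -> List[int]:
    return list(img.size)

def get_size(img) -> List[int]:
    # The public entry point branches on type instead of annotating
    # `img: Union[Tensor, Image]`, which scripting would reject.
    if isinstance(img, FakeTensor):
        return _get_size_tensor(img)
    return _get_size_pil(img)
```

This is why the two modules can end up with deliberately different annotations for what is conceptually the same parameter.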

@oke-aditya oke-aditya changed the title Check Typen Annotations for transforms/functional_pil and transforms/functional_tensor Check Type Annotations for transforms/functional_pil and transforms/functional_tensor Aug 26, 2021
@oke-aditya
Contributor Author

oke-aditya commented Aug 26, 2021

It seems no new contributor has picked this up, so I will have a go this week.

@lezwon
Contributor

lezwon commented Mar 6, 2022

Hi @oke-aditya, I see the type annotations don't match here either:

def _center_crop_parse_output_size(output_size: List[int]) -> List[int]:
    if isinstance(output_size, numbers.Number):
        return [int(output_size), int(output_size)]
    elif isinstance(output_size, (tuple, list)) and len(output_size) == 1:
        return [output_size[0], output_size[0]]
    else:
        return list(output_size)

I could make a PR to fix this.

@datumbox
Contributor

datumbox commented Mar 6, 2022

@lezwon I believe this might be intentional to support JIT. :(

If it can be fixed without breaking JIT-scriptability, that's great. But if that's not possible, we should leave it as-is to keep supporting inference on mobile devices.
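One way to settle that empirically is a small guard that tries to script the changed function. A sketch (assumption: torch may or may not be installed in the environment, so the check is skipped gracefully when it isn't):

```python
def is_scriptable(fn) -> bool:
    """Return True if fn survives torch.jit.script, or torch is unavailable."""
    try:
        import torch
    except ImportError:
        # Cannot verify without torch; treat as a pass rather than fail.
        return True
    try:
        torch.jit.script(fn)
        return True
    except Exception:
        return False
```

A PR changing an annotation could assert `is_scriptable` on the touched function in the test suite, which is essentially what the existing scriptability tests already do for these ops.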
