Refactored and fixed flaky resize tests #3907

Merged: 3 commits into pytorch:master on May 25, 2021

Conversation

@vfdev-5 (Collaborator) commented on May 25, 2021

Addresses #3906 (comment)

Description:

  • Refactored and fixed flaky resize tests

I refactored the tests using pytest and was able to find a seed value for which the MAE is larger than 8. The failing case appears to be related to max_size=33; I suspect odd values are not handled the same way by PIL and PyTorch. With max_size=34 I could not find any seed value between 0 and 2000 such that MAE > 8.0.
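
For context, a rough sketch of the kind of PIL-vs-tensor comparison involved (this is not the PR's test code; the image shape, interpolation mode, and helper name are illustrative):

import torch
import torchvision.transforms.functional as F
from torchvision.transforms import InterpolationMode


def resize_mae(seed, size=32, max_size=33):
    # Build a random uint8 image for this seed (shape is illustrative).
    torch.manual_seed(seed)
    tensor_img = torch.randint(0, 256, (3, 46, 50), dtype=torch.uint8)
    pil_img = F.to_pil_image(tensor_img)

    kwargs = dict(size=size, interpolation=InterpolationMode.BICUBIC, max_size=max_size)
    out_tensor = F.resize(tensor_img, **kwargs)
    out_pil = F.pil_to_tensor(F.resize(pil_img, **kwargs))

    # Mean absolute error between the tensor and PIL backends.
    return (out_tensor.float() - out_pil.float()).abs().mean().item()


# Scanning seeds in range(2000) with max_size=33 can surface MAE > 8.0,
# whereas max_size=34 reportedly does not.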

@vfdev-5 force-pushed the fix-bicubic-flaky-issues branch from 49b9ba0 to 5372b13 on May 25, 2021 07:17
@pmeier (Collaborator) left a comment:

Minor comment below. Otherwise LGTM! Thanks Victor.

@vfdev-5 changed the title from "[WIP] Refactored and fixed flaky resize tests" to "Refactored and fixed flaky resize tests" on May 25, 2021
@vfdev-5 requested a review from datumbox on May 25, 2021 07:37
@NicolasHug (Member) left a comment:

Thanks @vfdev-5 ! Some minor comments but LGTM mostly

@pytest.mark.parametrize('interpolation', [BILINEAR, BICUBIC, NEAREST])
def test_resize(device, dt, size, max_size, interpolation, tester):

    torch.manual_seed(12)
@NicolasHug (Member):

do we have to set the seed? If yes maybe add a comment with a potential FIXME?

@vfdev-5 (Collaborator, Author):

I think we should create a deterministic (but random-looking) input instead of a purely random one.
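
A minimal sketch of what such an input could look like (the shape and helper name are illustrative, not the PR's code):

import torch


def make_test_image(channels=3, height=46, width=50):
    # Deterministic but non-uniform pattern: fully reproducible without
    # touching the global RNG state.
    n = channels * height * width
    values = torch.arange(n, dtype=torch.float32).reshape(channels, height, width)
    return (values % 256).to(torch.uint8)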

@NicolasHug (Member):

OK, we generally don't do that, but I agree with you that it's better practice when we know the tests aren't flaky.

Another option is to actually parametrize the seed, which we can do if the tests are very fast.
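
For illustration, parametrizing the seed could look roughly like this (the seed range and the test body are hypothetical, not the PR's code):

import pytest
import torch
import torchvision.transforms.functional as F
from torchvision.transforms import InterpolationMode


@pytest.mark.parametrize("seed", range(5))
def test_resize_pil_vs_tensor(seed):
    torch.manual_seed(seed)
    img = torch.randint(0, 256, (3, 46, 50), dtype=torch.uint8)
    out = F.resize(img, 32, interpolation=InterpolationMode.BILINEAR)
    # The real test would compare against the PIL backend; this assertion
    # is only a placeholder (smaller edge resized to 32).
    assert out.shape[-2] == 32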

@vfdev-5 (Collaborator, Author):

Otherwise I would set it module-wise. A seed as a parameter is something new to me :)

@NicolasHug (Member) commented on May 25, 2021:

I think this is already setting the seed module-wise, at least for the tests that will be run after this one.

In the long term we'll use something local like rng = np.random.RandomState(0) (surely there's a pytorch equivalent?) and generate stuff from the local rng variable, but for now this is good enough

@vfdev-5 (Collaborator, Author):

> In the long term we'll use something local like rng = np.random.RandomState(0) (surely there's a pytorch equivalent?) and generate stuff from the local rng variable, but for now this is good enough

Yes, something like that: https://pytorch.org/docs/stable/generated/torch.Generator.html#torch.Generator.manual_seed
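
For reference, a local generator along those lines might look like this (the test name and tensor shape are illustrative):

import torch


def test_something_with_local_rng():
    # manual_seed returns the generator itself, so it can be chained.
    rng = torch.Generator().manual_seed(12)
    # Randomness stays scoped to this generator instead of mutating the
    # global RNG state.
    img = torch.randint(0, 256, (3, 46, 50), dtype=torch.uint8, generator=rng)
    assert img.shape == (3, 46, 50)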

@pmeier (Collaborator):

For all tests that do not perform a loop and thus only need one seed per test invocation, we could use an autouse fixture defined in the root conftest.py:

import pytest
import torch

@pytest.fixture(scope="function", autouse=True)
def random_seed():
    torch.manual_seed(12)
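
With autouse=True and scope="function", pytest runs this fixture before every test function collected under that conftest.py, so individual tests would not need to call torch.manual_seed themselves. Tests that reseed inside an internal loop would still need to manage the seed explicitly, which is the caveat noted above.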

@datumbox (Contributor) left a comment:

LGTM! Thanks for your quick fixes.

@NicolasHug merged commit 21824ce into pytorch:master on May 25, 2021
facebook-github-bot pushed a commit that referenced this pull request May 25, 2021
Summary: Co-authored-by: Philip Meier <[email protected]>

Reviewed By: vincentqb, cpuhrsch

Differential Revision: D28679988

fbshipit-source-id: 0851bd362c25128f143216c063b13ba4ef6c88f1