[DEBUG] flaky gaussian blur #6755

Closed
wants to merge 16 commits
3,092 changes: 0 additions & 3,092 deletions .circleci/config.yml

This file was deleted.

160 changes: 0 additions & 160 deletions .github/workflows/build-m1-binaries.yml

This file was deleted.

35 changes: 0 additions & 35 deletions .github/workflows/pr-labels.yml

This file was deleted.

54 changes: 16 additions & 38 deletions .github/workflows/prototype-tests.yml
@@ -7,13 +7,20 @@ jobs:
   prototype:
     strategy:
       matrix:
-        os:
-          - ubuntu-latest
-          - windows-latest
-          - macos-latest
+        runner:
+          - 1
+          - 2
+          - 3
+          - 4
+          - 5
+          - 6
+          - 7
+          - 8
+          - 9
+          - 10
       fail-fast: false
 
-    runs-on: ${{ matrix.os }}
+    runs-on: ubuntu-latest
 
     steps:
       - name: Set up python
@@ -43,41 +50,12 @@ jobs:
         id: setup
         run: exit 0
 
-      - name: Run prototype features tests
-        shell: bash
-        run: |
-          pytest \
-            --durations=20 \
-            --cov=torchvision/prototype/features \
-            --cov-report=term-missing \
-            test/test_prototype_features*.py
-
-      - name: Run prototype datasets tests
-        if: success() || ( failure() && steps.setup.conclusion == 'success' )
-        shell: bash
+      - name: Test non-determinism
         run: |
-          pytest \
-            --durations=20 \
-            --cov=torchvision/prototype/datasets \
-            --cov-report=term-missing \
-            test/test_prototype_datasets*.py
+          pip install tqdm
+          python test_gaussian_blur_non_determinism.py
 
       - name: Run prototype transforms tests
         if: success() || ( failure() && steps.setup.conclusion == 'success' )
         shell: bash
-        run: |
-          pytest \
-            --durations=20 \
-            --cov=torchvision/prototype/transforms \
-            --cov-report=term-missing \
-            test/test_prototype_transforms*.py
-
-      - name: Run prototype models tests
-        if: success() || ( failure() && steps.setup.conclusion == 'success' )
-        shell: bash
-        run: |
-          pytest \
-            --durations=20 \
-            --cov=torchvision/prototype/models \
-            --cov-report=term-missing \
-            test/test_prototype_models*.py
+        run: pytest -rP test/test_prototype_transforms_functional.py::TestKernels::test_scripted_vs_eager -k gauss
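
Note: the `Test non-determinism` step above runs `test_gaussian_blur_non_determinism.py`, which is not shown in this diff. A minimal sketch of what such a repro script might look like, assuming it repeatedly compares a scripted gaussian blur against its eager counterpart on random inputs (written here against the stable `torchvision.transforms.functional.gaussian_blur`; the prototype kernel would be exercised the same way):

```python
# Hypothetical repro sketch; the real test_gaussian_blur_non_determinism.py is not shown
# in this diff. Repeatedly compares a scripted gaussian blur against its eager counterpart
# and reports any runs where the two disagree.
import torch
from torchvision.transforms import functional as F
from tqdm import tqdm

kernel_eager = F.gaussian_blur
kernel_scripted = torch.jit.script(kernel_eager)

torch.manual_seed(0)
num_runs = 1_000
mismatches = 0
for i in tqdm(range(num_runs)):
    image = torch.rand(3, 32, 32)
    actual = kernel_scripted(image, kernel_size=[5, 5], sigma=[0.8, 0.8])
    expected = kernel_eager(image, kernel_size=[5, 5], sigma=[0.8, 0.8])
    if not torch.allclose(actual, expected):
        mismatches += 1
        print(f"run {i}: max abs diff = {(actual - expected).abs().max().item()}")

print(f"{mismatches} mismatching runs out of {num_runs}")
```

The 10-way `runner` matrix in the workflow above serves the same purpose at the CI level: it re-runs the identical job enough times to surface the intermittent mismatch.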
50 changes: 0 additions & 50 deletions .github/workflows/test-m1.yml

This file was deleted.

54 changes: 0 additions & 54 deletions .github/workflows/tests-schedule.yml

This file was deleted.

8 changes: 7 additions & 1 deletion test/test_prototype_transforms_functional.py
@@ -86,7 +86,7 @@ class TestKernels:
 
     @sample_inputs
     @pytest.mark.parametrize("device", cpu_and_gpu())
-    def test_scripted_vs_eager(self, info, args_kwargs, device):
+    def test_scripted_vs_eager(self, request, info, args_kwargs, device):
         kernel_eager = info.kernel
         kernel_scripted = script(kernel_eager)
 
@@ -95,6 +95,12 @@ def test_scripted_vs_eager(self, info, args_kwargs, device):
         actual = kernel_scripted(*args, **kwargs)
         expected = kernel_eager(*args, **kwargs)
 
+        import pathlib
+
+        artifacts = pathlib.Path(__file__).parent / "artifacts"
+        artifacts.mkdir(exist_ok=True)
+        torch.save((args, kwargs, actual, expected), str(artifacts / f"{request.node.name}.pt"))
+
         assert_close(actual, expected, **info.closeness_kwargs)
 
     def _unbatch(self, batch, *, data_dims):
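
The `torch.save` call added above dumps `(args, kwargs, actual, expected)` for every run of the test into `test/artifacts/`, with one `.pt` file per pytest node, so a flaky failure can be inspected offline. A small sketch of how such an artifact could be loaded afterwards; the file name below is hypothetical and depends on the parametrized test id:

```python
# Sketch for offline inspection of a saved artifact; the exact file name depends on the
# pytest node that produced it and is only an example here.
import pathlib

import torch
from torch.testing import assert_close

artifact = pathlib.Path("test/artifacts/test_scripted_vs_eager[gaussian_blur_image_tensor-cpu-0].pt")
args, kwargs, actual, expected = torch.load(artifact)

print("kwargs:", kwargs)
print("max abs diff:", (actual - expected).abs().max().item())
assert_close(actual, expected)  # same comparison as the test, minus the kernel-specific closeness_kwargs
```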