sym_size_custom implementation for dynamic shape fails at tensor.sum().backward() call #3827

@miladm


🐛 Bug

The sym_size_custom implementation for dynamic shapes causes the following error. I wonder if this is due to the work-in-progress autograd implementation. Wdyt @Krovatkin?

======================================================================
ERROR: test_nn_scalars_reductions_xla (__main__.TestNNDeviceTypeXLA)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/opt/conda/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 390, in instantiated_test
    raise rte
  File "/opt/conda/lib/python3.7/site-packages/torch/testing/_internal/common_device_type.py", line 377, in instantiated_test
    result = test(self, **param_kwargs)
  File "/workspace/pytorch/xla/test/../../test/test_nn.py", line 15516, in test_nn_scalars_reductions
    verify_reduction_scalars(input, reduction, output)
  File "/workspace/pytorch/xla/test/../../test/test_nn.py", line 15502, in verify_reduction_scalars
    output.sum().backward()
  File "/opt/conda/lib/python3.7/site-packages/torch/_tensor.py", line 485, in backward
    self, gradient, retain_graph, create_graph, inputs=inputs
  File "/opt/conda/lib/python3.7/site-packages/torch/autograd/__init__.py", line 193, in backward
    allow_unreachable=True, accumulate_grad=True)  # Calls into the C++ engine to run the backward pass
RuntimeError: Function SumBackward0 returned an invalid gradient at index 0 - got [5, 6] but expected shape compatible with [0]
----------------------------------------------------------------------
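For context, SumBackward0 must return a gradient whose shape matches the *input* of sum(), since d(sum)/dx_i = 1 for every element. Below is a plain-Python sketch of that shape check (not the actual autograd internals; the function names are illustrative) to show why a recorded input shape of [0] rejects a [5, 6] gradient:

```python
# Illustrative sketch of the SumBackward0 shape check (hypothetical
# helper names, not PyTorch internals).

def sum_backward(input_shape, grad_output=1.0):
    """Gradient of sum(): ones_like(input), scaled by the upstream grad."""
    rows, cols = input_shape
    return [[grad_output] * cols for _ in range(rows)]

def check_gradient_shape(grad, expected_shape):
    """Mimics the engine validation that raised the RuntimeError above."""
    got = (len(grad), len(grad[0]) if grad else 0)
    if got != expected_shape:
        raise RuntimeError(
            f"Function SumBackward0 returned an invalid gradient - "
            f"got {list(got)} but expected shape compatible with "
            f"{list(expected_shape)}")
    return True

grad = sum_backward((5, 6))
check_gradient_shape(grad, (5, 6))  # passes: gradient matches input shape
# In the bug, the autograd graph recorded the input shape as [0]
# (a dynamic-shape placeholder), so the concrete [5, 6] gradient
# fails this check at backward() time.
```

This suggests the dynamic-shape path is recording a placeholder size into the autograd metadata instead of the materialized [5, 6] shape.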


Labels

dynamism: Dynamic Shape Features
