Commit 7b8a8cf

dnikolaev-amd (AMD) authored and committed
[release/2.6] Fix dtype before comparing torch and numpy tensors (#2340)

Cast the NumPy dtype result to the torch dtype result before comparing. NumPy returns `np.power(float32, int64) => float64` ([promotion rules for Python scalars, NEP 50](https://numpy.org/neps/nep-0050-scalar-promotion.html)), while PyTorch returns `torch.pow(float32, int64) => float32`. Reverts #2287 and fixes the tests in a different way.

Fixes:
- SWDEV-538110 - `'dtype' do not match: torch.float32 != torch.float64`
  - test_binary_ufuncs.py::TestBinaryUfuncsCUDA::test_cuda_tensor_pow_scalar_tensor_cuda
- SWDEV-539171 - `AttributeError: 'float' object has no attribute 'dtype'`
  - test_binary_ufuncs.py::TestBinaryUfuncsCUDA::test_long_tensor_pow_floats_cuda
  - test_binary_ufuncs.py::TestBinaryUfuncsCUDA::test_complex_scalar_pow_tensor_cuda_*
  - test_binary_ufuncs.py::TestBinaryUfuncsCUDA::test_float_scalar_pow_float_tensor_cuda_*
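The promotion mismatch the commit message describes can be reproduced directly. A minimal sketch (array operands are used here to sidestep NumPy's old value-based casting for scalars; the promotion outcome matches the NEP 50 rules linked above):

```python
import numpy as np
import torch

base_np = np.array([1.0, 2.0, 3.0], dtype=np.float32)
exp_np = np.array([3], dtype=np.int64)
# NumPy promotes float32 ** int64 to float64
print(np.power(base_np, exp_np).dtype)  # float64

base_t = torch.tensor([1.0, 2.0, 3.0], dtype=torch.float32)
exp_t = torch.tensor([3], dtype=torch.int64)
# PyTorch keeps the floating dtype: integer operands do not promote floats
print(torch.pow(base_t, exp_t).dtype)  # torch.float32
```

This is why comparing the NumPy-derived `expected` tensor against the torch `actual` tensor without a cast trips the `'dtype' do not match` assertion.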
1 parent f86d184 commit 7b8a8cf

File tree

1 file changed: +3 −3 lines


test/test_binary_ufuncs.py

Lines changed: 3 additions & 3 deletions

```diff
@@ -1447,7 +1447,7 @@ def to_np(value):
         try:
             np_res = np.power(to_np(base), to_np(np_exponent))
             expected = (
-                torch.from_numpy(np_res).to(dtype=base.dtype)
+                torch.from_numpy(np_res)
                 if isinstance(np_res, np.ndarray)
                 else torch.tensor(np_res, dtype=base.dtype)
             )
@@ -1480,8 +1480,8 @@ def to_np(value):
             self.assertRaisesRegex(RuntimeError, regex, base.pow_, exponent)
         elif torch.can_cast(torch.result_type(base, exponent), base.dtype):
             actual2 = actual.pow_(exponent)
-            self.assertEqual(actual, expected)
-            self.assertEqual(actual2, expected)
+            self.assertEqual(actual, expected.to(actual))
+            self.assertEqual(actual2, expected.to(actual))
         else:
             self.assertRaisesRegex(
                 RuntimeError,
```
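The fix relies on the `Tensor.to(other)` overload, which casts a tensor to another tensor's dtype and device before comparison. A minimal sketch with hypothetical values mirroring the failing test (NumPy-derived `expected` in float64, torch-computed `actual` in float32):

```python
import torch

expected = torch.tensor([1.0, 8.0, 27.0], dtype=torch.float64)  # numpy-derived result
actual = torch.tensor([1.0, 8.0, 27.0], dtype=torch.float32)    # torch result

# Tensor.to(other) adopts `other`'s dtype (and device), so the equality
# check no longer fails on a float32 != float64 dtype mismatch.
cast = expected.to(actual)
assert cast.dtype == torch.float32
assert torch.equal(cast, actual)
```

Casting at the comparison site (rather than forcing `from_numpy(np_res).to(dtype=base.dtype)` at construction, as before this commit) keeps `expected` in NumPy's promoted precision until the moment it is checked against `actual`.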

Comments (0)