To reproduce:
In [1]: import sparse
In [2]: import numpy as np
In [3]: x = np.arange(10)
In [4]: y = sparse.COO(x)
In [5]: x + y
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-5-cd60f97aa77f> in <module>
----> 1 x + y
~/miniconda3/envs/xarray-py37-dev/lib/python3.7/site-packages/sparse/coo/core.py in __array_ufunc__(self, ufunc, method, *inputs, **kwargs)
1452
1453 if method == '__call__':
-> 1454 result = elemwise(ufunc, *inputs, **kwargs)
1455 elif method == 'reduce':
1456 result = COO._reduce(ufunc, *inputs, **kwargs)
~/miniconda3/envs/xarray-py37-dev/lib/python3.7/site-packages/sparse/coo/umath.py in elemwise(func, *args, **kwargs)
46 """
47
---> 48 return _Elemwise(func, *args, **kwargs).get_result()
49
50
~/miniconda3/envs/xarray-py37-dev/lib/python3.7/site-packages/sparse/coo/umath.py in __init__(self, func, *args, **kwargs)
421 self.cache = {}
422
--> 423 self._get_fill_value()
424 self._check_broadcast()
425
~/miniconda3/envs/xarray-py37-dev/lib/python3.7/site-packages/sparse/coo/umath.py in _get_fill_value(self)
483
484 if not equivalent(fill_value, fill_value_array).all():
--> 485 raise ValueError('Performing a mixed sparse-dense operation that would result in a dense array. '
486 'Please make sure that func(sparse_fill_values, ndarrays) is a constant array.')
487
ValueError: Performing a mixed sparse-dense operation that would result in a dense array. Please make sure that func(sparse_fill_values, ndarrays) is a constant array.
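The check that fails is the one the message describes: for the result to stay sparse, func(fill_value, dense_operand) has to be a constant array. A minimal sketch of that condition in plain NumPy (illustrative only, not the library's internal code):

import numpy as np

fill_value = 0                  # fill value of the COO operand y
dense = np.arange(10)           # the dense operand x
candidate = fill_value + dense  # what would fill the "empty" slots of x + y

# candidate equals arange(10), which is not constant, so the result cannot
# keep a single fill value, and sparse raises rather than densifying.
print(np.all(candidate == candidate[0]))  # False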
Certainly it's a bad idea to densify automatically in general, but in this case the user is arguably fine with dense output, because they already provided a dense input array of the same shape. There is little risk of running out of memory from an output much larger than any of the inputs.
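For what it's worth, the operation does go through if the sparse operand is densified explicitly, e.g. continuing the session above (a sketch; COO.todense() returns a plain NumPy ndarray):

In [6]: x + y.todense()
Out[6]: array([ 0,  2,  4,  6,  8, 10, 12, 14, 16, 18])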
Hi! The policy we follow when making these kinds of decisions is the following: if we have an existing algorithm that works better with sparse arrays than with dense ones, we'll always prefer the sparse algorithm.
For dot and tensordot, matrix multiplication is defined for mixed sparse/dense inputs, and the result is usually dense.
For stack/concatenate, we don't have such an algorithm, and we'd have to densify.
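To illustrate the distinction with a small sketch (shapes and density chosen arbitrarily, using the public sparse.random and sparse.dot helpers):

import numpy as np
import sparse

s = sparse.random((4, 4), density=0.25)  # sparse operand, fill value 0
d = np.arange(16.0).reshape(4, 4)        # dense operand with non-constant values

# Mixed sparse/dense matrix multiplication has a well-defined algorithm,
# and the size of the (usually dense) result is known up front.
out = sparse.dot(s, d)

# An elementwise mix such as `s + d` raises the ValueError shown above,
# because 0 + d is not a constant array, so the result could not keep a
# single fill value.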