
Commit 4057088

Merge remote-tracking branch 'upstream/master' into v3-store-part1

2 parents 983d190 + e6483f9, commit 4057088

24 files changed: +137 / -77 lines

.github/workflows/minimal.yml

Lines changed: 7 additions & 0 deletions
@@ -26,3 +26,10 @@ jobs:
         conda activate minimal
         python -m pip install .
         pytest -svx
+    - name: Fixture generation
+      shell: "bash -l {0}"
+      run: |
+        conda activate minimal
+        rm -rf fixture/
+        pytest -svx zarr/tests/test_dim_separator.py zarr/tests/test_storage.py
+        # This simulates fixture-less tests in conda and debian packaging

docs/contributing.rst

Lines changed: 1 addition & 1 deletion
@@ -152,7 +152,7 @@ the tests will be skipped. To install all optional dependencies, run::
     $ pip install -r requirements_dev_optional.txt

 To also run the doctests within docstrings (requires optional
-depencies to be installed), run::
+dependencies to be installed), run::

     $ pytest -v --doctest-plus zarr

docs/release.rst

Lines changed: 18 additions & 2 deletions
@@ -9,6 +9,8 @@ Unreleased
 Enhancements
 ~~~~~~~~~~~~

+* Allow to assign array ``fill_values`` and update metadata accordingly. :issue:`662`
+
 * array indexing with [] (getitem and setitem) now supports fancy indexing.
   By :user:`Juan Nunez-Iglesias <jni>`; :issue:`725`.

@@ -21,12 +23,26 @@ Enhancements
 * Create a Base store class for Zarr Store.
   By :user:`Greggory Lee <grlee77>`; :issue:`789`.

+.. _release_2.10.3:
+
+2.10.3
+------
+
 Bug fixes
 ~~~~~~~~~

 * N5 keywords now emit UserWarning instead of raising a ValueError.
   By :user:`Boaz Mohar <boazmohar>`; :issue:`860`.

+* blocks_to_decompress not used in read_part function.
+  By :user:`Boaz Mohar <boazmohar>`; :issue:`861`.
+
+* defines blocksize for array, updates hexdigest values.
+  By :user:`Andrew Fulton <andrewfulton9>`; :issue:`867`.
+
+* Fix test failure on Debian and conda-forge builds.
+  By :user:`Josh Moore <joshmoore>`; :issue:`871`.
+
 .. _release_2.10.2:

 2.10.2
@@ -328,7 +344,7 @@ See `this link <https://github.com/zarr-developers/zarr-python/milestone/11?clos
 merged PR tagged with the 2.6 milestone.

 * Add ability to partially read and decompress arrays, see :issue:`667`. It is
-  only available to chunks stored using fs-spec and using bloc as a compressor.
+  only available to chunks stored using fsspec and using Blosc as a compressor.

   For certain analysis case when only a small portion of chunks is needed it can
   be advantageous to only access and decompress part of the chunks. Doing
@@ -359,7 +375,7 @@ This release will be the last to support Python 3.5, next version of Zarr will b
   without ``ipytree`` installed.
   By :user:`Zain Patel <mzjp2>`; :issue:`537`

-* Add typing informations to many of the core functions :issue:`589`
+* Add typing information to many of the core functions :issue:`589`

 * Explicitly close stores during testing.
   By :user:`Elliott Sales de Andrade <QuLogic>`; :issue:`442`
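
Illustrative usage of the partial read/decompress feature described in the release note above (not part of this commit): a minimal sketch with a hypothetical local path, assuming the array already exists behind an FSStore and was written with the Blosc compressor.

    import zarr
    from zarr.storage import FSStore

    # Hypothetical path; assumes a pre-existing, Blosc-compressed array.
    # With partial_decompress=True, requesting a small slice reads and
    # decodes only the needed parts of each chunk instead of whole chunks.
    store = FSStore("data/example.zarr")
    z = zarr.open_array(store, mode="r", partial_decompress=True)
    region = z[:10, :10]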

docs/spec/v1.rst

Lines changed: 1 addition & 1 deletion
@@ -150,7 +150,7 @@ and columns 4000-5000 and is stored under the key '2.4'; etc.

 There is no need for all chunks to be present within an array
 store. If a chunk is not present then it is considered to be in an
-uninitialized state. An unitialized chunk MUST be treated as if it
+uninitialized state. An uninitialized chunk MUST be treated as if it
 was uniformly filled with the value of the 'fill_value' field in the
 array metadata. If the 'fill_value' field is ``null`` then the
 contents of the chunk are undefined.

docs/spec/v2.rst

Lines changed: 2 additions & 2 deletions
@@ -81,7 +81,7 @@ filters
 The following keys MAY be present within the object:

 dimension_separator
-    If present, either the string ``"."`` or ``"/""`` definining the separator placed
+    If present, either the string ``"."`` or ``"/""`` defining the separator placed
     between the dimensions of a chunk. If the value is not set, then the
     default MUST be assumed to be ``"."``, leading to chunk keys of the form "0.0".
     Arrays defined with ``"/"`` as the dimension separator can be considered to have
@@ -222,7 +222,7 @@ columns 4000-5000 and is stored under the key "2.4"; etc.

 There is no need for all chunks to be present within an array store. If a chunk
 is not present then it is considered to be in an uninitialized state. An
-unitialized chunk MUST be treated as if it was uniformly filled with the value
+uninitialized chunk MUST be treated as if it was uniformly filled with the value
 of the "fill_value" field in the array metadata. If the "fill_value" field is
 ``null`` then the contents of the chunk are undefined.
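
To make the key scheme in the spec text above concrete, here is a small illustrative sketch of the wording only (not the zarr-python implementation): chunk grid coordinates are joined with the dimension separator, which defaults to ".".

    def chunk_key(chunk_coords, dimension_separator="."):
        """Build a chunk key as described by the spec wording above."""
        return dimension_separator.join(str(c) for c in chunk_coords)

    print(chunk_key((2, 4)))        # "2.4"  -> flat layout (default ".")
    print(chunk_key((2, 4), "/"))   # "2/4"  -> nested layout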

docs/tutorial.rst

Lines changed: 1 addition & 1 deletion
@@ -1297,7 +1297,7 @@ ratios, depending on the correlation structure within the data. E.g.::
     Chunks initialized : 100/100

 In the above example, Fortran order gives a better compression ratio. This is an
-artifical example but illustrates the general point that changing the order of
+artificial example but illustrates the general point that changing the order of
 bytes within chunks of an array may improve the compression ratio, depending on
 the structure of the data, the compression algorithm used, and which compression
 filters (e.g., byte-shuffle) have been applied.

requirements_dev_optional.txt

Lines changed: 4 additions & 4 deletions
@@ -7,7 +7,7 @@ ipytree==0.2.1
 # don't let pyup change pinning for azure-storage-blob, need to pin to older
 # version to get compatibility with azure storage emulator on appveyor (FIXME)
 azure-storage-blob==12.8.1 # pyup: ignore
-redis==4.0.0
+redis==4.0.2
 types-redis
 types-setuptools
 pymongo==3.12.1
@@ -16,8 +16,8 @@ tox==3.24.4
 coverage
 flake8==4.0.1
 pytest-cov==3.0.0
-pytest-doctestplus==0.11.0
+pytest-doctestplus==0.11.1
 pytest-timeout==2.0.1
-h5py==3.5.0
-fsspec[s3]==2021.11.0
+h5py==3.6.0
+fsspec[s3]==2021.11.1
 moto[server]>=1.3.14

setup.cfg

Lines changed: 3 additions & 0 deletions
@@ -0,0 +1,3 @@
+[codespell]
+skip = ./.git
+ignore-words-list = ba, ihs, kake, nd, noe, nwo, te

zarr/_storage/absstore.py

Lines changed: 2 additions & 1 deletion
@@ -76,7 +76,8 @@ def __init__(self, container=None, prefix='', account_name=None, account_key=Non
         self._account_name = account_name
         self._account_key = account_key

-    def _warn_deprecated(self, property_):
+    @staticmethod
+    def _warn_deprecated(property_):
         msg = ("The {} property is deprecated and will be removed in a future "
                "version. Get the property from 'ABSStore.client' instead.")
         warnings.warn(msg.format(property_), FutureWarning, stacklevel=3)

zarr/convenience.py

Lines changed: 1 addition & 1 deletion
@@ -428,7 +428,7 @@ def tree(grp, expand=False, level=None):
     return TreeViewer(grp, expand=expand, level=level)


-class _LogWriter(object):
+class _LogWriter:

     def __init__(self, log):
         self.log_func = None

zarr/core.py

Lines changed: 10 additions & 7 deletions
@@ -74,7 +74,7 @@ class Array:
         operations. If False, user attributes are reloaded from the store prior
         to all attribute read operations.
     partial_decompress : bool, optional
-        If True and while the chunk_store is a FSStore and the compresion used
+        If True and while the chunk_store is a FSStore and the compression used
         is Blosc, when getting data from the array chunks will be partially
         read and decompressed when possible.

@@ -347,6 +347,11 @@ def fill_value(self):
         """A value used for uninitialized portions of the array."""
         return self._fill_value

+    @fill_value.setter
+    def fill_value(self, new):
+        self._fill_value = new
+        self._flush_metadata_nosync()
+
     @property
     def order(self):
         """A string indicating the order in which bytes are arranged within
@@ -454,7 +459,7 @@ def nchunks_initialized(self):
         # count chunk keys
         return sum(1 for k in listdir(self.chunk_store, self._path) if prog.match(k))

-    # backwards compability
+    # backwards compatibility
     initialized = nchunks_initialized

     @property
@@ -1103,7 +1108,7 @@ def get_mask_selection(self, selection, out=None, fields=None):
         >>> import numpy as np
         >>> z = zarr.array(np.arange(100).reshape(10, 10))

-        Retrieve items by specifying a maks::
+        Retrieve items by specifying a mask::

         >>> sel = np.zeros_like(z, dtype=bool)
         >>> sel[1, 1] = True
@@ -1953,17 +1958,15 @@ def _chunk_delitems(self, ckeys):
         # that will trigger this condition, but it's possible that they
         # will be developed in the future.
         tuple(map(self._chunk_delitem, ckeys))
-        return None

     def _chunk_delitem(self, ckey):
         """
         Attempt to delete the value associated with ckey.
         """
         try:
             del self.chunk_store[ckey]
-            return
         except KeyError:
-            return
+            pass

     def _chunk_setitem(self, chunk_coords, chunk_selection, value, fields=None):
         """Replace part or whole of a chunk.
@@ -2065,7 +2068,7 @@ def _decode_chunk(self, cdata, start=None, nitems=None, expected_shape=None):
         if self._compressor:
             # only decode requested items
             if (
-                all([x is not None for x in [start, nitems]])
+                all(x is not None for x in [start, nitems])
                 and self._compressor.codec_id == "blosc"
             ) and hasattr(self._compressor, "decode_partial"):
                 chunk = self._compressor.decode_partial(cdata, start, nitems)
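
Rough illustration of the new ``fill_value`` setter shown above (not part of the diff itself): assigning to the property updates the in-memory value and flushes the array metadata, so chunks that were never written read back as the new fill value.

    import zarr

    z = zarr.zeros((4, 4), chunks=(2, 2), dtype="i4")
    z.fill_value = -1        # triggers _flush_metadata_nosync()
    print(z.fill_value)      # -1
    print(z[0, 0])           # -1: no chunk has been written yet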

zarr/creation.py

Lines changed: 2 additions & 2 deletions
@@ -136,7 +136,7 @@ def create(shape, chunks=True, dtype=None, compressor='default',
         store_separator = getattr(store, "_dimension_separator", None)
         if store_separator not in (None, dimension_separator):
             raise ValueError(
-                f"Specified dimension_separtor: {dimension_separator}"
+                f"Specified dimension_separator: {dimension_separator}"
                 f"conflicts with store's separator: "
                 f"{store_separator}")
     dimension_separator = normalize_dimension_separator(dimension_separator)
@@ -439,7 +439,7 @@ def open_array(
         If using an fsspec URL to create the store, these will be passed to
         the backend implementation. Ignored otherwise.
     partial_decompress : bool, optional
-        If True and while the chunk_store is a FSStore and the compresion used
+        If True and while the chunk_store is a FSStore and the compression used
         is Blosc, when getting data from the array chunks will be partially
         read and decompressed when possible.
     write_empty_chunks : bool, optional
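
A quick sketch of the separator conflict check whose message is fixed above (illustrative, hypothetical local path): a store that already carries a dimension separator rejects a conflicting ``dimension_separator`` argument.

    import zarr
    from zarr.storage import NestedDirectoryStore

    # NestedDirectoryStore carries "/" as its dimension separator, so asking
    # for "." at creation time trips the ValueError raised in create() above.
    store = NestedDirectoryStore("data/example.zarr")  # hypothetical path
    try:
        zarr.create(shape=(4, 4), chunks=(2, 2), store=store,
                    dimension_separator=".")
    except ValueError as exc:
        print(exc)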

zarr/indexing.py

Lines changed: 15 additions & 15 deletions
@@ -132,7 +132,7 @@ def normalize_integer_selection(dim_sel, dim_len):
     """


-class IntDimIndexer(object):
+class IntDimIndexer:

     def __init__(self, dim_sel, dim_len, dim_chunk_len):

@@ -157,7 +157,7 @@ def ceildiv(a, b):
     return math.ceil(a / b)


-class SliceDimIndexer(object):
+class SliceDimIndexer:

     def __init__(self, dim_sel, dim_len, dim_chunk_len):

@@ -308,19 +308,19 @@ def is_positive_slice(s):

 def is_contiguous_selection(selection):
     selection = ensure_tuple(selection)
-    return all([
+    return all(
         (is_integer_array(s) or is_contiguous_slice(s) or s == Ellipsis)
         for s in selection
-    ])
+    )


 def is_basic_selection(selection):
     selection = ensure_tuple(selection)
-    return all([is_integer(s) or is_positive_slice(s) for s in selection])
+    return all(is_integer(s) or is_positive_slice(s) for s in selection)


 # noinspection PyProtectedMember
-class BasicIndexer(object):
+class BasicIndexer:

     def __init__(self, selection, array):

@@ -361,7 +361,7 @@ def __iter__(self):
             yield ChunkProjection(chunk_coords, chunk_selection, out_selection)


-class BoolArrayDimIndexer(object):
+class BoolArrayDimIndexer:

     def __init__(self, dim_sel, dim_len, dim_chunk_len):

@@ -451,7 +451,7 @@ def boundscheck_indices(x, dim_len):
         raise BoundsCheckError(dim_len)


-class IntArrayDimIndexer(object):
+class IntArrayDimIndexer:
     """Integer array selection against a single dimension."""

     def __init__(self, dim_sel, dim_len, dim_chunk_len, wraparound=True, boundscheck=True,
@@ -579,7 +579,7 @@ def oindex_set(a, selection, value):


 # noinspection PyProtectedMember
-class OrthogonalIndexer(object):
+class OrthogonalIndexer:

     def __init__(self, selection, array):

@@ -649,7 +649,7 @@ def __iter__(self):
             yield ChunkProjection(chunk_coords, chunk_selection, out_selection)


-class OIndex(object):
+class OIndex:

     def __init__(self, array):
         self.array = array
@@ -671,8 +671,8 @@ def __setitem__(self, selection, value):
 def is_coordinate_selection(selection, array):
     return (
         (len(selection) == len(array._shape)) and
-        all([is_integer(dim_sel) or is_integer_array(dim_sel)
-             for dim_sel in selection])
+        all(is_integer(dim_sel) or is_integer_array(dim_sel)
+            for dim_sel in selection)
     )


@@ -686,7 +686,7 @@ def is_mask_selection(selection, array):


 # noinspection PyProtectedMember
-class CoordinateIndexer(object):
+class CoordinateIndexer:

     def __init__(self, selection, array):

@@ -805,7 +805,7 @@ def __init__(self, selection, array):
         super().__init__(selection, array)


-class VIndex(object):
+class VIndex:

     def __init__(self, array):
         self.array = array
@@ -905,7 +905,7 @@ def make_slice_selection(selection):
     return ls


-class PartialChunkIterator(object):
+class PartialChunkIterator:
     """Iterator to retrieve the specific coordinates of requested data
     from within a compressed chunk.
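
A side note on the repeated ``all([...])`` to ``all(...)`` change in this file (illustrative, not from the commit): passing a generator expression lets ``all()`` short-circuit on the first falsy element instead of building the full intermediate list first.

    def saw(x):
        print("checked", x)
        return x > 0

    values = [1, -2, 3, 4]
    print(all(saw(v) for v in values))    # stops after -2; two checks printed
    print(all([saw(v) for v in values]))  # builds whole list; four checks printed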
