Commit 826df44

Author: dcherian (committed)
Merge branch 'master' into maahn-groupy_plot2
* master: (51 commits)
  xarray.backends refactor (pydata#2261)
  Fix indexing error for data loaded with open_rasterio (pydata#2456)
  Properly support user-provided norm. (pydata#2443)
  pep8speaks (pydata#2462)
  isort (pydata#2469)
  tests shoudn't need to pass for a PR (pydata#2471)
  Replace the last of unittest with pytest (pydata#2467)
  Add python_requires to setup.py (pydata#2465)
  Update whats-new.rst (pydata#2466)
  Clean up _parse_array_of_cftime_strings (pydata#2464)
  plot.contour: Don't make cmap if colors is a single color. (pydata#2453)
  np.AxisError was added in numpy 1.13 (pydata#2455)
  Add CFTimeIndex.shift (pydata#2431)
  Fix FutureWarning in CFTimeIndex.date_type (pydata#2448)
  fix:2445 (pydata#2446)
  Enable use of cftime.datetime coordinates with differentiate and interp (pydata#2434)
  restore ddof support in std (pydata#2447)
  Future warning for default reduction dimension of groupby (pydata#2366)
  Remove incorrect statement about "drop" in the text docs (pydata#2439)
  Use profile mechanism, not no-op mutation (pydata#2442)
  ...
2 parents 87ef1cc + 289b377 commit 826df44

File tree: 104 files changed, +6486 / -2051 lines


.github/PULL_REQUEST_TEMPLATE.md

Lines changed: 0 additions & 1 deletion

@@ -1,4 +1,3 @@
 - [ ] Closes #xxxx (remove if there is no corresponding issue, which should only be the case for minor changes)
 - [ ] Tests added (for all bug fixes or enhancements)
-- [ ] Tests passed (for all non-documentation changes)
 - [ ] Fully documented, including `whats-new.rst` for all changes and `api.rst` for new API (remove if this change should not be visible to users, e.g., if it is an internal clean-up, or if this is part of a larger project that will be documented later)

.pep8speaks.yml

Lines changed: 11 additions & 0 deletions

@@ -0,0 +1,11 @@
+# File : .pep8speaks.yml
+
+scanner:
+    diff_only: True  # If True, errors caused by only the patch are shown
+
+pycodestyle:
+    max-line-length: 79
+    ignore:  # Errors and warnings to ignore
+        - E402,  # module level import not at top of file
+        - E731,  # do not assign a lambda expression, use a def
+        - W503  # line break before binary operator

.stickler.yml

Lines changed: 0 additions & 11 deletions
This file was deleted.

.travis.yml

Lines changed: 29 additions & 53 deletions
@@ -1,5 +1,5 @@
 # Based on http://conda.pydata.org/docs/travis.html
-language: python
+language: minimal
 sudo: false # use container based build
 notifications:
   email: false
@@ -10,72 +10,48 @@ branches:
 matrix:
   fast_finish: true
   include:
-  - python: 2.7
-    env: CONDA_ENV=py27-min
-  - python: 2.7
-    env: CONDA_ENV=py27-cdat+iris+pynio
-  - python: 3.5
-    env: CONDA_ENV=py35
-  - python: 3.6
-    env: CONDA_ENV=py36
-  - python: 3.6
-    env:
+  - env: CONDA_ENV=py27-min
+  - env: CONDA_ENV=py27-cdat+iris+pynio
+  - env: CONDA_ENV=py35
+  - env: CONDA_ENV=py36
+  - env: CONDA_ENV=py37
+  - env:
     - CONDA_ENV=py36
     - EXTRA_FLAGS="--run-flaky --run-network-tests"
-  - python: 3.6
-    env: CONDA_ENV=py36-netcdf4-dev
+  - env: CONDA_ENV=py36-netcdf4-dev
     addons:
       apt_packages:
       - libhdf5-serial-dev
       - netcdf-bin
       - libnetcdf-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-dask-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-pandas-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-bottleneck-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-condaforge-rc
-  - python: 3.6
-    env: CONDA_ENV=py36-pynio-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-rasterio-0.36
-  - python: 3.6
-    env: CONDA_ENV=py36-zarr-dev
-  - python: 3.5
-    env: CONDA_ENV=docs
-  - python: 3.6
-    env: CONDA_ENV=py36-hypothesis
+  - env: CONDA_ENV=py36-dask-dev
+  - env: CONDA_ENV=py36-pandas-dev
+  - env: CONDA_ENV=py36-bottleneck-dev
+  - env: CONDA_ENV=py36-condaforge-rc
+  - env: CONDA_ENV=py36-pynio-dev
+  - env: CONDA_ENV=py36-rasterio-0.36
+  - env: CONDA_ENV=py36-zarr-dev
+  - env: CONDA_ENV=docs
+  - env: CONDA_ENV=py36-hypothesis
+
   allow_failures:
-  - python: 3.6
-    env:
+  - env:
     - CONDA_ENV=py36
     - EXTRA_FLAGS="--run-flaky --run-network-tests"
-  - python: 3.6
-    env: CONDA_ENV=py36-netcdf4-dev
+  - env: CONDA_ENV=py36-netcdf4-dev
     addons:
       apt_packages:
       - libhdf5-serial-dev
      - netcdf-bin
      - libnetcdf-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-pandas-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-bottleneck-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-condaforge-rc
-  - python: 3.6
-    env: CONDA_ENV=py36-pynio-dev
-  - python: 3.6
-    env: CONDA_ENV=py36-zarr-dev
+  - env: CONDA_ENV=py36-pandas-dev
+  - env: CONDA_ENV=py36-bottleneck-dev
+  - env: CONDA_ENV=py36-condaforge-rc
+  - env: CONDA_ENV=py36-pynio-dev
+  - env: CONDA_ENV=py36-zarr-dev
 
 before_install:
-- if [[ "$TRAVIS_PYTHON_VERSION" == "2.7" ]]; then
-    wget http://repo.continuum.io/miniconda/Miniconda-3.16.0-Linux-x86_64.sh -O miniconda.sh;
-  else
-    wget http://repo.continuum.io/miniconda/Miniconda3-3.16.0-Linux-x86_64.sh -O miniconda.sh;
-  fi
+- wget http://repo.continuum.io/miniconda/Miniconda3-3.16.0-Linux-x86_64.sh -O miniconda.sh;
 - bash miniconda.sh -b -p $HOME/miniconda
 - export PATH="$HOME/miniconda/bin:$PATH"
 - hash -r
@@ -95,9 +71,9 @@ install:
 - python xarray/util/print_versions.py
 
 script:
-# TODO: restore this check once the upstream pandas issue is fixed:
-# https://github.com/pandas-dev/pandas/issues/21071
-# - python -OO -c "import xarray"
+- which python
+- python --version
+- python -OO -c "import xarray"
 - if [[ "$CONDA_ENV" == "docs" ]]; then
     conda install -c conda-forge sphinx sphinx_rtd_theme sphinx-gallery numpydoc;
     sphinx-build -n -j auto -b html -d _build/doctrees doc _build/html;

README.rst

Lines changed: 20 additions & 2 deletions
@@ -15,6 +15,8 @@ xarray: N-D labeled arrays and datasets
    :target: https://zenodo.org/badge/latestdoi/13221727
 .. image:: http://img.shields.io/badge/benchmarked%20by-asv-green.svg?style=flat
    :target: http://pandas.pydata.org/speed/xarray/
+.. image:: https://img.shields.io/badge/powered%20by-NumFOCUS-orange.svg?style=flat&colorA=E1523D&colorB=007D8A
+   :target: http://numfocus.org
 
 **xarray** (formerly **xray**) is an open source project and Python package that aims to bring the
 labeled data power of pandas_ to the physical sciences, by providing
@@ -103,20 +105,36 @@ Get in touch
 .. _mailing list: https://groups.google.com/forum/#!forum/xarray
 .. _on GitHub: http://github.com/pydata/xarray
 
+NumFOCUS
+--------
+
+.. image:: https://numfocus.org/wp-content/uploads/2017/07/NumFocus_LRG.png
+   :scale: 25 %
+   :target: https://numfocus.org/
+
+Xarray is a fiscally sponsored project of NumFOCUS_, a nonprofit dedicated
+to supporting the open source scientific computing community. If you like
+Xarray and want to support our mission, please consider making a donation_
+to support our efforts.
+
+.. _donation: https://www.flipcause.com/secure/cause_pdetails/NDE2NTU=
+
 History
 -------
 
 xarray is an evolution of an internal tool developed at `The Climate
 Corporation`__. It was originally written by Climate Corp researchers Stephan
 Hoyer, Alex Kleeman and Eugene Brevdo and was released as open source in
-May 2014. The project was renamed from "xray" in January 2016.
+May 2014. The project was renamed from "xray" in January 2016. Xarray became a
+fiscally sponsored project of NumFOCUS_ in August 2018.
 
 __ http://climate.com/
+.. _NumFOCUS: https://numfocus.org
 
 License
 -------
 
-Copyright 2014-2017, xarray Developers
+Copyright 2014-2018, xarray Developers
 
 Licensed under the Apache License, Version 2.0 (the "License");
 you may not use this file except in compliance with the License.

asv_bench/asv.conf.json

Lines changed: 1 addition & 0 deletions
@@ -64,6 +64,7 @@
         "scipy": [""],
         "bottleneck": ["", null],
         "dask": [""],
+        "distributed": [""],
     },

asv_bench/benchmarks/dataset_io.py

Lines changed: 42 additions & 1 deletion
@@ -1,11 +1,13 @@
 from __future__ import absolute_import, division, print_function
 
+import os
+
 import numpy as np
 import pandas as pd
 
 import xarray as xr
 
-from . import randn, randint, requires_dask
+from . import randint, randn, requires_dask
 
 try:
     import dask
@@ -14,6 +16,9 @@
     pass
 
 
+os.environ['HDF5_USE_FILE_LOCKING'] = 'FALSE'
+
+
 class IOSingleNetCDF(object):
     """
     A few examples that benchmark reading/writing a single netCDF file with
@@ -405,3 +410,39 @@ def time_open_dataset_scipy_with_time_chunks(self):
         with dask.set_options(get=dask.multiprocessing.get):
             xr.open_mfdataset(self.filenames_list, engine='scipy',
                               chunks=self.time_chunks)
+
+
+def create_delayed_write():
+    import dask.array as da
+    vals = da.random.random(300, chunks=(1,))
+    ds = xr.Dataset({'vals': (['a'], vals)})
+    return ds.to_netcdf('file.nc', engine='netcdf4', compute=False)
+
+
+class IOWriteNetCDFDask(object):
+    timeout = 60
+    repeat = 1
+    number = 5
+
+    def setup(self):
+        requires_dask()
+        self.write = create_delayed_write()
+
+    def time_write(self):
+        self.write.compute()
+
+
+class IOWriteNetCDFDaskDistributed(object):
+    def setup(self):
+        try:
+            import distributed
+        except ImportError:
+            raise NotImplementedError
+        self.client = distributed.Client()
+        self.write = create_delayed_write()
+
+    def cleanup(self):
+        self.client.shutdown()
+
+    def time_write(self):
+        self.write.compute()
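The benchmarks above rely on ``to_netcdf(..., compute=False)`` returning a delayed object whose work runs only when ``.compute()`` is called, so setup cost and write cost can be timed separately. A minimal stdlib sketch of that control flow (the ``Delayed`` class and ``write_chunk`` helper here are hypothetical stand-ins to illustrate the pattern, not dask's implementation):

```python
class Delayed:
    """Toy stand-in for a dask delayed object: records work, runs it later."""

    def __init__(self, func, *args):
        self.func = func
        self.args = args

    def compute(self):
        # The deferred work happens only now, not at construction time.
        return self.func(*self.args)


log = []

def write_chunk(chunk):
    log.append(chunk)  # stands in for writing one chunk to disk
    return len(chunk)

delayed = Delayed(write_chunk, [1, 2, 3])
assert log == []            # nothing has been "written" yet
result = delayed.compute()  # work happens here, as in time_write above
assert result == 3 and log == [[1, 2, 3]]
```

This is why ``setup`` can build ``self.write`` once while ``time_write`` measures only the ``.compute()`` call.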

asv_bench/benchmarks/unstacking.py

Lines changed: 26 additions & 0 deletions
@@ -0,0 +1,26 @@
+from __future__ import absolute_import, division, print_function
+
+import numpy as np
+
+import xarray as xr
+
+from . import requires_dask
+
+
+class Unstacking(object):
+    def setup(self):
+        data = np.random.RandomState(0).randn(1, 1000, 500)
+        self.ds = xr.DataArray(data).stack(flat_dim=['dim_1', 'dim_2'])
+
+    def time_unstack_fast(self):
+        self.ds.unstack('flat_dim')
+
+    def time_unstack_slow(self):
+        self.ds[:, ::-1].unstack('flat_dim')
+
+    
+class UnstackingDask(Unstacking):
+    def setup(self, *args, **kwargs):
+        requires_dask()
+        super(UnstackingDask, self).setup(**kwargs)
+        self.ds = self.ds.chunk({'flat_dim': 50})
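The fast/slow split in this benchmark can be sketched with plain numpy. Assumption (not taken from xarray's internals): when the stacked index is a full, in-order product of the original dimensions, unstacking is just the inverse reshape; once that order is disturbed, as by the ``[:, ::-1]`` slice above, a cheap reshape no longer recovers the original layout and a slower general reindexing path is required.

```python
import numpy as np

data = np.arange(6).reshape(2, 3)

# Stack dim_1 x dim_2 into a single flat dimension.
stacked = data.reshape(-1)

# The "fast" case: the flat order is untouched, so unstacking is a reshape.
unstacked = stacked.reshape(2, 3)
assert (unstacked == data).all()

# The "slow" case: reversing the flat dimension scrambles the product
# order, so the inverse reshape no longer reproduces the original array.
reordered = stacked[::-1]
assert not (reordered.reshape(2, 3) == data).all()
```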

ci/requirements-py37.yml

Lines changed: 13 additions & 0 deletions
@@ -0,0 +1,13 @@
+name: test_env
+channels:
+  - defaults
+dependencies:
+  - python=3.7
+  - pip:
+    - pytest
+    - flake8
+    - mock
+    - numpy
+    - pandas
+    - coveralls
+    - pytest-cov

doc/_static/numfocus_logo.png

24.4 KB (binary file added)

doc/api-hidden.rst

Lines changed: 2 additions & 0 deletions
@@ -151,3 +151,5 @@
    plot.FacetGrid.set_titles
    plot.FacetGrid.set_ticks
    plot.FacetGrid.map
+
+   CFTimeIndex.shift

doc/api.rst

Lines changed: 12 additions & 0 deletions
@@ -150,6 +150,7 @@ Computation
    Dataset.resample
    Dataset.diff
    Dataset.quantile
+   Dataset.differentiate
 
 **Aggregation**:
 :py:attr:`~Dataset.all`
@@ -317,6 +318,7 @@ Computation
    DataArray.diff
    DataArray.dot
    DataArray.quantile
+   DataArray.differentiate
 
 **Aggregation**:
 :py:attr:`~DataArray.all`
@@ -555,6 +557,13 @@ Custom Indexes
 
    CFTimeIndex
 
+Creating custom indexes
+-----------------------
+.. autosummary::
+   :toctree: generated/
+
+   cftime_range
+
 Plotting
 ========
 
@@ -615,3 +624,6 @@ arguments for the ``from_store`` and ``dump_to_store`` Dataset methods:
    backends.H5NetCDFStore
    backends.PydapDataStore
    backends.ScipyDataStore
+   backends.FileManager
+   backends.CachingFileManager
+   backends.DummyFileManager

doc/computation.rst

Lines changed: 25 additions & 0 deletions
@@ -200,6 +200,31 @@ You can also use ``construct`` to compute a weighted rolling sum:
 To avoid this, use ``skipna=False`` as the above example.
 
 
+Computation using Coordinates
+=============================
+
+Xarray objects have some handy methods for the computation with their
+coordinates. :py:meth:`~xarray.DataArray.differentiate` computes derivatives by
+central finite differences using their coordinates,
+
+.. ipython:: python
+
+    a = xr.DataArray([0, 1, 2, 3], dims=['x'], coords=[[0.1, 0.11, 0.2, 0.3]])
+    a
+    a.differentiate('x')
+
+This method can be used also for multidimensional arrays,
+
+.. ipython:: python
+
+    a = xr.DataArray(np.arange(8).reshape(4, 2), dims=['x', 'y'],
+                     coords={'x': [0.1, 0.11, 0.2, 0.3]})
+    a.differentiate('x')
+
+.. note::
+    This method is limited to simple cartesian geometry. Differentiation along
+    multidimensional coordinate is not supported.
+
 .. _compute.broadcasting:
 
 Broadcasting by dimension name
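The new documentation above describes ``differentiate`` as central finite differences against (possibly non-uniform) coordinate values. Those numerics can be reproduced with ``numpy.gradient``, which accepts a coordinate array as spacing; this is a sketch of the finite-difference behaviour, not necessarily xarray's exact code path:

```python
import numpy as np

# Non-uniform coordinate, as in the docs example above.
x = np.array([0.1, 0.11, 0.2, 0.3])

# Finite differences are exact for a linear function, even on a
# non-uniform grid: the derivative of 2*x + 1 is 2 everywhere.
f = 2 * x + 1
deriv = np.gradient(f, x)
assert np.allclose(deriv, 2.0)

# With the default edge_order=1, the first point falls back to a
# one-sided difference: (f[1] - f[0]) / (x[1] - x[0]).
vals = np.array([0.0, 1.0, 2.0, 3.0])
d = np.gradient(vals, x)
assert np.isclose(d[0], (vals[1] - vals[0]) / (x[1] - x[0]))
```

Coordinate-array spacing for ``np.gradient`` requires numpy >= 1.13, consistent with the "np.AxisError was added in numpy 1.13" commit in this merge.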

doc/data-structures.rst

Lines changed: 0 additions & 7 deletions
@@ -408,13 +408,6 @@ operations keep around coordinates:
    list(ds[['x']])
    list(ds.drop('temperature'))
 
-If a dimension name is given as an argument to ``drop``, it also drops all
-variables that use that dimension:
-
-.. ipython:: python
-
-    list(ds.drop('time'))
-
 As an alternate to dictionary-like modifications, you can use
 :py:meth:`~xarray.Dataset.assign` and :py:meth:`~xarray.Dataset.assign_coords`.
 These methods return a new dataset with additional (or replaced) values:

0 commit comments
