diff --git a/_toc.yml b/_toc.yml
index b6d299de8..21eb07f40 100644
--- a/_toc.yml
+++ b/_toc.yml
@@ -37,6 +37,10 @@
chapters:
- file: core/overview
- file: core/numpy
+ sections:
+ - file: core/numpy/numpy-basics
+ - file: core/numpy/intermediate-numpy
+ - file: core/numpy/numpy-broadcasting
- file: core/matplotlib
sections:
- file: core/matplotlib/matplotlib
diff --git a/core/numpy/array_index.png b/core/numpy/array_index.png
new file mode 100644
index 000000000..d1da26020
Binary files /dev/null and b/core/numpy/array_index.png differ
diff --git a/core/numpy/intermediate-numpy.ipynb b/core/numpy/intermediate-numpy.ipynb
new file mode 100644
index 000000000..eabfcd9ed
--- /dev/null
+++ b/core/numpy/intermediate-numpy.ipynb
@@ -0,0 +1,677 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "
\n",
+ "# Intermediate NumPy\n",
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Overview\n",
+ "1. Working with multiple dimensions\n",
+ "1. Subsetting of irregular arrays with booleans\n",
+ "1. Sorting, or indexing with indices"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Prerequisites\n",
+ "\n",
+ "| Concepts | Importance | Notes |\n",
+ "| --- | --- | --- |\n",
+ "| [NumPy Basics](numpy-basics) | Necessary | |\n",
+ "\n",
+ "* **Experience level**: user\n",
+ "* **Time to learn**: medium\n",
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Imports\n",
+ "We will be including [matplotlib](../matplotlib) to illustrate some of our examples, but you don't need knowledge of it to complete this notebook."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import matplotlib.pyplot as plt\n",
+ "import numpy as np"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Using axes to slice arrays\n",
+ "\n",
+ "Here we introduce an important concept when working with NumPy: the axis. This indicates the particular dimension along which a function should operate (provided the function does something taking multiple values and converts to a single value). \n",
+ "\n",
+ "Let's look at a concrete example with `sum`:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.arange(12).reshape(3, 4)\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This calculates the total of all values in the array"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.sum(a)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "
\n",
+ "
Info
\n",
+ " Some of NumPy's functions can be accessed as `ndarray` methods!\n",
+ "
"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.sum()"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now, with a reminder about how our array is shaped,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "we can specify `axis` to get _just_ the sum across each of our rows."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.sum(a, axis=0)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Or do the same and take the sum across columns:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.sum(a, axis=1)"
+ ]
+ },
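+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a quick aside, comparing shapes confirms which dimension each `axis` choice collapses:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# The axis we sum over disappears from the result's shape\n",
+    "a.shape, np.sum(a, axis=0).shape, np.sum(a, axis=1).shape"
+   ]
+  },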
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "After putting together some data and introducing some more advanced calculations, let's demonstrate a multi-layered example: calculating temperature advection. If you're not familiar with this (don't worry!), we'll be looking to calculate\n",
+ "\n",
+ "\\begin{equation*}\n",
+ "\\text{advection} = -\\vec{v} \\cdot \\nabla T\n",
+ "\\end{equation*}\n",
+ "\n",
+ "and to do so we'll start with some random $T$ and $\\vec{v}$ values,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp = np.random.randn(100, 50)\n",
+ "u = np.random.randn(100, 50)\n",
+ "v = np.random.randn(100, 50)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can calculate the `np.gradient` of our new $T(100x50)$ field as two separate component gradients,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "gradient_x, gradient_y = np.gradient(temp)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "In order to calculate $-\\vec{v} \\cdot \\nabla T$, we will use `np.dstack` to turn our two separate component gradient fields into one multidimensional field containing $x$ and $y$ gradients at each of our $100x50$ points,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "grad_vectors = np.dstack([gradient_x, gradient_y])\n",
+ "print(grad_vectors.shape)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and then do the same for our separate $u$ and $v$ wind components,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "wind_vectors = np.dstack([u, v])\n",
+ "print(wind_vectors.shape)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Finally, we can calculate the dot product of these two multidimensional fields of wind and temperature gradient components by hand as an element-wise multiplication, `*`, and then a `sum` of our separate components at each point (i.e., along the last `axis`),"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "advection = (wind_vectors * -grad_vectors).sum(axis=-1)\n",
+ "print(advection.shape)"
+ ]
+ },
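+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a sanity check (not part of the calculation itself), the same multiply-and-sum pattern can be written with `np.einsum`, summing over the shared last axis:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Equivalent dot product over the last axis using einsum notation\n",
+    "advection_einsum = np.einsum('ijk,ijk->ij', wind_vectors, -grad_vectors)\n",
+    "np.allclose(advection, advection_einsum)"
+   ]
+  },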
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Indexing arrays with boolean values\n",
+ "\n",
+ "### Array comparisons\n",
+ "NumPy can easily create arrays of boolean values and use those to select certain values to extract from an array"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Create some synthetic data representing temperature and wind speed data\n",
+ "np.random.seed(19990503) # Make sure we all have the same data\n",
+ "temp = 20 * np.cos(np.linspace(0, 2 * np.pi, 100)) + 50 + 2 * np.random.randn(100)\n",
+ "speed = np.abs(\n",
+ " 10 * np.sin(np.linspace(0, 2 * np.pi, 100)) + 10 + 5 * np.random.randn(100)\n",
+ ")"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "plt.plot(temp, 'tab:red')\n",
+ "plt.plot(speed, 'tab:blue');"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "By doing a comparison between a NumPy array and a value, we get an\n",
+ "array of values representing the results of the comparison between\n",
+ "each element and the value"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp > 45"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This, which is its own NumPy array of `boolean` values, can be used as an index to another array of the same size. We can even use it as an index within the original `temp` array we used to compare,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp[temp > 45]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "
Info
\n",
+ " This only returns the values from our original array meeting the indexing conditions, nothing more! Note the size,\n",
+ "
"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp[temp > 45].shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "
Warning
\n",
+ " Indexing arrays with arrays requires them to be the same size!\n",
+ "
"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If we store this array somewhere new,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp_45 = temp[temp > 45]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "raises-exception"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "temp_45[temp < 45]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We find that our original `(100,)` shape array is too large to subset our new `(60,)` array."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If their sizes _do_ match, the boolean array can come from a totally different array!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "speed > 10"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp[speed > 10]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Replacing values\n",
+ "To extend this, we can use this conditional indexing to _assign_ new values to certain positions within our array, somewhat like a masking operation."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Make a copy so we don't modify the original data\n",
+ "temp2 = temp.copy()\n",
+ "speed2 = speed.copy()\n",
+ "\n",
+ "# Replace all places where speed is <10 with NaN (not a number)\n",
+ "temp2[speed < 10] = np.nan\n",
+ "speed2[speed < 10] = np.nan"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "plt.plot(temp2, 'tab:red');"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and to put this in context,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "plt.plot(temp, 'r:')\n",
+ "plt.plot(temp2, 'r')\n",
+ "plt.plot(speed, 'b:')\n",
+ "plt.plot(speed2, 'b');"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "If we use parentheses to preserve the order of operations, we can combine these conditions with other bitwise operators like the `&` for `bitwise_and`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "multi_mask = (temp < 45) & (speed > 10)\n",
+ "multi_mask"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp[multi_mask]"
+ ]
+ },
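+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "The other bitwise operators work the same way; for example, `|` (`bitwise_or`) keeps points where _either_ condition holds. A quick comparison of how many points each mask selects:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "either_mask = (temp < 45) | (speed > 10)\n",
+    "np.count_nonzero(multi_mask), np.count_nonzero(either_mask)"
+   ]
+  },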
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Heat index is only defined for temperatures >= 80F and relative humidity values >= 40%. Using the data generated below, we can use boolean indexing to extract the data where heat index has a valid value."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Here's the \"data\"\n",
+ "np.random.seed(19990503)\n",
+ "temp = 20 * np.cos(np.linspace(0, 2 * np.pi, 100)) + 80 + 2 * np.random.randn(100)\n",
+ "relative_humidity = np.abs(\n",
+ " 20 * np.cos(np.linspace(0, 4 * np.pi, 100)) + 50 + 5 * np.random.randn(100)\n",
+ ")\n",
+ "\n",
+ "# Create a mask for the two conditions described above\n",
+ "good_heat_index = (temp >= 80) & (relative_humidity >= 0.4)\n",
+ "\n",
+ "# Use this mask to grab the temperature and relative humidity values that together\n",
+ "# will give good heat index values\n",
+ "print(temp[good_heat_index])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Another bitwise operator we can find helpful is Python's `~` complement operator, which can give us the **inverse** of our specific mask to let us assign `np.nan` to every value _not_ satisfied in `good_heat_index`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "plot_temp = temp.copy()\n",
+ "plot_temp[~good_heat_index] = np.nan\n",
+ "plt.plot(plot_temp, 'tab:red');"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Indexing using arrays of indices\n",
+ "\n",
+ "You can also use a list or array of indices to extract particular values--this is a natural extension of the regular indexing. For instance, just as we can select the first element:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp[0]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can also extract the first, fifth, and tenth elements as a list:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp[[0, 4, 9]]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "One of the ways this comes into play is trying to sort NumPy arrays using `argsort`. This function returns the indices of the array that give the items in sorted order. So for our `temp`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "inds = np.argsort(temp)\n",
+ "inds"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "i.e., our lowest value is at index `52`, next `47`, and so on. We can use this array of indices as an index for `temp`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temp[inds]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "to get a sorted array back!"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "With some clever slicing, we can pull out the last 10, or 10 highest, values of `temp`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "ten_highest = inds[-10:]\n",
+ "print(temp[ten_highest])"
+ ]
+ },
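+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a quick check, indexing with the full `argsort` result is equivalent to sorting the array directly with `np.sort`:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "np.all(temp[inds] == np.sort(temp))"
+   ]
+  },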
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "There are other NumPy `arg` functions that return indices for operating; check out the [NumPy docs](https://numpy.org/doc/stable/reference/routines.sort.html) on sorting your arrays!"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Summary\n",
+ "In this notebook we introduced the power of understanding the dimensions of our data by specifying math along `axis`, used `True` and `False` values to subset our data according to conditions, and used lists of positions within our array to sort our data.\n",
+ "\n",
+ "### What's Next\n",
+ "Taking some time to practice this is valuable to be able to quickly manipulate arrays of information in useful or scientific ways."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Resources and references\n",
+ "The [NumPy Users Guide](https://numpy.org/devdocs/user/quickstart.html#less-basic) expands further on some of these topics, as well as suggests various [Tutorials](https://numpy.org/learn/), lectures, and more at this stage."
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.8.10"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/core/numpy/numpy-basics.ipynb b/core/numpy/numpy-basics.ipynb
new file mode 100644
index 000000000..1f1ce298c
--- /dev/null
+++ b/core/numpy/numpy-basics.ipynb
@@ -0,0 +1,1035 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "
\n",
+ "# NumPy Basics\n",
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Overview\n",
+ "NumPy is the fundamental package for scientific computing with Python. It contains among other things:\n",
+ "\n",
+ "- a powerful N-dimensional array object\n",
+ "- sophisticated (broadcasting) functions\n",
+ "- useful linear algebra, Fourier transform, and random number capabilities\n",
+ "\n",
+ "The NumPy array object is the common interface for working with typed arrays of data across a wide-variety of scientific Python packages. NumPy also features a C-API, which enables interfacing existing Fortran/C/C++ libraries with Python and NumPy. In this notebook we will cover\n",
+ "\n",
+ "1. Creating an `array`\n",
+ "1. Math and calculations with arrays\n",
+ "1. Inspecting an array with slicing and indexing"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Prerequisites\n",
+ "\n",
+ "| Concepts | Importance | Notes |\n",
+ "| --- | --- | --- |\n",
+ "| [Python Quickstart](../../foundations/quickstart) | Necessary | Lists, indexing, slicing, math |\n",
+ "\n",
+ "* **Experience level**: beginner\n",
+ "* **Time to learn**: medium\n",
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Imports\n",
+ "A common convention you might encounter is to rename `numpy` to `np` on import to shorten it for the many times we will be calling on `numpy` for functionality."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import numpy as np"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Create an array of 'data'\n",
+ "\n",
+ "The NumPy array represents a *contiguous* block of memory, holding entries of a given type (and hence fixed size). The entries are laid out in memory according to the shape, or list of dimension sizes. Let's start by creating an array from a list of integers and taking a look at it,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.array([1, 2, 3])\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can inspect the number of dimensions our array is organized along with `ndim`, and how long each of these dimensions are with `shape`"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.ndim"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "So our 1-dimensional array has a shape of `3` along that dimension! Finally we can check out the underlying type of our underlying data,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.dtype"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now, let's expand this with a new data type, and by using a list of lists we can grow the dimensions of our array!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.ndim"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.shape"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.dtype"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "And as before we can use `ndim`, `shape`, and `dtype` to discover how many dimensions of what lengths are making up our array of floats."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Generation\n",
+ "NumPy also provides helper functions for generating arrays of data to save you typing for regularly spaced data. Don't forget your Python indexing rules!\n",
+ "\n",
+ "* `arange(start, stop, step)` creates a range of values in the interval `[start,stop)` with `step` spacing.\n",
+ "* `linspace(start, stop, num)` creates a range of `num` evenly spaced values over the range `[start,stop]`."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### arange"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.arange(5)\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.arange(3, 11)\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.arange(1, 10, 2)\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### linspace"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.linspace(0, 4, 5)\n",
+ "b"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b.shape"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.linspace(3, 10, 15)\n",
+ "b"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.linspace(2.5, 10.25, 11)\n",
+ "b"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.linspace(0, 100, 30)\n",
+ "b"
+ ]
+ },
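+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a small illustration (with values chosen just for this comparison), note how the two functions treat the `stop` value differently:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# arange excludes the stop value; linspace includes it by default\n",
+    "np.arange(0, 1, 0.25), np.linspace(0, 1, 5)"
+   ]
+  },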
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Perform calculations with NumPy\n",
+ "\n",
+ "### Arithmetic\n",
+ "\n",
+ "In core Python, that is *without* NumPy, creating sequences of values and adding them together requires writing a lot of manual loops, just like one would do in C/C++:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = list(range(5, 10))\n",
+ "b = [3 + i * 1.5 / 4 for i in range(5)]\n",
+ "\n",
+ "a, b"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "result = []\n",
+ "for x, y in zip(a, b):\n",
+ " result.append(x + y)\n",
+ "print(result)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "That is very verbose and not very intuitive. Using NumPy this becomes:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.arange(5, 10)\n",
+ "b = np.linspace(3, 4.5, 5)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a + b"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Many major mathematical operations operate in the same way. They perform an element-by-element calculation of the two arrays."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a - b"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a / b"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a ** b"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "
Warning
\n",
+ " These arrays must be the same shape!\n",
+ "
"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.linspace(3, 4.5, 6)\n",
+ "a.shape, b.shape"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "raises-exception"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "a * b"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Constants\n",
+ "\n",
+ "NumPy provides us access to some useful constants as well - remember you should never be typing these in manually! Other libraries such as SciPy and MetPy have their own set of constants that are more domain specific."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.pi"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.e"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "You can use these for classic calculations you might be familiar with! Here we can create a range `t = [0, 2 pi]` by `pi/4`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "t = np.arange(0, 2 * np.pi + np.pi / 4, np.pi / 4)\n",
+ "t"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "t / np.pi"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Array math functions\n",
+ "\n",
+ "NumPy also has math functions that can operate on arrays. Similar to the math operations, these greatly simplify and speed up these operations. Let's start with calculating $\\sin(t)$!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "sin_t = np.sin(t)\n",
+ "sin_t"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and clean it up a bit by `round`ing to three decimal places."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.round(sin_t, 3)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "cos_t = np.cos(t)\n",
+ "cos_t"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "
Info
\n",
+ " Check out NumPy's list of mathematical functions
here!\n",
+ "
"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can convert between degrees and radians with only NumPy, by hand"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "t / np.pi * 180"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "or with built-in function `rad2deg`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "degrees = np.rad2deg(t)\n",
+ "degrees"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We are similarly provided algorithms for operations including integration, bulk summing, and cumulative summing."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "sine_integral = np.trapz(sin_t, t)\n",
+ "np.round(sine_integral, 3)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "cos_sum = np.sum(cos_t)\n",
+ "cos_sum"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "cos_csum = np.cumsum(cos_t)\n",
+ "print(cos_csum)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Indexing and subsetting arrays\n",
+ "\n",
+ "### Indexing\n",
+ "\n",
+ "We can use integer indexing to reach into our arrays and pull out individual elements. Let's make a toy 2-d array to explore. Here we create a 12-value `arange` and `reshape` it into a 3x4 array."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.arange(12).reshape(3, 4)\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Recall that Python indexing starts at `0`, and we can begin indexing our array with the list-style `list[element]` notation,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[0]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "to pull out just our first _row_ of data within `a`. Similarly we can index in reverse with negative indices,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[-1]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "to pull out just the last row of data within `a`. This notation extends to as many dimensions as make up our array as `array[m, n, p, ...]`. The following diagram shows these indices for an example, 2-dimensional `6x6` array,"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ ""
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "For example, let's find the entry in our array corresponding to the 2nd row (`m=1` in Python) and the 3rd column (`n=2` in Python)"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[1, 2]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can again use these negative indices to index backwards,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[-1, -1]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and even mix-and-match along dimensions,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[1, -2]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Slices\n",
+ "\n",
+ "Slicing syntax is written as `array[start:stop[:step]]`, where **all numbers are optional**.\n",
+ "- defaults: \n",
+ " - start = 0\n",
+ " - stop = len(dim)\n",
+ " - step = 1\n",
+ "- The second colon is **also optional** if no step is used.\n",
+ "\n",
+ "Let's pull out just the first row, `m=0` of `a` and see how this works!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = a[0]\n",
+ "b"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Laying out our default slice to see the entire array explicitly looks something like this,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[0:4:1]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "where again, these default values are optional,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[::]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and even the second `:` is optional"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[:]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now to actually make our own slice, let's select select all elements from `m=0` to `m=2`"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[0:2]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "\n",
+ "
Warning
\n",
+ " Slice notation is
exclusive of the final index.\n",
+ "
"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This means that slices will include every value **up to** your `stop` index and not this index itself, like a half-open interval `[start, end)`. For example,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[3]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "reveals a different value than"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[0:3]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Finally, a few more examples of this notation before reintroducing our 2-d array `a`."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[2:] # m=2 through the end, can leave off the number"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b[:3] # similarly, the same as our b[0:3]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Multidimensional slicing\n",
+ "This entire syntax can be extended to each dimension of multidimensional arrays."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "First let's pull out rows `0` through `2`, and then every `:` column for each of those"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[0:2, :]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Similarly, let's get all rows for just column `2`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[:, 2]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "or just take a look at the full row `:`, for every second column, `::2`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[:, ::2]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "For any shape of array, you can use `...` to capture full slices of every non-specified dimension. Consider the 3-D array,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "c = a.reshape(2, 2, 3)\n",
+ "c"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "c[0, ...]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and so this is equivalent to"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "c[0, :, :]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "for extracting every dimension across our first row. We can also flip this around,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "c[..., -1]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "to investigate every preceding dimension along our the last entry of our last axis, the same as `c[:, :, -1]`."
+ ]
+ },
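+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "We can confirm that both `...` spellings select exactly the same values as their fully written-out counterparts:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "np.array_equal(c[0, ...], c[0, :, :]), np.array_equal(c[..., -1], c[:, :, -1])"
+   ]
+  },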
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Summary\n",
+ "In this notebook we introduced NumPy and the `ndarray` that is so crucial to the entirety of the scientific Python community ecosystem. We created some arrays, used some of NumPy's own mathematical functions to manipulate them, and then introduced the world of NumPy indexing and selecting for even multi-dimensional arrays.\n",
+ "\n",
+ "### What's next?\n",
+ "This notebook is the gateway to nearly every other Pythia resource here. This information is crucial for understanding SciPy, pandas, xarray, and more. Continue into NumPy to explore some more intermediate and advanced topics!"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Resources and references\n",
+ "- [NumPy User Guide](http://docs.scipy.org/doc/numpy/user/)\n",
+ "- [SciPy Lecture Notes](https://scipy-lectures.org/)"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.8.10"
+ },
+ "toc-autonumbering": false
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/core/numpy/numpy-broadcasting.ipynb b/core/numpy/numpy-broadcasting.ipynb
new file mode 100644
index 000000000..3f60643ea
--- /dev/null
+++ b/core/numpy/numpy-broadcasting.ipynb
@@ -0,0 +1,938 @@
+{
+ "cells": [
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "
\n",
+ "# NumPy Broadcasting\n",
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Overview\n",
+ "Before we begin, broadcasting is a valuable part of the power that NumPy provides. However, there's no looking past the fact that broadcasting can be conceptually difficult to digest. This information can be helpful and very powerful, but we also suggest moving on to take a look at some of the label-based corners of the Python ecosystem, namely [pandas](../pandas) and [xarray](../xarray) for the ways that they make some of these concepts simpler or easier to use for real-world data.\n",
+ "\n",
+ "1. An introduction to broadcasting\n",
+ "1. Avoiding loops with vectorization"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Prerequisites\n",
+ "\n",
+ "| Concepts | Importance | Notes |\n",
+ "| --- | --- | --- |\n",
+ "| [NumPy Basics](numpy-basics) | Necessary | |\n",
+ "| [Intermediate NumPy](intermediate-numpy) | Helpful | |\n",
+ "| [Conceptual guide to broadcasting](https://numpy.org/doc/stable/user/theory.broadcasting.html#array-broadcasting-in-numpy) | Helpful | |\n",
+ "\n",
+ "* **Experience level**: advanced\n",
+ "* **Time to learn**: medium\n",
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Imports"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import numpy as np"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Using broadcasting to implicitly loop over data"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### What is broadcasting?\n",
+ "Broadcasting is a useful NumPy tool that allows us to perform operations between arrays with different shapes, provided that they are compatible with each other in certain ways. To start, we can create an array below and add 5 to it:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "import numpy as np\n",
+ "\n",
+ "a = np.array([10, 20, 30, 40])\n",
+ "a + 5"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This works even though 5 is not an array; it works like as we would expect, adding 5 to each of the elements in `a`. This also works if 5 is an array:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.array([5])\n",
+ "a + b"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This takes the single element in `b` and adds it to each of the elements in `a`. This won't work for just any `b`, though; for instance, the following:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {
+ "tags": [
+ "raises-exception"
+ ]
+ },
+ "outputs": [],
+ "source": [
+ "b = np.array([5, 6, 7])\n",
+ "a + b"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "won't work. It does work if `a` and `b` are the same shape:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.array([5, 5, 10, 10])\n",
+ "a + b"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "What if what we really want is pairwise addition of a, b? Without broadcasting, we could accomplish this by looping:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b = np.array([1, 2, 3, 4, 5])"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "result = np.empty((5, 4), dtype=np.int32)\n",
+ "for row, valb in enumerate(b):\n",
+ " for col, vala in enumerate(a):\n",
+ " result[row, col] = vala + valb\n",
+ "result"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can also do this by manually repeating the arrays to the proper shape for the result, using `np.tile`. This avoids the need to manually loop:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "aa = np.tile(a, (5, 1))\n",
+ "aa"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Turn b into a column array, then tile it\n",
+ "bb = np.tile(b.reshape(5, 1), (1, 4))\n",
+ "bb"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "aa + bb"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Giving NumPy room for broadcasting\n",
+ "We can also do this using broadcasting, which is where NumPy implicitly repeats the array without using additional memory. With broadcasting, NumPy takes care of repeating for you, provided dimensions are \"compatible\". This works as:\n",
+ "1. Check the number of dimensions of the arrays. If they are different, *prepend* size one dimensions\n",
+ "2. Check if each of the dimensions are compatible: either the same size, or one of them is 1."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a.shape"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "b.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Right now, they have the same number of dimensions, 1, but that dimension is incompatible. We can solve this by appending a dimension using `np.newaxis` when indexing:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "bb = b[:, np.newaxis]\n",
+ "bb.shape"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a + bb"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "(a + bb).shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This can be written more directly in one line:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a + b[:, np.newaxis]"
+ ]
+ },
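+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "If you just want to check whether two shapes are compatible (without performing an operation), NumPy 1.20 and later provide `np.broadcast_shapes`, which applies these same rules to the shapes alone:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Returns the broadcast result shape, or raises an error if the shapes are incompatible\n",
+    "np.broadcast_shapes(a.shape, bb.shape)"
+   ]
+  },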
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Extending to higher dimensions\n",
+ "This also works for higher dimensions. `x`, `y`, and `z` are here different dimensions, and we can broadcast to perform $x^2 + y^2 + z^2$,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "x = np.array([1, 2])\n",
+ "y = np.array([3, 4, 5])\n",
+ "z = np.array([6, 7, 8, 9])"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "First, let's extend `x` (and square it) by one dimension, onto which we can broadcast the vector `y ** 2`,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "d_2d = x[:, np.newaxis] ** 2 + y ** 2"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "d_2d.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and then further extend this new 2-D array by one more dimension before using broadcasting to add `z ** 2` across all other dimensions."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "d_3d = d_2d[..., np.newaxis] + z ** 2"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "d_3d.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Or in one line:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "h = x[:, np.newaxis, np.newaxis] ** 2 + y[np.newaxis, :, np.newaxis] ** 2 + z ** 2"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can see this one-line result has the same shape and same values as the other multi-step calculation."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "h.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and we can confirm that the results here are identical,"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.all(h == d_3d)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Broadcasting is often useful when you want to do calculations with coordinate values, which are often given as 1-D arrays corresponding to positions along a particular array dimension. For example, taking range and azimuth values for radar data (1-D separable polar coordinates) and converting to x,y pairs relative to the radar location."
+ ]
+ },
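+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "As a rough sketch of that radar example (the range and azimuth values below are made up purely for illustration), broadcasting the two 1-D coordinate arrays against each other produces full 2-D `x` and `y` fields:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "# Hypothetical 1-D polar coordinates: range gates (meters) and azimuth angles\n",
+    "radar_range = np.linspace(0, 100e3, 201)\n",
+    "azimuth = np.deg2rad(np.arange(0, 360, 1.0))\n",
+    "\n",
+    "# Broadcast (360, 1) against (201,) to get (360, 201) Cartesian coordinates\n",
+    "x = radar_range * np.sin(azimuth[:, np.newaxis])\n",
+    "y = radar_range * np.cos(azimuth[:, np.newaxis])\n",
+    "x.shape, y.shape"
+   ]
+  },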
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Given the 3-D temperature field and 1-D pressure coordinates below, let's calculate $T * exp(P / 1000)$. We will need to use broadcasting to make the arrays compatible!"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "pressure = np.array([1000, 850, 500, 300])\n",
+ "temps = np.linspace(20, 30, 24).reshape(4, 3, 2)\n",
+ "pressure.shape, temps.shape"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "pressure[:, np.newaxis, np.newaxis].shape"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temps * np.exp(pressure[:, np.newaxis, np.newaxis] / 1000)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Vectorize calculations to avoid explicit loops"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "When working with arrays of data, loops over the individual array elements is a fact of life. However, for improved runtime performance, it is important to avoid performing these loops in Python as much as possible, and let NumPy handle the looping for you. Avoiding these loops frequently, but not always, results in shorter and clearer code as well."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Look ahead/behind\n",
+ "\n",
+ "One common pattern for vectorizing is in converting loops that work over the current point as well as the previous and/or next point. This comes up when doing finite-difference calculations, e.g. approximating derivatives,\n",
+ "\n",
+ "\\begin{equation*}\n",
+ "f'(x) = f_{i+1} - f_{i}\n",
+ "\\end{equation*}"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a = np.linspace(0, 20, 6)\n",
+ "a"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can calculate the forward difference for this array with a manual loop as:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "d = np.zeros(a.size - 1)\n",
+ "for i in range(len(a) - 1):\n",
+ " d[i] = a[i + 1] - a[i]\n",
+ "d"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "It would be nice to express this calculation without a loop, if possible. To see how to go about this, let's consider the values that are involved in calculating `d[i]`, `a[i+1]` and `a[i]`. The values over the loop iterations are:\n",
+ "\n",
+ "| i | a[i+1] | a[i] |\n",
+ "| --- | ---- | ---- |\n",
+ "| 0 | 4 | 0 |\n",
+ "| 1 | 8 | 4 |\n",
+ "| 2 | 12 | 8 |\n",
+ "| 3 | 16 | 12 |\n",
+ "| 4 | 20 | 16 |\n",
+ "\n",
+ "We can express the series of values for `a[i+1]` then as:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[1:]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "and `a[i]` as:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[:-1]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "This means that we can express the forward difference as:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "a[1:] - a[:-1]"
+ ]
+ },
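+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "NumPy also ships a convenience function, `np.diff`, that computes exactly this forward difference; a quick check that the two agree:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "np.array_equal(np.diff(a), a[1:] - a[:-1])"
+   ]
+  },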
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "It should be noted that using slices in this way returns only a **view** on the original array. This means not only can you use the slices to modify the original data (even accidentally), but that this is also a quick operation that does not involve a copy and does not bloat memory usage."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "#### 2nd Derivative\n",
+ " \n",
+ "A finite difference estimate of the 2nd derivative is given by:\n",
+ "\n",
+ "\\begin{equation*}\n",
+ "f''(x) = 2\n",
+ "f_i - f_{i+1} - f_{i-1}\n",
+ "\\end{equation*}\n",
+ "\n",
+ "(we're ignoring $\\Delta x$ here)\n",
+ "\n",
+ "Let's write some vectorized code to calculate this finite difference for `a` (using slices.) What values should we be expecting to get for the 2nd derivative?"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "2 * a[1:-1] - a[:-2] - a[2:]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Blocking\n",
+ "\n",
+ "Another application where vectorization comes into play to make operations more efficient is when operating on blocks of data. Let's start by creating some temperature data (rounding to make it easier to see/recognize the values)."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temps = np.round(20 + np.random.randn(10) * 5, 1)\n",
+ "temps"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's start by writing a loop to take a 3-point running mean of the data. We'll do this by iterating over all points in the array and average the 3 points centered on that point. We'll simplify the problem by avoiding dealing with the cases at the edges of the array."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "avg = np.zeros_like(temps)\n",
+ "for i in range(1, len(temps) - 1):\n",
+ " sub = temps[i - 1 : i + 2]\n",
+ " avg[i] = sub.mean()"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "avg"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As with the case of doing finite differences, we can express this using slices of the original array:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# i - 1 i i + 1\n",
+ "(temps[:-2] + temps[1:-1] + temps[2:]) / 3"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Another option to solve this is not using slicing but by using a powerful numpy tool: `as_strided`. This tool can result in some odd behavior, so take care when using--the trade-off is that this can be used to do some powerful operations. What we're doing here is altering how NumPy is interpreting the values in the memory that underpins the array. So for this array:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temps"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "we can create a view of the array with a new, bigger shape, with rows made up of overlapping values. We do this by specifying a new shape of 8x3, one row for each of the length 3 blocks we can fit in the original 1-D array of data. We then use the `strides` argument to control how numpy walks between items in each dimension. The last item in the strides tuple is just as normal--it says that the number of bytes to walk between items is just the size of an item. (Increasing this would skip items.) The first item says that when we go to a new, in this case row, only advance the size of a single item. This is what gives us overlapping rows."
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "block_size = 3\n",
+ "new_shape = (len(temps) - block_size + 1, block_size)\n",
+ "bytes_per_item = temps.dtype.itemsize\n",
+ "temps_strided = np.lib.stride_tricks.as_strided(\n",
+ " temps, shape=new_shape, strides=(bytes_per_item, bytes_per_item)\n",
+ ")\n",
+ "temps_strided"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now that we have this view of the array with the rows representing overlapping blocks, we can operate across the rows with `mean` and the `axis=-1` argument to get our running average:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temps_strided.mean(axis=-1)"
+ ]
+ },
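+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "Before moving on, we can verify that this strided running mean matches the interior points of the loop-based `avg` computed earlier:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "np.allclose(temps_strided.mean(axis=-1), avg[1:-1])"
+   ]
+  },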
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "It should be noted that there are no copies going on here, so if we change a value at a single indexed location, the change actually shows up in multiple locations:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temps_strided[0, 2] = 2000\n",
+ "temps_strided"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Finding the difference between min and max\n",
+ "\n",
+ "Another operation that crops up when slicing and dicing data is trying to identify a set of indexes, along a particular axis, within a larger multidimensional array. For instance, say we have a 3-D array of temperatures, and want to identify the location of the $-10^oC$ isotherm within each column:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "pressure = np.linspace(1000, 100, 25)\n",
+ "temps = np.random.randn(25, 30, 40) * 3 + np.linspace(25, -100, 25).reshape(-1, 1, 1)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "NumPy has the function `argmin()` which returns the index of the minimum value. We can use this to find the minimum absolute difference between the value and -10:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# Using axis=0 to tell it to operate along the pressure dimension\n",
+ "inds = np.argmin(np.abs(temps - -10), axis=0)\n",
+ "inds"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "inds.shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Great! We have an array representing the index of the point closest to $-10^oC$ in each column of data. We could use this to look up into our pressure coordinates to find the pressure level for each column:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "pressure[inds]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "How about using that to find the actual temperature value that was closest?"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "temps[inds, :, :].shape"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Unfortunately, this replaced the pressure dimension (size 25) with the shape of our index array (30 x 40), giving us a 30 x 40 x 30 x 40 array (imagine what would have happened with real data!). One solution here would be to loop:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "output = np.empty(inds.shape, dtype=temps.dtype)\n",
+ "for (i, j), val in np.ndenumerate(inds):\n",
+ " output[i, j] = temps[val, i, j]\n",
+ "output"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Of course, what we really want to do is avoid the explicit loop. Let's temporarily simplify the problem to a single dimension. If we have a 1-D array, we can pass a 1-D array of indices (a full) range, and get back the same as the original data array:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "pressure[np.arange(pressure.size)]"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.all(pressure[np.arange(pressure.size)] == pressure)"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "We can use this to select all the indices on the other dimensions of our temperature array. We will also need to use the magic of broadcasting to combine arrays of indices across dimensions:"
+ ]
+ },
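+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "It helps to look at the shapes involved first. Our index array `inds` has shape `(30, 40)`; if we build the row indices with shape `(30, 1)` and the column indices with shape `(40,)`, broadcasting combines all three into exactly the `(30, 40)` result shape we want. Assuming NumPy 1.20 or later, `np.broadcast_shapes` confirms this:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# The three index arrays broadcast together to the desired (30, 40) result shape\n",
+ "np.broadcast_shapes(inds.shape, (temps.shape[1], 1), (temps.shape[2],))"
+ ]
+ },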
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Now vectorized solution:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "y_inds = np.arange(temps.shape[1])[:, np.newaxis]\n",
+ "x_inds = np.arange(temps.shape[2])\n",
+ "temps[inds, y_inds, x_inds]"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "Let's say we want to find the relative humidity at the $-10^oC$ isotherm"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "np.all(output == temps[inds, y_inds, x_inds])"
+ ]
+ },
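+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "As an aside, NumPy also provides `np.take_along_axis` (since version 1.15), which performs this same gather along one axis in a single call. The index array just needs the same number of dimensions as the data, so we add a length-1 leading axis first; a quick sketch:"
+ ]
+ },
+ {
+ "cell_type": "code",
+ "execution_count": null,
+ "metadata": {},
+ "outputs": [],
+ "source": [
+ "# inds[np.newaxis, :, :] has shape (1, 30, 40); taking along axis=0 picks one\n",
+ "# temperature per column, and squeeze drops the leftover length-1 axis\n",
+ "isotherm_temps = np.take_along_axis(temps, inds[np.newaxis, :, :], axis=0).squeeze(axis=0)\n",
+ "np.all(isotherm_temps == output)"
+ ]
+ },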
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "---"
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "## Summary\n",
+ "We've previewed some advanced NumPy capabilities with a focus on _vectorization_, or using clever broadcasting and windows of our data to enhance the speed and readability of our calculations. Doing so can reduce explicit construction of loops in your code and keep calculations running quickly!\n",
+ "\n",
+ "### What's next\n",
+ "This is an advanced NumPy topic, and important to designing your own calculations in a way for them to be as scalable and quick as possible. Please check out some of the following links to explore this topic further. We also suggest diving into label-based indexing and subsetting with [pandas](../pandas) and [xarray](../xarray), where some of this broadcasting can be simplified or have added context."
+ ]
+ },
+ {
+ "cell_type": "markdown",
+ "metadata": {},
+ "source": [
+ "### Resources and references\n",
+ "* [NumPy Broadcasting Documentation](https://docs.scipy.org/doc/numpy/user/basics.broadcasting.html)\n",
+ "* [NumPy Broadcasting Article](https://numpy.org/devdocs/user/theory.broadcasting.html)"
+ ]
+ }
+ ],
+ "metadata": {
+ "kernelspec": {
+ "display_name": "Python 3",
+ "language": "python",
+ "name": "python3"
+ },
+ "language_info": {
+ "codemirror_mode": {
+ "name": "ipython",
+ "version": 3
+ },
+ "file_extension": ".py",
+ "mimetype": "text/x-python",
+ "name": "python",
+ "nbconvert_exporter": "python",
+ "pygments_lexer": "ipython3",
+ "version": "3.8.10"
+ }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 4
+}
diff --git a/core/pandas/nino_analyzed_output.csv b/core/pandas/nino_analyzed_output.csv
deleted file mode 100644
index 1336c9ae0..000000000
--- a/core/pandas/nino_analyzed_output.csv
+++ /dev/null
@@ -1,473 +0,0 @@
-datetime,Nino12,Nino12anom,Nino3,Nino3anom,Nino4,Nino4anom,Nino34,Nino34anom,month,Nino34_degK
-1982-01-01,24.29,-0.17,25.87,0.24,28.3,0.0,26.72,0.15,1,299.87
-1982-02-01,25.49,-0.58,26.38,0.01,28.21,0.11,26.7,-0.02,2,299.84999999999997
-1982-03-01,25.21,-1.31,26.98,-0.16,28.41,0.22,27.2,-0.02,3,300.34999999999997
-1982-04-01,24.5,-0.97,27.68,0.18,28.92,0.42,28.02,0.24,4,301.16999999999996
-1982-05-01,23.97,-0.23,27.79,0.71,29.49,0.7,28.54,0.69,5,301.69
-1982-06-01,22.89,0.07,27.46,1.03,29.76,0.92,28.75,1.1,6,301.9
-1982-07-01,22.47,0.87,26.44,0.82,29.38,0.58,28.1,0.88,7,301.25
-1982-08-01,21.75,1.1,26.15,1.16,29.04,0.36,27.93,1.11,8,301.08
-1982-09-01,21.8,1.44,26.52,1.67,29.16,0.47,28.11,1.39,9,301.26
-1982-10-01,22.94,2.12,27.11,2.19,29.38,0.72,28.64,1.95,10,301.78999999999996
-1982-11-01,24.59,3.0,27.62,2.64,29.23,0.6,28.81,2.16,11,301.96
-1982-12-01,26.13,3.34,28.39,3.25,29.15,0.66,29.21,2.64,12,302.35999999999996
-1983-01-01,27.42,2.96,28.92,3.29,29.0,0.7,29.36,2.79,1,302.51
-1983-02-01,28.09,2.02,28.92,2.55,28.79,0.69,29.13,2.41,2,302.28
-1983-03-01,28.68,2.16,29.1,1.96,28.76,0.57,29.03,1.81,3,302.17999999999995
-1983-04-01,28.56,3.09,29.12,1.62,28.85,0.35,28.91,1.13,4,302.06
-1983-05-01,28.19,3.99,28.97,1.89,29.08,0.29,28.89,1.04,5,302.03999999999996
-1983-06-01,27.44,4.62,28.15,1.72,28.88,0.04,28.24,0.59,6,301.39
-1983-07-01,25.95,4.35,26.62,1.0,28.65,-0.15,27.07,-0.15,7,300.21999999999997
-1983-08-01,23.78,3.13,25.87,0.88,28.38,-0.3,26.53,-0.29,8,299.67999999999995
-1983-09-01,22.24,1.88,25.24,0.39,28.23,-0.46,26.44,-0.28,9,299.59
-1983-10-01,21.86,1.04,24.61,-0.31,27.75,-0.91,25.87,-0.82,10,299.02
-1983-11-01,21.9,0.31,24.17,-0.81,27.76,-0.87,25.58,-1.07,11,298.72999999999996
-1983-12-01,23.01,0.22,24.44,-0.7,27.82,-0.67,25.59,-0.98,12,298.73999999999995
-1984-01-01,24.18,-0.28,24.82,-0.81,27.64,-0.66,25.64,-0.93,1,298.78999999999996
-1984-02-01,25.18,-0.89,26.22,-0.15,27.25,-0.85,26.39,-0.33,2,299.53999999999996
-1984-03-01,26.0,-0.52,27.12,-0.02,27.21,-0.98,26.86,-0.36,3,300.01
-1984-04-01,25.16,-0.31,27.34,-0.16,27.7,-0.8,27.39,-0.39,4,300.53999999999996
-1984-05-01,23.23,-0.97,26.46,-0.62,27.95,-0.84,27.39,-0.46,5,300.53999999999996
-1984-06-01,21.96,-0.86,25.38,-1.05,28.13,-0.71,26.86,-0.79,6,300.01
-1984-07-01,21.24,-0.36,24.96,-0.66,28.35,-0.45,26.74,-0.48,7,299.89
-1984-08-01,20.17,-0.48,24.5,-0.49,28.17,-0.51,26.34,-0.48,8,299.48999999999995
-1984-09-01,20.37,0.01,24.35,-0.5,28.61,-0.08,26.43,-0.29,9,299.58
-1984-10-01,20.52,-0.3,23.95,-0.97,28.28,-0.38,25.93,-0.76,10,299.08
-1984-11-01,21.5,-0.09,24.03,-0.95,27.99,-0.64,25.41,-1.24,11,298.56
-1984-12-01,22.58,-0.21,23.7,-1.44,27.44,-1.05,25.0,-1.57,12,298.15
-1985-01-01,23.59,-0.87,24.51,-1.12,27.71,-0.59,25.43,-1.14,1,298.58
-1985-02-01,24.87,-1.2,25.19,-1.18,27.55,-0.55,25.67,-1.05,2,298.82
-1985-03-01,25.74,-0.78,26.11,-1.03,27.38,-0.81,26.23,-0.99,3,299.38
-1985-04-01,24.25,-1.22,26.52,-0.98,27.72,-0.78,26.8,-0.98,4,299.95
-1985-05-01,22.29,-1.91,26.12,-0.96,28.06,-0.73,27.11,-0.74,5,300.26
-1985-06-01,21.75,-1.07,25.6,-0.83,28.08,-0.76,26.86,-0.79,6,300.01
-1985-07-01,20.44,-1.16,24.74,-0.88,28.28,-0.52,26.69,-0.53,7,299.84
-1985-08-01,19.29,-1.36,24.4,-0.59,28.32,-0.36,26.5,-0.32,8,299.65
-1985-09-01,19.44,-0.92,24.15,-0.7,28.33,-0.36,26.25,-0.47,9,299.4
-1985-10-01,19.9,-0.92,24.15,-0.77,28.28,-0.38,26.19,-0.5,10,299.34
-1985-11-01,20.69,-0.9,24.28,-0.7,28.52,-0.11,26.19,-0.46,11,299.34
-1985-12-01,22.4,-0.39,24.29,-0.85,28.53,0.04,26.11,-0.46,12,299.26
-1986-01-01,24.61,0.15,24.73,-0.9,28.11,-0.19,25.79,-0.78,1,298.94
-1986-02-01,26.06,-0.01,25.81,-0.56,27.93,-0.17,25.94,-0.78,2,299.09
-1986-03-01,25.91,-0.61,26.84,-0.3,27.97,-0.22,26.65,-0.57,3,299.79999999999995
-1986-04-01,24.58,-0.89,27.17,-0.33,28.21,-0.29,27.44,-0.34,4,300.59
-1986-05-01,23.38,-0.82,26.68,-0.4,28.58,-0.21,27.5,-0.35,5,300.65
-1986-06-01,21.98,-0.84,26.3,-0.13,28.84,0.0,27.69,0.04,6,300.84
-1986-07-01,21.12,-0.48,25.7,0.08,28.9,0.1,27.37,0.15,7,300.52
-1986-08-01,20.97,0.32,25.02,0.03,29.04,0.36,27.15,0.33,8,300.29999999999995
-1986-09-01,20.44,0.08,25.25,0.4,29.18,0.49,27.33,0.61,9,300.47999999999996
-1986-10-01,21.07,0.25,25.62,0.7,29.38,0.72,27.57,0.88,10,300.71999999999997
-1986-11-01,22.03,0.44,25.92,0.94,29.4,0.77,27.73,1.08,11,300.88
-1986-12-01,23.0,0.21,25.86,0.72,29.19,0.7,27.7,1.13,12,300.84999999999997
-1987-01-01,25.3,0.84,26.69,1.06,29.02,0.72,27.91,1.34,1,301.06
-1987-02-01,27.14,1.07,27.42,1.05,28.93,0.83,28.02,1.3,2,301.16999999999996
-1987-03-01,28.01,1.49,28.2,1.06,29.04,0.85,28.47,1.25,3,301.62
-1987-04-01,27.17,1.7,28.49,0.99,29.21,0.71,28.8,1.02,4,301.95
-1987-05-01,25.58,1.38,28.22,1.14,29.25,0.46,28.75,0.9,5,301.9
-1987-06-01,24.06,1.24,27.71,1.28,29.53,0.69,29.03,1.38,6,302.17999999999995
-1987-07-01,22.78,1.18,27.07,1.45,29.47,0.67,28.8,1.58,7,301.95
-1987-08-01,21.73,1.08,26.52,1.53,29.41,0.73,28.58,1.76,8,301.72999999999996
-1987-09-01,21.45,1.09,26.57,1.72,29.51,0.82,28.39,1.67,9,301.53999999999996
-1987-10-01,22.39,1.57,26.2,1.28,29.61,0.95,28.07,1.38,10,301.21999999999997
-1987-11-01,22.63,1.04,26.13,1.15,29.8,1.17,27.99,1.34,11,301.14
-1987-12-01,23.47,0.68,26.2,1.06,29.44,0.95,27.6,1.03,12,300.75
-1988-01-01,24.64,0.18,26.12,0.49,29.13,0.83,27.32,0.75,1,300.46999999999997
-1988-02-01,25.74,-0.33,26.55,0.18,28.69,0.59,27.22,0.5,2,300.37
-1988-03-01,25.78,-0.74,27.14,0.0,28.2,0.01,27.31,0.09,3,300.46
-1988-04-01,24.54,-0.93,26.73,-0.77,28.15,-0.35,27.32,-0.46,4,300.46999999999997
-1988-05-01,23.6,-0.6,25.22,-1.86,28.36,-0.43,26.48,-1.37,5,299.63
-1988-06-01,21.27,-1.55,24.46,-1.97,28.13,-0.71,26.11,-1.54,6,299.26
-1988-07-01,20.26,-1.34,23.71,-1.91,27.88,-0.92,25.57,-1.65,7,298.71999999999997
-1988-08-01,19.12,-1.53,23.37,-1.62,27.68,-1.0,25.24,-1.58,8,298.39
-1988-09-01,19.19,-1.17,23.61,-1.24,27.63,-1.06,25.43,-1.29,9,298.58
-1988-10-01,19.5,-1.32,23.17,-1.75,27.06,-1.6,24.62,-2.07,10,297.77
-1988-11-01,20.55,-1.04,23.03,-1.95,26.76,-1.87,24.27,-2.38,11,297.41999999999996
-1988-12-01,21.8,-0.99,23.07,-2.07,26.75,-1.74,24.33,-2.24,12,297.47999999999996
-1989-01-01,24.09,-0.37,24.15,-1.48,26.54,-1.76,24.53,-2.04,1,297.67999999999995
-1989-02-01,26.26,0.19,25.61,-0.76,26.55,-1.55,25.33,-1.39,2,298.47999999999996
-1989-03-01,26.66,0.14,26.02,-1.12,27.0,-1.19,25.9,-1.32,3,299.04999999999995
-1989-04-01,25.63,0.16,26.67,-0.83,27.54,-0.96,26.69,-1.09,4,299.84
-1989-05-01,23.18,-1.02,26.37,-0.71,28.14,-0.65,27.09,-0.76,5,300.23999999999995
-1989-06-01,22.0,-0.82,26.08,-0.35,27.94,-0.9,26.98,-0.67,6,300.13
-1989-07-01,21.12,-0.48,25.28,-0.34,28.2,-0.6,26.74,-0.48,7,299.89
-1989-08-01,20.32,-0.33,24.56,-0.43,28.14,-0.54,26.33,-0.49,8,299.47999999999996
-1989-09-01,19.87,-0.49,24.45,-0.4,28.25,-0.44,26.25,-0.47,9,299.4
-1989-10-01,20.33,-0.49,24.49,-0.43,28.39,-0.27,26.26,-0.43,10,299.40999999999997
-1989-11-01,21.31,-0.28,24.56,-0.42,28.23,-0.4,26.24,-0.41,11,299.39
-1989-12-01,22.19,-0.6,24.71,-0.43,28.52,0.03,26.38,-0.19,12,299.53
-1990-01-01,24.02,-0.44,25.34,-0.29,28.56,0.26,26.55,-0.02,1,299.7
-1990-02-01,25.88,-0.19,26.37,0.0,28.62,0.52,26.95,0.23,2,300.09999999999997
-1990-03-01,26.16,-0.36,27.03,-0.11,28.78,0.59,27.46,0.24,3,300.60999999999996
-1990-04-01,25.22,-0.25,27.67,0.17,28.93,0.43,28.02,0.24,4,301.16999999999996
-1990-05-01,24.05,-0.15,27.35,0.27,28.96,0.17,28.06,0.21,5,301.21
-1990-06-01,22.68,-0.14,26.45,0.02,28.94,0.1,27.58,-0.07,6,300.72999999999996
-1990-07-01,21.0,-0.6,25.45,-0.17,28.98,0.18,27.25,0.03,7,300.4
-1990-08-01,20.25,-0.4,25.06,0.07,29.17,0.49,27.05,0.23,8,300.2
-1990-09-01,20.13,-0.23,24.85,0.0,29.04,0.35,26.75,0.03,9,299.9
-1990-10-01,20.28,-0.54,24.9,-0.02,29.23,0.57,26.98,0.29,10,300.13
-1990-11-01,20.84,-0.75,24.82,-0.16,29.06,0.43,26.72,0.07,11,299.87
-1990-12-01,22.45,-0.34,25.08,-0.06,29.14,0.65,26.91,0.34,12,300.06
-1991-01-01,23.86,-0.6,25.65,0.02,29.0,0.7,27.01,0.44,1,300.15999999999997
-1991-02-01,25.97,-0.1,26.27,-0.1,28.73,0.63,26.93,0.21,2,300.08
-1991-03-01,26.51,-0.01,26.99,-0.15,28.64,0.45,27.25,0.03,3,300.4
-1991-04-01,24.99,-0.48,27.32,-0.18,29.13,0.63,27.98,0.2,4,301.13
-1991-05-01,24.37,0.17,27.58,0.5,29.42,0.63,28.35,0.5,5,301.5
-1991-06-01,23.05,0.23,27.34,0.91,29.35,0.51,28.36,0.71,6,301.51
-1991-07-01,22.05,0.45,26.57,0.95,29.26,0.46,27.92,0.7,7,301.07
-1991-08-01,21.08,0.43,25.47,0.48,29.25,0.57,27.44,0.62,8,300.59
-1991-09-01,20.75,0.39,25.05,0.2,29.19,0.5,27.07,0.35,9,300.21999999999997
-1991-10-01,21.13,0.31,25.6,0.68,29.44,0.78,27.63,0.94,10,300.78
-1991-11-01,22.18,0.59,25.98,1.0,29.45,0.82,27.86,1.21,11,301.01
-1991-12-01,23.43,0.64,26.52,1.38,29.45,0.96,28.37,1.8,12,301.52
-1992-01-01,24.83,0.37,27.0,1.37,29.06,0.76,28.41,1.84,1,301.56
-1992-02-01,26.68,0.61,27.67,1.3,29.02,0.92,28.63,1.91,2,301.78
-1992-03-01,27.76,1.24,28.33,1.19,29.08,0.89,28.83,1.61,3,301.97999999999996
-1992-04-01,27.68,2.21,28.72,1.22,29.42,0.92,29.14,1.36,4,302.28999999999996
-1992-05-01,26.31,2.11,28.43,1.35,29.46,0.67,28.99,1.14,5,302.14
-1992-06-01,23.82,1.0,26.66,0.23,29.31,0.47,28.02,0.37,6,301.16999999999996
-1992-07-01,21.95,0.35,25.53,-0.09,29.35,0.55,27.53,0.31,7,300.67999999999995
-1992-08-01,20.55,-0.1,24.7,-0.29,28.9,0.22,26.64,-0.18,8,299.78999999999996
-1992-09-01,20.06,-0.3,24.52,-0.33,28.79,0.1,26.48,-0.24,9,299.63
-1992-10-01,20.82,0.0,24.62,-0.3,28.69,0.03,26.34,-0.35,10,299.48999999999995
-1992-11-01,21.49,-0.1,24.79,-0.19,28.7,0.07,26.51,-0.14,11,299.65999999999997
-1992-12-01,22.48,-0.31,25.01,-0.13,28.76,0.27,26.73,0.16,12,299.88
-1993-01-01,24.43,-0.03,25.56,-0.07,28.6,0.3,26.69,0.12,1,299.84
-1993-02-01,26.49,0.42,26.61,0.24,28.41,0.31,26.97,0.25,2,300.12
-1993-03-01,27.17,0.65,27.54,0.4,28.6,0.41,27.66,0.44,3,300.81
-1993-04-01,26.44,0.97,28.45,0.95,28.93,0.43,28.59,0.81,4,301.73999999999995
-1993-05-01,25.15,0.95,28.16,1.08,29.13,0.34,28.82,0.97,5,301.96999999999997
-1993-06-01,23.76,0.94,27.11,0.68,29.17,0.33,28.28,0.63,6,301.42999999999995
-1993-07-01,22.06,0.46,25.77,0.15,29.2,0.4,27.55,0.33,7,300.7
-1993-08-01,21.05,0.4,24.93,-0.06,28.94,0.26,26.84,0.02,8,299.98999999999995
-1993-09-01,20.83,0.47,24.97,0.12,29.07,0.38,26.92,0.2,9,300.07
-1993-10-01,20.99,0.17,25.21,0.29,28.9,0.24,26.93,0.24,10,300.08
-1993-11-01,21.64,0.05,25.17,0.19,28.97,0.34,26.91,0.26,11,300.06
-1993-12-01,22.75,-0.04,25.32,0.18,28.9,0.41,26.76,0.19,12,299.90999999999997
-1994-01-01,24.32,-0.14,25.71,0.08,28.47,0.17,26.6,0.03,1,299.75
-1994-02-01,25.79,-0.28,26.07,-0.3,28.07,-0.03,26.59,-0.13,2,299.73999999999995
-1994-03-01,25.43,-1.09,26.89,-0.25,28.26,0.07,27.27,0.05,3,300.41999999999996
-1994-04-01,24.32,-1.15,27.06,-0.44,28.62,0.12,27.9,0.12,4,301.04999999999995
-1994-05-01,23.22,-0.98,26.97,-0.11,29.0,0.21,28.04,0.19,5,301.19
-1994-06-01,22.43,-0.39,26.5,0.07,29.18,0.34,27.99,0.34,6,301.14
-1994-07-01,21.21,-0.39,25.19,-0.43,29.4,0.6,27.35,0.13,7,300.5
-1994-08-01,19.7,-0.95,24.71,-0.28,29.46,0.78,27.35,0.53,8,300.5
-1994-09-01,20.16,-0.2,24.81,-0.04,29.23,0.54,27.0,0.28,9,300.15
-1994-10-01,21.53,0.71,25.53,0.61,29.45,0.79,27.49,0.8,10,300.64
-1994-11-01,22.41,0.82,25.87,0.89,29.63,1.0,27.87,1.22,11,301.02
-1994-12-01,23.61,0.82,26.07,0.93,29.5,1.01,27.87,1.3,12,301.02
-1995-01-01,25.33,0.87,26.34,0.71,29.2,0.9,27.55,0.98,1,300.7
-1995-02-01,26.43,0.36,26.87,0.5,29.01,0.91,27.45,0.73,2,300.59999999999997
-1995-03-01,26.12,-0.4,27.08,-0.06,28.96,0.77,27.63,0.41,3,300.78
-1995-04-01,24.47,-1.0,27.1,-0.4,28.89,0.39,27.93,0.15,4,301.08
-1995-05-01,23.1,-1.1,26.4,-0.68,29.15,0.36,27.73,-0.12,5,300.88
-1995-06-01,22.45,-0.37,26.2,-0.23,29.01,0.17,27.59,-0.06,6,300.73999999999995
-1995-07-01,21.23,-0.37,25.42,-0.2,28.78,-0.02,27.01,-0.21,7,300.15999999999997
-1995-08-01,20.01,-0.64,24.33,-0.66,28.43,-0.25,26.33,-0.49,8,299.47999999999996
-1995-09-01,20.17,-0.19,24.02,-0.83,28.25,-0.44,25.96,-0.76,9,299.10999999999996
-1995-10-01,20.15,-0.67,24.01,-0.91,28.07,-0.59,25.67,-1.02,10,298.82
-1995-11-01,21.2,-0.39,24.03,-0.95,27.97,-0.66,25.66,-0.99,11,298.81
-1995-12-01,22.02,-0.77,24.19,-0.95,28.07,-0.42,25.57,-1.0,12,298.71999999999997
-1996-01-01,23.84,-0.62,24.96,-0.67,27.92,-0.38,25.74,-0.83,1,298.89
-1996-02-01,25.71,-0.36,25.72,-0.65,27.57,-0.53,25.85,-0.87,2,299.0
-1996-03-01,26.09,-0.43,26.71,-0.43,27.71,-0.48,26.62,-0.6,3,299.77
-1996-04-01,23.85,-1.62,26.72,-0.78,28.07,-0.43,27.36,-0.42,4,300.51
-1996-05-01,22.89,-1.31,26.33,-0.75,28.43,-0.36,27.37,-0.48,5,300.52
-1996-06-01,21.56,-1.26,25.89,-0.54,28.55,-0.29,27.32,-0.33,6,300.46999999999997
-1996-07-01,20.02,-1.58,25.35,-0.27,28.5,-0.3,27.09,-0.13,7,300.23999999999995
-1996-08-01,19.53,-1.12,24.6,-0.39,28.54,-0.14,26.56,-0.26,8,299.71
-1996-09-01,19.24,-1.12,24.37,-0.48,28.43,-0.26,26.35,-0.37,9,299.5
-1996-10-01,19.95,-0.87,24.37,-0.55,28.42,-0.24,26.24,-0.45,10,299.39
-1996-11-01,20.26,-1.33,24.38,-0.6,28.33,-0.3,26.19,-0.46,11,299.34
-1996-12-01,21.61,-1.18,24.2,-0.94,28.44,-0.05,26.02,-0.55,12,299.16999999999996
-1997-01-01,23.67,-0.79,24.7,-0.93,28.41,0.11,25.96,-0.61,1,299.10999999999996
-1997-02-01,25.74,-0.33,25.75,-0.62,28.33,0.23,26.36,-0.36,2,299.51
-1997-03-01,26.95,0.43,26.98,-0.16,28.52,0.33,27.03,-0.19,3,300.17999999999995
-1997-04-01,26.64,1.17,27.59,0.09,29.32,0.82,28.03,0.25,4,301.17999999999995
-1997-05-01,26.71,2.51,28.06,0.98,29.45,0.66,28.6,0.75,5,301.75
-1997-06-01,26.27,3.45,28.14,1.71,29.4,0.56,28.94,1.29,6,302.09
-1997-07-01,25.59,3.99,28.01,2.39,29.5,0.7,28.92,1.7,7,302.07
-1997-08-01,24.8,4.15,27.84,2.85,29.26,0.58,28.84,2.02,8,301.98999999999995
-1997-09-01,24.4,4.04,27.84,2.99,29.32,0.63,28.93,2.21,9,302.08
-1997-10-01,24.58,3.76,28.17,3.25,29.32,0.66,29.23,2.54,10,302.38
-1997-11-01,25.63,4.04,28.55,3.57,29.49,0.86,29.32,2.67,11,302.46999999999997
-1997-12-01,26.92,4.13,28.76,3.62,29.32,0.83,29.26,2.69,12,302.40999999999997
-1998-01-01,28.22,3.76,28.94,3.31,29.01,0.71,29.1,2.53,1,302.25
-1998-02-01,28.98,2.91,28.93,2.56,28.87,0.77,28.86,2.14,2,302.01
-1998-03-01,29.15,2.63,29.14,2.0,28.65,0.46,28.67,1.45,3,301.82
-1998-04-01,28.61,3.14,29.09,1.59,28.53,0.03,28.56,0.78,4,301.71
-1998-05-01,27.69,3.49,28.17,1.09,28.71,-0.08,28.47,0.62,5,301.62
-1998-06-01,25.18,2.36,26.0,-0.43,28.61,-0.23,26.72,-0.93,6,299.87
-1998-07-01,23.43,1.83,25.24,-0.38,28.07,-0.73,25.94,-1.28,7,299.09
-1998-08-01,21.77,1.12,24.63,-0.36,27.77,-0.91,25.49,-1.33,8,298.64
-1998-09-01,20.87,0.51,24.19,-0.66,27.88,-0.81,25.61,-1.11,9,298.76
-1998-10-01,21.16,0.34,24.06,-0.86,27.33,-1.33,25.34,-1.35,10,298.48999999999995
-1998-11-01,21.43,-0.16,24.11,-0.87,27.23,-1.4,25.18,-1.47,11,298.33
-1998-12-01,22.56,-0.23,23.86,-1.28,27.11,-1.38,24.79,-1.78,12,297.94
-1999-01-01,23.73,-0.73,24.41,-1.22,26.59,-1.71,24.9,-1.67,1,298.04999999999995
-1999-02-01,25.64,-0.43,25.57,-0.8,26.52,-1.58,25.41,-1.31,2,298.56
-1999-03-01,26.62,0.1,26.67,-0.47,26.9,-1.29,26.25,-0.97,3,299.4
-1999-04-01,24.3,-1.17,26.66,-0.84,27.35,-1.15,26.84,-0.94,4,299.98999999999995
-1999-05-01,23.46,-0.74,26.44,-0.64,27.87,-0.92,26.97,-0.88,5,300.12
-1999-06-01,21.83,-0.99,25.59,-0.84,28.01,-0.83,26.6,-1.05,6,299.75
-1999-07-01,20.44,-1.16,24.85,-0.77,27.92,-0.88,26.35,-0.87,7,299.5
-1999-08-01,19.75,-0.9,24.02,-0.97,27.73,-0.95,25.59,-1.23,8,298.73999999999995
-1999-09-01,19.23,-1.13,23.72,-1.13,27.82,-0.87,25.71,-1.01,9,298.85999999999996
-1999-10-01,20.05,-0.77,23.75,-1.17,27.85,-0.81,25.64,-1.05,10,298.78999999999996
-1999-11-01,20.51,-1.08,23.46,-1.52,27.56,-1.07,25.12,-1.53,11,298.27
-1999-12-01,21.72,-1.07,23.54,-1.6,27.23,-1.26,24.9,-1.67,12,298.04999999999995
-2000-01-01,23.86,-0.6,23.88,-1.75,26.96,-1.34,24.65,-1.92,1,297.79999999999995
-2000-02-01,25.71,-0.36,25.31,-1.06,26.66,-1.44,25.19,-1.53,2,298.34
-2000-03-01,26.19,-0.33,26.61,-0.53,26.76,-1.43,26.08,-1.14,3,299.22999999999996
-2000-04-01,25.84,0.37,27.46,-0.04,27.37,-1.13,27.01,-0.77,4,300.15999999999997
-2000-05-01,24.1,-0.1,26.8,-0.28,27.81,-0.98,27.12,-0.73,5,300.27
-2000-06-01,22.25,-0.57,25.84,-0.59,28.11,-0.73,27.03,-0.62,6,300.17999999999995
-2000-07-01,20.59,-1.01,25.13,-0.49,28.2,-0.6,26.72,-0.5,7,299.87
-2000-08-01,20.1,-0.55,24.47,-0.52,28.32,-0.36,26.45,-0.37,8,299.59999999999997
-2000-09-01,19.94,-0.42,24.35,-0.5,28.44,-0.25,26.21,-0.51,9,299.35999999999996
-2000-10-01,20.37,-0.45,24.41,-0.51,28.17,-0.49,25.96,-0.73,10,299.10999999999996
-2000-11-01,20.6,-0.99,24.17,-0.81,28.09,-0.54,25.78,-0.87,11,298.92999999999995
-2000-12-01,22.22,-0.57,24.43,-0.71,27.6,-0.89,25.59,-0.98,12,298.73999999999995
-2001-01-01,23.88,-0.58,24.99,-0.64,27.5,-0.8,25.74,-0.83,1,298.89
-2001-02-01,25.91,-0.16,26.06,-0.31,27.27,-0.83,26.11,-0.61,2,299.26
-2001-03-01,27.44,0.92,27.23,0.09,27.62,-0.57,26.84,-0.38,3,299.98999999999995
-2001-04-01,26.69,1.22,27.52,0.02,28.19,-0.31,27.52,-0.26,4,300.66999999999996
-2001-05-01,23.77,-0.43,26.89,-0.19,28.64,-0.15,27.6,-0.25,5,300.75
-2001-06-01,21.74,-1.08,26.35,-0.08,28.83,-0.01,27.68,0.03,6,300.83
-2001-07-01,20.88,-0.72,25.43,-0.19,29.06,0.26,27.32,0.1,7,300.46999999999997
-2001-08-01,19.9,-0.75,24.72,-0.27,28.96,0.28,26.87,0.05,8,300.02
-2001-09-01,19.39,-0.97,24.27,-0.58,29.14,0.45,26.55,-0.17,9,299.7
-2001-10-01,19.52,-1.3,24.45,-0.47,29.01,0.35,26.59,-0.1,10,299.73999999999995
-2001-11-01,20.49,-1.1,24.35,-0.63,28.96,0.33,26.45,-0.2,11,299.59999999999997
-2001-12-01,21.96,-0.83,24.6,-0.54,28.6,0.11,26.17,-0.4,12,299.32
-2002-01-01,23.64,-0.82,25.09,-0.54,28.81,0.51,26.5,-0.07,1,299.65
-2002-02-01,26.06,-0.01,26.21,-0.16,28.76,0.66,26.95,0.23,2,300.09999999999997
-2002-03-01,27.53,1.01,27.22,0.08,28.68,0.49,27.32,0.1,3,300.46999999999997
-2002-04-01,26.53,1.06,27.56,0.06,29.09,0.59,27.94,0.16,4,301.09
-2002-05-01,24.8,0.6,27.24,0.16,29.45,0.66,28.15,0.3,5,301.29999999999995
-2002-06-01,22.67,-0.15,27.06,0.63,29.63,0.79,28.43,0.78,6,301.58
-2002-07-01,21.01,-0.59,26.03,0.41,29.49,0.69,27.98,0.76,7,301.13
-2002-08-01,19.94,-0.71,25.47,0.48,29.4,0.72,27.79,0.97,8,300.94
-2002-09-01,19.89,-0.47,25.54,0.69,29.44,0.75,27.83,1.11,9,300.97999999999996
-2002-10-01,21.16,0.34,25.85,0.93,29.56,0.9,28.05,1.36,10,301.2
-2002-11-01,22.25,0.66,26.37,1.39,29.83,1.2,28.27,1.62,11,301.41999999999996
-2002-12-01,23.44,0.65,26.48,1.34,29.49,1.0,28.09,1.52,12,301.23999999999995
-2003-01-01,24.38,-0.08,26.38,0.75,29.25,0.95,27.76,1.19,1,300.90999999999997
-2003-02-01,25.81,-0.26,26.7,0.33,29.03,0.93,27.49,0.77,2,300.64
-2003-03-01,25.97,-0.55,27.28,0.14,29.03,0.84,27.81,0.59,3,300.96
-2003-04-01,24.44,-1.03,27.15,-0.35,28.96,0.46,27.81,0.03,4,300.96
-2003-05-01,22.49,-1.71,26.14,-0.94,28.92,0.13,27.37,-0.48,5,300.52
-2003-06-01,21.58,-1.24,25.83,-0.6,29.09,0.25,27.48,-0.17,6,300.63
-2003-07-01,20.75,-0.85,25.75,0.13,29.11,0.31,27.43,0.21,7,300.58
-2003-08-01,20.14,-0.51,25.04,0.05,29.05,0.37,26.85,0.03,8,300.0
-2003-09-01,20.0,-0.36,24.97,0.12,29.02,0.33,26.96,0.24,9,300.10999999999996
-2003-10-01,20.99,0.17,25.33,0.41,29.22,0.56,27.19,0.5,10,300.34
-2003-11-01,21.92,0.33,25.4,0.42,29.31,0.68,27.05,0.4,11,300.2
-2003-12-01,22.99,0.2,25.56,0.42,29.02,0.53,26.89,0.32,12,300.03999999999996
-2004-01-01,24.6,0.14,25.92,0.29,28.83,0.53,26.74,0.17,1,299.89
-2004-02-01,25.81,-0.26,26.46,0.09,28.59,0.49,26.86,0.14,2,300.01
-2004-03-01,25.94,-0.58,27.16,0.02,28.43,0.24,27.1,-0.12,3,300.25
-2004-04-01,25.32,-0.15,27.37,-0.13,28.75,0.25,27.84,0.06,4,300.98999999999995
-2004-05-01,23.05,-1.15,26.72,-0.36,29.16,0.37,28.06,0.21,5,301.21
-2004-06-01,21.6,-1.22,26.27,-0.16,29.17,0.33,27.76,0.11,6,300.90999999999997
-2004-07-01,20.71,-0.89,25.41,-0.21,29.39,0.59,27.69,0.47,7,300.84
-2004-08-01,19.62,-1.03,25.05,0.06,29.31,0.63,27.54,0.72,8,300.69
-2004-09-01,20.07,-0.29,25.17,0.32,29.6,0.91,27.47,0.75,9,300.62
-2004-10-01,20.92,0.1,25.32,0.4,29.55,0.89,27.38,0.69,10,300.53
-2004-11-01,21.96,0.37,25.46,0.48,29.57,0.94,27.31,0.66,11,300.46
-2004-12-01,22.94,0.15,25.77,0.63,29.4,0.91,27.31,0.74,12,300.46
-2005-01-01,24.47,0.01,25.89,0.26,29.21,0.91,27.1,0.53,1,300.25
-2005-02-01,25.49,-0.58,26.2,-0.17,28.83,0.73,26.96,0.24,2,300.10999999999996
-2005-03-01,25.6,-0.92,27.01,-0.13,28.91,0.72,27.55,0.33,3,300.7
-2005-04-01,24.9,-0.57,27.77,0.27,28.96,0.46,28.07,0.29,4,301.21999999999997
-2005-05-01,24.4,0.2,27.48,0.4,29.18,0.39,28.2,0.35,5,301.34999999999997
-2005-06-01,22.47,-0.35,26.81,0.38,29.18,0.34,28.05,0.4,6,301.2
-2005-07-01,21.18,-0.42,25.93,0.31,29.05,0.25,27.47,0.25,7,300.62
-2005-08-01,20.61,-0.04,25.19,0.2,28.86,0.18,26.88,0.06,8,300.03
-2005-09-01,19.71,-0.65,24.57,-0.28,28.84,0.15,26.63,-0.09,9,299.78
-2005-10-01,19.72,-1.1,24.69,-0.23,28.89,0.23,26.75,0.06,10,299.9
-2005-11-01,20.62,-0.97,24.28,-0.7,28.67,0.04,26.34,-0.31,11,299.48999999999995
-2005-12-01,22.29,-0.5,24.28,-0.86,28.35,-0.14,25.89,-0.68,12,299.03999999999996
-2006-01-01,24.33,-0.13,25.0,-0.63,27.68,-0.62,25.64,-0.93,1,298.78999999999996
-2006-02-01,26.46,0.39,26.08,-0.29,27.39,-0.71,26.08,-0.64,2,299.22999999999996
-2006-03-01,26.77,0.25,26.54,-0.6,27.79,-0.4,26.57,-0.65,3,299.71999999999997
-2006-04-01,24.15,-1.32,27.25,-0.25,28.38,-0.12,27.59,-0.19,4,300.73999999999995
-2006-05-01,23.86,-0.34,27.04,-0.04,28.91,0.12,27.91,0.06,5,301.06
-2006-06-01,22.77,-0.05,26.44,0.01,29.15,0.31,27.85,0.2,6,301.0
-2006-07-01,22.19,0.59,25.8,0.18,29.07,0.27,27.35,0.13,7,300.5
-2006-08-01,21.59,0.94,25.45,0.46,29.22,0.54,27.22,0.4,8,300.37
-2006-09-01,21.43,1.07,25.74,0.89,29.4,0.71,27.34,0.62,9,300.48999999999995
-2006-10-01,22.22,1.4,25.96,1.04,29.44,0.78,27.47,0.78,10,300.62
-2006-11-01,22.69,1.1,26.07,1.09,29.63,1.0,27.73,1.08,11,300.88
-2006-12-01,23.45,0.66,26.36,1.22,29.47,0.98,27.76,1.19,12,300.90999999999997
-2007-01-01,24.99,0.53,26.5,0.87,28.93,0.63,27.26,0.69,1,300.40999999999997
-2007-02-01,26.24,0.17,26.45,0.08,28.62,0.52,26.81,0.09,2,299.96
-2007-03-01,25.74,-0.78,26.79,-0.35,28.57,0.38,27.18,-0.04,3,300.33
-2007-04-01,24.3,-1.17,27.13,-0.37,28.7,0.2,27.78,0.0,4,300.92999999999995
-2007-05-01,22.73,-1.47,26.35,-0.73,28.86,0.07,27.57,-0.28,5,300.71999999999997
-2007-06-01,21.59,-1.23,25.83,-0.6,28.98,0.14,27.55,-0.1,6,300.7
-2007-07-01,20.27,-1.33,24.79,-0.83,28.81,0.01,26.79,-0.43,7,299.94
-2007-08-01,19.16,-1.49,23.86,-1.13,28.58,-0.1,26.2,-0.62,8,299.34999999999997
-2007-09-01,18.57,-1.79,23.52,-1.33,28.12,-0.57,25.77,-0.95,9,298.91999999999996
-2007-10-01,18.8,-2.02,23.36,-1.56,27.86,-0.8,25.22,-1.47,10,298.37
-2007-11-01,19.49,-2.1,23.17,-1.81,27.42,-1.21,25.06,-1.59,11,298.21
-2007-12-01,21.02,-1.77,23.59,-1.55,27.3,-1.19,24.97,-1.6,12,298.12
-2008-01-01,23.86,-0.6,24.13,-1.5,26.62,-1.68,24.71,-1.86,1,297.85999999999996
-2008-02-01,26.32,0.25,25.05,-1.32,26.43,-1.67,24.83,-1.89,2,297.97999999999996
-2008-03-01,27.3,0.78,26.56,-0.58,26.84,-1.35,26.07,-1.15,3,299.21999999999997
-2008-04-01,25.89,0.42,27.18,-0.32,27.44,-1.06,26.83,-0.95,4,299.97999999999996
-2008-05-01,24.41,0.21,27.08,0.0,27.9,-0.89,27.18,-0.67,5,300.33
-2008-06-01,23.55,0.73,26.53,0.1,28.08,-0.76,27.17,-0.48,6,300.32
-2008-07-01,22.63,1.03,26.12,0.5,28.25,-0.55,27.19,-0.03,7,300.34
-2008-08-01,21.79,1.14,25.63,0.64,28.18,-0.5,26.85,0.03,8,300.0
-2008-09-01,21.19,0.83,25.09,0.24,28.14,-0.55,26.44,-0.28,9,299.59
-2008-10-01,20.75,-0.07,24.79,-0.13,28.29,-0.37,26.33,-0.36,10,299.47999999999996
-2008-11-01,21.44,-0.15,24.75,-0.23,28.08,-0.55,26.3,-0.35,11,299.45
-2008-12-01,22.43,-0.36,24.6,-0.54,27.72,-0.77,25.74,-0.83,12,298.89
-2009-01-01,24.42,-0.1,25.03,-0.6,27.42,-0.88,25.54,-1.03,1,298.69
-2009-02-01,26.03,-0.11,25.85,-0.52,27.37,-0.73,26.04,-0.68,2,299.19
-2009-03-01,26.38,-0.26,26.44,-0.7,27.79,-0.4,26.67,-0.55,3,299.82
-2009-04-01,25.98,0.37,27.39,-0.11,28.37,-0.13,27.5,-0.27,4,300.65
-2009-05-01,24.83,0.56,27.4,0.32,28.99,0.2,28.03,0.18,5,301.17999999999995
-2009-06-01,23.73,0.85,27.12,0.69,29.2,0.36,28.11,0.47,6,301.26
-2009-07-01,22.63,1.02,26.56,0.94,29.21,0.4,27.94,0.72,7,301.09
-2009-08-01,21.64,1.0,25.94,0.95,29.21,0.53,27.53,0.71,8,300.67999999999995
-2009-09-01,20.82,0.47,25.66,0.8,29.28,0.58,27.47,0.75,9,300.62
-2009-10-01,20.96,0.17,25.73,0.81,29.65,0.99,27.63,0.94,10,300.78
-2009-11-01,22.11,0.51,26.23,1.26,29.88,1.25,28.19,1.54,11,301.34
-2009-12-01,23.16,0.35,26.67,1.53,29.67,1.18,28.3,1.72,12,301.45
-2010-01-01,24.82,0.3,26.63,1.0,29.51,1.21,28.07,1.5,1,301.21999999999997
-2010-02-01,26.08,-0.06,27.12,0.75,29.1,1.0,27.94,1.22,2,301.09
-2010-03-01,26.24,-0.4,27.73,0.6,29.21,1.02,28.29,1.08,3,301.44
-2010-04-01,26.05,0.45,28.05,0.55,29.25,0.74,28.36,0.59,4,301.51
-2010-05-01,24.28,0.0,26.97,-0.11,29.03,0.24,27.68,-0.17,5,300.83
-2010-06-01,22.6,-0.27,25.75,-0.68,28.64,-0.21,27.0,-0.65,6,300.15
-2010-07-01,20.08,-1.54,24.53,-1.09,28.09,-0.71,26.09,-1.13,7,299.23999999999995
-2010-08-01,19.27,-1.37,23.87,-1.12,27.47,-1.2,25.5,-1.32,8,298.65
-2010-09-01,18.9,-1.44,23.59,-1.26,27.13,-1.56,25.07,-1.65,9,298.21999999999997
-2010-10-01,19.06,-1.73,23.25,-1.66,27.06,-1.6,25.01,-1.68,10,298.15999999999997
-2010-11-01,20.03,-1.56,23.4,-1.58,27.07,-1.57,25.07,-1.58,11,298.21999999999997
-2010-12-01,21.48,-1.34,23.5,-1.64,26.89,-1.6,24.95,-1.62,12,298.09999999999997
-2011-01-01,24.08,-0.44,24.31,-1.32,26.72,-1.58,24.93,-1.64,1,298.08
-2011-02-01,26.22,0.08,25.55,-0.82,26.95,-1.15,25.46,-1.27,2,298.60999999999996
-2011-03-01,26.21,-0.43,26.39,-0.75,27.42,-0.77,26.23,-0.98,3,299.38
-2011-04-01,25.76,0.16,27.18,-0.32,27.86,-0.64,27.02,-0.76,4,300.16999999999996
-2011-05-01,24.89,0.62,26.94,-0.14,28.28,-0.51,27.42,-0.43,5,300.57
-2011-06-01,23.72,0.85,26.54,0.1,28.47,-0.37,27.46,-0.18,6,300.60999999999996
-2011-07-01,22.07,0.45,25.61,-0.01,28.47,-0.33,26.96,-0.26,7,300.10999999999996
-2011-08-01,20.64,0.0,24.58,-0.42,28.32,-0.36,26.19,-0.64,8,299.34
-2011-09-01,19.75,-0.59,24.22,-0.63,28.05,-0.64,25.98,-0.74,9,299.13
-2011-10-01,20.19,-0.6,23.97,-0.95,27.94,-0.72,25.72,-0.97,10,298.87
-2011-11-01,20.79,-0.8,23.89,-1.09,27.87,-0.77,25.6,-1.05,11,298.75
-2011-12-01,21.85,-0.96,24.2,-0.94,27.39,-1.1,25.53,-1.04,12,298.67999999999995
-2012-01-01,23.88,-0.64,24.9,-0.73,27.09,-1.21,25.49,-1.08,1,298.64
-2012-02-01,26.3,0.16,26.19,-0.18,27.2,-0.9,26.03,-0.69,2,299.17999999999995
-2012-03-01,26.91,0.27,26.92,-0.21,27.53,-0.66,26.63,-0.58,3,299.78
-2012-04-01,26.9,1.3,27.58,0.08,28.16,-0.34,27.38,-0.39,4,300.53
-2012-05-01,25.48,1.2,27.23,0.15,28.53,-0.26,27.8,-0.05,5,300.95
-2012-06-01,24.47,1.59,27.09,0.66,28.73,-0.11,27.95,0.31,6,301.09999999999997
-2012-07-01,22.61,0.99,26.54,0.92,28.86,0.06,27.75,0.53,7,300.9
-2012-08-01,20.99,0.35,25.72,0.73,29.1,0.42,27.55,0.73,8,300.7
-2012-09-01,20.83,0.49,25.28,0.43,29.12,0.43,27.24,0.51,9,300.39
-2012-10-01,20.68,-0.11,24.93,0.01,29.16,0.5,26.98,0.29,10,300.13
-2012-11-01,21.21,-0.38,25.11,0.14,29.17,0.54,27.01,0.36,11,300.15999999999997
-2012-12-01,22.13,-0.68,24.91,-0.23,28.71,0.23,26.46,-0.11,12,299.60999999999996
-2013-01-01,24.0,-0.52,25.06,-0.57,28.28,-0.02,26.16,-0.41,1,299.31
-2013-02-01,25.74,-0.41,25.9,-0.46,28.06,-0.04,26.32,-0.4,2,299.46999999999997
-2013-03-01,26.71,0.07,27.21,0.07,27.95,-0.24,27.0,-0.22,3,300.15
-2013-04-01,24.74,-0.86,27.35,-0.15,28.47,-0.03,27.68,-0.1,4,300.83
-2013-05-01,22.89,-1.38,26.39,-0.69,28.71,-0.08,27.57,-0.27,5,300.71999999999997
-2013-06-01,21.48,-1.4,25.8,-0.64,28.76,-0.08,27.43,-0.21,6,300.58
-2013-07-01,20.29,-1.33,24.97,-0.66,28.76,-0.04,26.91,-0.31,7,300.06
-2013-08-01,19.66,-0.98,24.44,-0.55,28.71,0.03,26.54,-0.28,8,299.69
-2013-09-01,19.78,-0.57,24.72,-0.13,28.7,0.01,26.65,-0.07,9,299.79999999999995
-2013-10-01,20.16,-0.63,24.7,-0.21,28.7,0.04,26.36,-0.33,10,299.51
-2013-11-01,21.06,-0.54,24.81,-0.17,28.91,0.27,26.65,0.01,11,299.79999999999995
-2013-12-01,22.61,-0.2,25.1,-0.04,28.64,0.15,26.53,-0.04,12,299.67999999999995
-2014-01-01,24.79,0.27,25.26,-0.37,28.14,-0.17,26.06,-0.51,1,299.21
-2014-02-01,25.4,-0.75,25.56,-0.81,28.37,0.27,26.18,-0.55,2,299.33
-2014-03-01,25.86,-0.78,26.9,-0.24,28.71,0.52,26.99,-0.22,3,300.14
-2014-04-01,25.23,-0.37,27.73,0.23,29.13,0.63,28.01,0.24,4,301.15999999999997
-2014-05-01,25.57,1.3,27.69,0.61,29.56,0.77,28.31,0.46,5,301.46
-2014-06-01,24.51,1.64,27.32,0.89,29.43,0.59,28.11,0.46,6,301.26
-2014-07-01,22.98,1.36,26.27,0.65,29.09,0.29,27.4,0.18,7,300.54999999999995
-2014-08-01,21.91,1.27,25.51,0.52,29.14,0.46,27.02,0.2,8,300.16999999999996
-2014-09-01,21.3,0.96,25.31,0.45,29.34,0.65,27.17,0.45,9,300.32
-2014-10-01,21.54,0.75,25.58,0.66,29.31,0.64,27.17,0.49,10,300.32
-2014-11-01,22.33,0.74,25.88,0.91,29.52,0.88,27.5,0.85,11,300.65
-2014-12-01,22.9,0.08,25.94,0.8,29.4,0.91,27.35,0.78,12,300.5
-2015-01-01,24.13,-0.39,25.99,0.36,29.16,0.86,27.1,0.53,1,300.25
-2015-02-01,25.59,-0.55,26.55,0.18,29.12,1.02,27.29,0.56,2,300.44
-2015-03-01,26.69,0.06,27.29,0.15,29.32,1.13,27.79,0.58,3,300.94
-2015-04-01,26.95,1.35,28.17,0.67,29.73,1.23,28.56,0.78,4,301.71
-2015-05-01,26.71,2.43,28.28,1.19,29.88,1.09,28.88,1.03,5,302.03
-2015-06-01,25.42,2.54,28.1,1.66,29.93,1.09,28.96,1.32,6,302.10999999999996
-2015-07-01,24.48,2.87,27.79,2.17,29.8,1.0,28.82,1.6,7,301.96999999999997
-2015-08-01,22.88,2.24,27.33,2.34,29.66,0.98,28.89,2.07,8,302.03999999999996
-2015-09-01,22.91,2.57,27.48,2.63,29.73,1.04,29.0,2.28,9,302.15
-2015-10-01,23.31,2.52,27.58,2.66,29.79,1.12,29.15,2.46,10,302.29999999999995
-2015-11-01,23.83,2.24,27.91,2.93,30.3,1.67,29.6,2.95,11,302.75
-2015-12-01,25.01,2.19,27.99,2.85,30.11,1.63,29.39,2.82,12,302.53999999999996
-2016-01-01,25.93,1.41,28.21,2.58,29.65,1.35,29.17,2.6,1,302.32
-2016-02-01,26.81,0.67,28.36,1.99,29.55,1.45,29.12,2.4,2,302.27
-2016-03-01,27.57,0.93,28.7,1.57,29.53,1.34,28.9,1.68,3,302.04999999999995
-2016-04-01,25.83,0.23,28.34,0.84,29.39,0.89,28.87,1.09,4,302.02
-2016-05-01,24.55,0.27,27.11,0.03,29.39,0.6,28.15,0.3,5,301.29999999999995
-2016-06-01,23.17,0.29,26.31,-0.12,29.36,0.52,27.53,-0.12,6,300.67999999999995
-2016-07-01,21.79,0.17,25.14,-0.48,29.06,0.26,26.73,-0.49,7,299.88
-2016-08-01,21.03,0.39,24.53,-0.46,28.68,-0.0,26.28,-0.54,8,299.42999999999995
-2016-09-01,20.87,0.53,24.67,-0.18,28.48,-0.21,26.11,-0.61,9,299.26
-2016-10-01,21.18,0.39,24.47,-0.45,28.26,-0.4,25.96,-0.73,10,299.10999999999996
-2016-11-01,21.68,0.09,24.58,-0.4,28.27,-0.37,26.1,-0.55,11,299.25
-2016-12-01,23.35,0.53,24.78,-0.36,28.35,-0.14,26.16,-0.41,12,299.31
-2017-01-01,25.75,1.23,25.61,-0.02,28.18,-0.12,26.25,-0.32,1,299.4
-2017-02-01,27.76,1.62,27.0,0.63,28.03,-0.07,26.87,0.14,2,300.02
-2017-03-01,28.52,1.89,27.7,0.57,28.13,-0.06,27.34,0.13,3,300.48999999999995
-2017-04-01,26.53,0.93,28.09,0.59,28.65,0.15,28.1,0.32,4,301.25
-2017-05-01,25.06,0.78,27.6,0.51,29.08,0.29,28.3,0.46,5,301.45
-2017-06-01,22.98,0.1,26.73,0.3,29.39,0.55,28.19,0.55,6,301.34
-2017-07-01,21.54,-0.07,25.85,0.23,29.21,0.4,27.61,0.39,7,300.76
-2017-08-01,20.19,-0.45,24.82,-0.17,28.87,0.19,26.67,-0.15,8,299.82
-2017-09-01,19.67,-0.67,24.17,-0.68,28.69,0.0,26.29,-0.43,9,299.44
-2017-10-01,19.45,-1.34,24.28,-0.64,28.55,-0.11,26.23,-0.46,10,299.38
-2017-11-01,20.44,-1.15,23.92,-1.05,28.46,-0.18,25.79,-0.86,11,298.94
-2017-12-01,21.44,-1.38,24.05,-1.09,28.24,-0.25,25.8,-0.77,12,298.95
-2018-01-01,23.71,-0.81,24.48,-1.14,28.03,-0.27,25.82,-0.75,1,298.96999999999997
-2018-02-01,25.57,-0.57,25.36,-1.01,27.86,-0.24,25.83,-0.9,2,298.97999999999996
-2018-03-01,25.83,-0.8,26.37,-0.76,28.14,-0.05,26.48,-0.73,3,299.63
-2018-04-01,24.58,-1.02,27.12,-0.38,28.63,0.12,27.42,-0.36,4,300.57
-2018-05-01,23.73,-0.54,26.94,-0.15,29.01,0.22,27.72,-0.13,5,300.87
-2018-06-01,22.19,-0.69,26.72,0.29,29.16,0.32,27.85,0.2,6,301.0
-2018-07-01,21.43,-0.19,26.05,0.43,29.1,0.3,27.52,0.3,7,300.66999999999996
-2018-08-01,20.66,0.02,25.14,0.15,29.19,0.51,27.11,0.29,8,300.26
-2018-09-01,20.31,-0.03,25.2,0.35,29.17,0.48,27.1,0.38,9,300.25
-2018-10-01,21.23,0.43,25.78,0.86,29.61,0.95,27.55,0.86,10,300.7
-2018-11-01,22.27,0.68,26.02,1.05,29.59,0.95,27.64,0.99,11,300.78999999999996
-2018-12-01,23.6,0.78,26.12,0.98,29.52,1.03,27.53,0.96,12,300.67999999999995
-2019-01-01,25.1,0.57,26.17,0.55,29.0,0.65,27.08,0.52,1,300.22999999999996
-2019-02-01,26.45,0.24,26.91,0.55,29.06,0.86,27.41,0.68,2,300.56
-2019-03-01,26.8,0.07,27.89,0.71,29.1,0.77,28.22,0.95,3,301.37
-2019-04-01,25.68,-0.01,28.17,0.56,29.24,0.56,28.6,0.71,4,301.75
-2019-05-01,24.38,-0.08,27.69,0.51,29.58,0.63,28.57,0.62,5,301.71999999999997
-2019-06-01,22.62,-0.38,26.81,0.27,29.62,0.6,28.24,0.47,6,301.39
-2019-07-01,21.34,-0.33,25.68,-0.07,29.73,0.77,27.63,0.3,7,300.78
-2019-08-01,20.2,-0.46,24.89,-0.15,29.5,0.68,26.97,0.1,8,300.12
-2019-09-01,19.5,-0.83,24.61,-0.23,29.33,0.55,26.7,-0.03,9,299.84999999999997
-2019-10-01,20.02,-0.71,25.12,0.17,29.64,0.88,27.31,0.59,10,300.46
-2019-11-01,21.32,-0.22,25.46,0.41,29.47,0.7,27.26,0.51,11,300.40999999999997
-2019-12-01,23.16,0.4,25.47,0.26,29.5,0.91,27.07,0.43,12,300.21999999999997
-2020-01-01,24.55,0.02,25.81,0.2,29.28,0.93,27.09,0.53,1,300.23999999999995
-2020-02-01,26.56,0.35,26.61,0.24,29.17,0.98,27.14,0.41,2,300.28999999999996
-2020-03-01,27.11,0.38,27.43,0.24,29.22,0.89,27.82,0.55,3,300.96999999999997
-2020-04-01,26.0,0.32,28.01,0.4,29.29,0.61,28.32,0.44,4,301.46999999999997
-2020-05-01,24.24,-0.22,26.82,-0.36,28.94,-0.01,27.59,-0.36,5,300.73999999999995
-2020-06-01,22.13,-0.87,25.75,-0.79,29.07,0.05,27.3,-0.47,6,300.45
-2020-07-01,20.44,-1.23,25.08,-0.66,28.83,-0.13,26.89,-0.44,7,300.03999999999996
-2020-08-01,19.69,-0.97,24.42,-0.62,28.47,-0.35,26.18,-0.69,8,299.33
-2020-09-01,19.48,-0.85,23.58,-1.26,28.29,-0.49,25.77,-0.96,9,298.91999999999996
-2020-10-01,19.67,-1.07,23.61,-1.34,27.89,-0.87,25.3,-1.42,10,298.45
-2020-11-01,20.94,-0.61,23.82,-1.23,27.91,-0.86,25.34,-1.42,11,298.48999999999995
-2020-12-01,22.16,-0.6,24.38,-0.83,27.65,-0.95,25.53,-1.12,12,298.67999999999995
-2021-01-01,23.89,-0.64,25.06,-0.55,27.1,-1.25,25.58,-0.99,1,298.72999999999996
-2021-02-01,25.55,-0.66,25.8,-0.57,27.2,-1.0,25.81,-0.92,2,298.96
-2021-03-01,26.48,-0.26,26.8,-0.39,27.79,-0.55,26.75,-0.51,3,299.9
-2021-04-01,24.89,-0.8,26.96,-0.65,28.47,-0.21,27.4,-0.49,4,300.54999999999995
diff --git a/foundations/quickstart.ipynb b/foundations/quickstart.ipynb
index 56c9673c6..356529a27 100644
--- a/foundations/quickstart.ipynb
+++ b/foundations/quickstart.ipynb
@@ -617,14 +617,6 @@
"\n",
"Read on for more details on how to install and run Python and necessary packages on your own laptop."
]
- },
- {
- "cell_type": "code",
- "execution_count": null,
- "id": "67c42944",
- "metadata": {},
- "outputs": [],
- "source": []
}
],
"metadata": {
diff --git a/preamble/template.ipynb b/preamble/template.ipynb
index ad912f32f..655e97551 100644
--- a/preamble/template.ipynb
+++ b/preamble/template.ipynb
@@ -34,6 +34,8 @@
"metadata": {},
"source": [
"## Overview\n",
+ "If you have an introductory paragraph, lead with it here! Keep it short and tied to your material, then be sure to continue into the required list of topics below,\n",
+ "\n",
"1. This is a numbered list of the specific topics\n",
"1. These should map approximately to your main sections of content\n",
"1. Or each second-level, `##`, header in your notebook\n",
@@ -58,7 +60,7 @@
"| [Understanding of NetCDF](some-link-to-external-resource) | Helpful | Familiarity with metadata structure |\n",
"| Project management | Helpful | |\n",
"\n",
- "- **Experience level**: with relevant packages or general self-assessed experience as **beginner/user/expert**\n",
+ "- **Experience level**: with relevant packages or general self-assessed experience as **beginner/intermediate/advanced**\n",
"- **Time to learn**: estimate in minutes or qualitatively as **long/medium/short**\n",
"- **System requirements**:\n",
" - Populate with any system, version, or non-Python software requirements if necessary\n",
@@ -142,11 +144,7 @@
},
{
"cell_type": "markdown",
- "metadata": {
- "slideshow": {
- "slide_type": "subslide"
- }
- },
+ "metadata": {},
"source": [
"## Your second content section\n",
"Here we can move on to our second objective, and we can demonstrate"
@@ -262,7 +260,7 @@
"## Summary\n",
"Add one final `---` marking the end of your body of content, and then conclude with a brief single paragraph summarizing at a high level the key pieces that were learned and how they tied to your objectives. Look to reiterate what the most important takeaways were.\n",
"\n",
- "### What's Next?\n",
+ "### What's next?\n",
"Let Jupyter book tie this to the next (sequential) piece of content that people could move on to down below and in the sidebar. However, if this page uniquely enables your reader to tackle other nonsequential concepts throughout this book, or even external content, link to it here!"
]
},
@@ -270,7 +268,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
- "## Resources and References\n",
+ "## Resources and references\n",
"Finally, be rigorous in your citations and references as necessary. Give credit where credit is due. Also, feel free to link to relevant external material, further reading, documentation, etc. Then you're done! Give yourself a quick review, a high five, and send us a pull request. A few final notes:\n",
" - `Kernel > Restart Kernel and Run All Cells...` to confirm that your notebook will cleanly run from start to finish\n",
" - `Kernel > Restart Kernel and Clear All Outputs...` before committing your notebook, our machines will do the heavy lifting\n",
@@ -284,7 +282,7 @@
],
"metadata": {
"kernelspec": {
- "display_name": "Python3",
+ "display_name": "Python [conda env:pythia-book-dev]",
"language": "python",
"name": "python3"
},
@@ -300,6 +298,60 @@
"pygments_lexer": "ipython3",
"version": "3.8.10"
},
+ "nbdime-conflicts": {
+ "local_diff": [
+ {
+ "diff": [
+ {
+ "diff": [
+ {
+ "key": 0,
+ "op": "addrange",
+ "valuelist": [
+ "Python 3"
+ ]
+ },
+ {
+ "key": 0,
+ "length": 1,
+ "op": "removerange"
+ }
+ ],
+ "key": "display_name",
+ "op": "patch"
+ }
+ ],
+ "key": "kernelspec",
+ "op": "patch"
+ }
+ ],
+ "remote_diff": [
+ {
+ "diff": [
+ {
+ "diff": [
+ {
+ "key": 0,
+ "op": "addrange",
+ "valuelist": [
+ "Python3"
+ ]
+ },
+ {
+ "key": 0,
+ "length": 1,
+ "op": "removerange"
+ }
+ ],
+ "key": "display_name",
+ "op": "patch"
+ }
+ ],
+ "key": "kernelspec",
+ "op": "patch"
+ }
+ ]
+ },
"toc-autonumbering": false
},
"nbformat": 4,