
Commit 05152be

Authored by yutik-nn (Yuta Norden) and pre-commit-ci[bot]

Edits to "Xarray and Dask" (#177)

* mean var used in an exercise before intro
* added a sentence clarifying /tmp

Co-authored-by: Yuta Norden <[email protected]>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
1 parent b81bca6 commit 05152be

File tree

1 file changed

+14
-19
lines changed


intermediate/xarray_and_dask.ipynb

Lines changed: 14 additions & 19 deletions
@@ -230,9 +230,6 @@
 "cell_type": "code",
 "execution_count": null,
 "metadata": {
-"jupyter": {
-"outputs_hidden": true
-},
 "tags": []
 },
 "outputs": [],
@@ -246,16 +243,13 @@
 "source": [
 "### Exercise\n",
 "\n",
-"Try calling `mean.values` and `mean.data`. Do you understand the difference?"
+"Try calling `ds.air.values` and `ds.air.data`. Do you understand the difference?"
 ]
 },
 {
 "cell_type": "code",
 "execution_count": null,
 "metadata": {
-"jupyter": {
-"outputs_hidden": true
-},
 "tags": []
 },
 "outputs": [],
@@ -331,7 +325,7 @@
 "2. `.load()` replaces the dask array in the xarray object with a numpy array.\n",
 " This is equivalent to `ds = ds.compute()`\n",
 " \n",
-"**Tip:** There is a third option : \"persisting\". `.persist()` loads the values into distributed RAM. The values are computed but remain distributed across workers. So `ds.air.persist()` still returns a dask array. This is useful if you will be repeatedly using a dataset for computation but it is too large to load into local memory. You will see a persistent task on the dashboard. See the [dask user guide](https://docs.dask.org/en/latest/api.html#dask.persist) for more on persisting\n"
+"**Tip:** There is a third option : \"persisting\". `.persist()` loads the values into distributed RAM. The values are computed but remain distributed across workers. So `ds.air.persist()` still returns a dask array. This is useful if you will be repeatedly using a dataset for computation but it is too large to load into local memory. You will see a persistent task on the dashboard. See the [dask user guide](https://docs.dask.org/en/latest/api.html#dask.persist) for more on persisting"
 ]
 },
 {
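The tip in the hunk above distinguishes `.compute()` (concrete result) from `.persist()` (computed, but still a dask collection). A dask-only sketch of that distinction, invented for illustration and runnable without a cluster:

```python
import dask.array as darr

# A lazy computation over a chunked array
x = darr.ones((1000, 1000), chunks=(250, 250))
total = x.sum()

result = total.compute()  # computes and returns a concrete value
kept = total.persist()    # computes too, but stays a dask collection

print(float(result))
print(type(kept) is type(total))  # persist preserves the collection type
```

With a distributed `Client`, `persist()` keeps the computed chunks in worker memory, which is what makes repeated use of a too-large-for-local-RAM dataset cheap.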
@@ -347,9 +341,6 @@
 "cell_type": "code",
 "execution_count": null,
 "metadata": {
-"jupyter": {
-"outputs_hidden": true
-},
 "tags": []
 },
 "outputs": [],
@@ -446,7 +437,10 @@
 "\n",
 "You can use any kind of Dask cluster. This step is completely independent of\n",
 "xarray. While not strictly necessary, the dashboard provides a nice learning\n",
-"tool."
+"tool.\n",
+"\n",
+"By default, Dask uses the current working directory for writing temporary files.\n",
+"We choose to use a temporary scratch folder `local_directory='/tmp'` in the example below instead."
 ]
 },
 {
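The sentence added above explains that Dask writes temporary (spill) files to the current working directory unless told otherwise. Besides the `Client(local_directory='/tmp')` keyword that appears in this diff, the same scratch location can be set through Dask's configuration system; a sketch, assuming a POSIX-style `/tmp`:

```python
import dask

# Point Dask's scratch space at /tmp instead of the current working directory
dask.config.set({"temporary-directory": "/tmp"})

print(dask.config.get("temporary-directory"))  # -> /tmp
```

Setting this before creating a `Client` has the same effect as passing the keyword, and also applies to the non-distributed schedulers.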
@@ -464,10 +458,17 @@
 "# if os.environ.get('JUPYTERHUB_USER'):\n",
 "# dask.config.set(**{\"distributed.dashboard.link\": \"/user/{JUPYTERHUB_USER}/proxy/{port}/status\"})\n",
 "\n",
-"client = Client(local_directory='/tmp')\n",
+"client = Client()\n",
 "client"
 ]
 },
+{
+"cell_type": "code",
+"execution_count": null,
+"metadata": {},
+"outputs": [],
+"source": []
+},
 {
 "cell_type": "markdown",
 "metadata": {},
@@ -483,9 +484,6 @@
 "cell_type": "code",
 "execution_count": null,
 "metadata": {
-"jupyter": {
-"outputs_hidden": true
-},
 "tags": []
 },
 "outputs": [],
@@ -539,9 +537,6 @@
 "cell_type": "code",
 "execution_count": null,
 "metadata": {
-"jupyter": {
-"outputs_hidden": true
-},
 "tags": []
 },
 "outputs": [],

0 commit comments
