doc fixes. #2611

Merged: 3 commits, Dec 17, 2018
2 changes: 1 addition & 1 deletion doc/dask.rst
@@ -179,7 +179,7 @@ Explicit conversion by wrapping a DataArray with ``np.asarray`` also works:
Alternatively you can load the data into memory but keep the arrays as
Dask arrays using the :py:meth:`~xarray.Dataset.persist` method:

.. ipython::
.. ipython:: python

ds = ds.persist()

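To make the ``persist`` hunk above easier to follow, here is a minimal, self-contained sketch (not part of the PR) contrasting ``compute`` and ``persist`` on a chunked dataset; the variable name and chunk size are invented for illustration, and dask must be installed.

```python
import numpy as np
import xarray as xr

# synthetic chunked dataset standing in for the docs' `ds`
ds = xr.Dataset(
    {"temperature": (("time", "x"), np.random.rand(365, 1000))}
).chunk({"time": 90})

persisted = ds.persist()  # evaluates the chunks but keeps them as dask arrays
computed = ds.compute()   # returns a new dataset backed by plain NumPy arrays

print(type(persisted.temperature.data))  # dask array
print(type(computed.temperature.data))   # numpy.ndarray
```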
5 changes: 3 additions & 2 deletions doc/examples/multidimensional-coords.rst
@@ -25,7 +25,7 @@ As an example, consider this dataset from the

.. ipython:: python

ds = xr.tutorial.load_dataset('rasm')
ds = xr.tutorial.open_dataset('rasm').load()
ds

In this example, the *logical coordinates* are ``x`` and ``y``, while
@@ -107,7 +107,8 @@ function to specify the output coordinates of the group.
# define a label for each bin corresponding to the central latitude
lat_center = np.arange(1, 90, 2)
# group according to those bins and take the mean
Tair_lat_mean = ds.Tair.groupby_bins('xc', lat_bins, labels=lat_center).mean()
Tair_lat_mean = (ds.Tair.groupby_bins('xc', lat_bins, labels=lat_center)
.mean(xr.ALL_DIMS))
# plot the result
@savefig xarray_multidimensional_coords_14_1.png width=5in
Tair_lat_mean.plot();
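A hedged sketch of the ``groupby_bins`` pattern shown above, with synthetic data in place of the 'rasm' tutorial dataset; ``xr.ALL_DIMS`` is the spelling documented by this PR, while newer xarray releases write it as ``...``.

```python
import numpy as np
import xarray as xr

ny, nx = 20, 30
xc = xr.DataArray(np.random.uniform(0, 90, size=(ny, nx)), dims=("y", "x"))
Tair = xr.DataArray(np.random.rand(ny, nx), dims=("y", "x"),
                    coords={"xc": xc}, name="Tair")

lat_bins = np.arange(0, 91, 2)
lat_center = np.arange(1, 90, 2)  # a label for the centre of each bin

# group by the 2-D latitude coordinate and reduce over all dimensions
# (xr.ALL_DIMS at the time of this PR; `...` in newer xarray)
Tair_lat_mean = (Tair.groupby_bins("xc", lat_bins, labels=lat_center)
                 .mean(xr.ALL_DIMS))
print(Tair_lat_mean)
```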
3 changes: 2 additions & 1 deletion doc/examples/weather-data.rst
@@ -17,7 +17,7 @@ Shared setup:
.. ipython:: python
:suppress:

fpath = "examples/_code/weather_data_setup.py"
fpath = "doc/examples/_code/weather_data_setup.py"
with open(fpath) as f:
code = compile(f.read(), fpath, 'exec')
exec(code)
@@ -123,6 +123,7 @@ The :py:func:`~xarray.Dataset.fillna` method on grouped objects lets you easily
fill missing values by group:

.. ipython:: python
:okwarning:

# throw away the first half of every month
some_missing = ds.tmin.sel(time=ds['time.day'] > 15).reindex_like(ds)
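A hedged sketch of the grouped ``fillna`` pattern this hunk documents, with synthetic daily data standing in for the ``weather_data_setup.py`` dataset (the variable name ``tmin`` is kept, everything else is illustrative).

```python
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2000-01-01", "2000-12-31", freq="D")
tmin = xr.DataArray(np.random.randn(times.size),
                    coords=[("time", times)], name="tmin")

# throw away the first half of every month, as in the docs
some_missing = tmin.sel(time=tmin["time"].dt.day > 15).reindex_like(tmin)

# fill each gap with the mean of its calendar month
climatology = some_missing.groupby("time.month").mean("time")
filled = some_missing.groupby("time.month").fillna(climatology)
print(int(filled.isnull().sum()))  # 0 missing values remain
```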
8 changes: 4 additions & 4 deletions doc/groupby.rst
@@ -118,7 +118,7 @@ dimensions *other than* the provided one:

.. ipython:: python

ds.groupby('x').std()
ds.groupby('x').std(xr.ALL_DIMS)

First and last
~~~~~~~~~~~~~~
@@ -129,7 +129,7 @@ values for group along the grouped dimension:

.. ipython:: python

ds.groupby('letters').first()
ds.groupby('letters').first(xr.ALL_DIMS)

By default, they skip missing values (control this with ``skipna``).

@@ -144,7 +144,7 @@ coordinates. For example:

.. ipython:: python

alt = arr.groupby('letters').mean()
alt = arr.groupby('letters').mean(xr.ALL_DIMS)
alt
ds.groupby('letters') - alt
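To make the ``xr.ALL_DIMS`` changes concrete, a hedged sketch on a small dataset shaped like the one built earlier in ``doc/groupby.rst`` (the values are invented); newer xarray spells ``xr.ALL_DIMS`` as ``...``.

```python
import numpy as np
import xarray as xr

ds = xr.Dataset(
    {"foo": (("x", "y"), np.random.rand(4, 3))},
    coords={"x": [10, 20, 30, 40], "letters": ("x", ["a", "a", "b", "b"])},
)

# xr.ALL_DIMS is the spelling documented by this PR; newer xarray uses `...`
print(ds.groupby("letters").mean(xr.ALL_DIMS))  # one value of foo per letter
print(ds.groupby("letters").std(xr.ALL_DIMS))

# grouped arithmetic: remove each group's mean from its members
alt = ds.groupby("letters").mean(xr.ALL_DIMS)
print(ds.groupby("letters") - alt)
```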

@@ -197,7 +197,7 @@ __ http://cfconventions.org/cf-conventions/v1.6.0/cf-conventions.html#_two_dimen
'lat': (['ny','nx'], [[10,10],[20,20]] ),},
dims=['ny','nx'])
da
da.groupby('lon').sum()
da.groupby('lon').sum(xr.ALL_DIMS)
da.groupby('lon').apply(lambda x: x - x.mean(), shortcut=False)

Because multidimensional groups have the ability to generate a very large
2 changes: 1 addition & 1 deletion doc/internals.rst
@@ -111,7 +111,7 @@ Back in an interactive IPython session, we can use these properties:
.. ipython:: python
:suppress:

exec(open("examples/_code/accessor_example.py").read())
exec(open("doc/examples/_code/accessor_example.py").read())

.. ipython:: python

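Since this hunk only changes the path to ``accessor_example.py``, here is a hedged sketch of the kind of accessor that file registers; the ``geo`` name and ``center`` property follow the accessor example in the xarray internals docs and should be treated as illustrative.

```python
import xarray as xr


@xr.register_dataset_accessor("geo")
class GeoAccessor:
    def __init__(self, xarray_obj):
        self._obj = xarray_obj

    @property
    def center(self):
        """Return the geographic center point of this dataset."""
        lon = self._obj.longitude
        lat = self._obj.latitude
        return (float(lon.mean()), float(lat.mean()))


ds = xr.Dataset(coords={"longitude": [0.0, 10.0], "latitude": [40.0, 50.0]})
print(ds.geo.center)  # (5.0, 45.0)
```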
1 change: 1 addition & 0 deletions doc/pandas.rst
@@ -173,6 +173,7 @@ So you can represent a Panel, in two ways:
Let's take a look:

.. ipython:: python
:okwarning:

panel = pd.Panel(np.random.rand(2, 3, 4), items=list('ab'), major_axis=list('mno'),
minor_axis=pd.date_range(start='2000', periods=4, name='date'))
21 changes: 14 additions & 7 deletions doc/plotting.rst
@@ -144,7 +144,7 @@ axes created by ``plt.subplots``.
plt.tight_layout()

@savefig plotting_example_existing_axes.png width=6in
plt.show()
plt.draw()

On the right is a histogram created by :py:func:`xarray.plot.hist`.
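A hedged, standalone version of the existing-axes pattern referenced in this hunk, with random data in place of the tutorial air temperature; the docs' ``@savefig`` directive is a Sphinx/IPython feature, so it is omitted here.

```python
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

air1d = xr.DataArray(np.sin(np.linspace(0, 10, 100)), dims="time", name="air")

fig, axes = plt.subplots(ncols=2)
air1d.plot(ax=axes[0])       # line plot on the left axes
air1d.plot.hist(ax=axes[1])  # histogram on the right axes
plt.tight_layout()
plt.draw()  # the docs now call plt.draw() rather than plt.show() here
```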

@@ -343,7 +343,7 @@ matplotlib is available.
plt.tight_layout()

@savefig plotting_2d_call_matplotlib.png width=4in
plt.show()
plt.draw()

.. note::

@@ -359,7 +359,7 @@ matplotlib is available.
air2d.plot()

@savefig plotting_2d_call_matplotlib2.png width=4in
plt.show()
plt.draw()

Colormaps
~~~~~~~~~
@@ -444,9 +444,11 @@ if using ``imshow`` or ``pcolormesh`` (but not with ``contour`` or ``contourf``,
since levels are chosen automatically).

.. ipython:: python
:okwarning:

@savefig plotting_seaborn_palette.png width=4in
air2d.plot(levels=10, cmap='husl')
plt.draw()

.. _plotting.faceting:

@@ -519,6 +521,11 @@ Other features

Faceted plotting supports other arguments common to xarray 2d plots.

.. ipython:: python
:suppress:

plt.close('all')

.. ipython:: python

hasoutliers = t.isel(time=slice(0, 5)).copy()
@@ -528,7 +535,7 @@ Faceted plotting supports other arguments common to xarray 2d plots.
@savefig plot_facet_robust.png
g = hasoutliers.plot.pcolormesh('lon', 'lat', col='time', col_wrap=3,
robust=True, cmap='viridis',
cbar_kwargs={'label': 'this has outliers'})
cbar_kwargs={'label': 'this has outliers'})
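A hedged, self-contained version of the faceting snippet above, with random data standing in for the tutorial array ``t``.

```python
import matplotlib.pyplot as plt
import numpy as np
import xarray as xr

t = xr.DataArray(
    np.random.randn(6, 10, 15),
    dims=("time", "lat", "lon"),
    coords={"time": range(6),
            "lat": np.linspace(20, 60, 10),
            "lon": np.linspace(200, 330, 15)},
    name="Tair",
)
hasoutliers = t.isel(time=slice(0, 5)).copy()
hasoutliers[0, 0, 0] = -100   # plant two outliers
hasoutliers[-1, -1, -1] = 100

# robust=True clips the color range to the 2nd-98th percentiles
g = hasoutliers.plot.pcolormesh(x="lon", y="lat", col="time", col_wrap=3,
                                robust=True, cmap="viridis",
                                cbar_kwargs={"label": "this has outliers"})
plt.draw()
```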

FacetGrid Objects
~~~~~~~~~~~~~~~~~
@@ -568,7 +575,7 @@ they have been plotted.
bottomright.annotate('bottom right', (240, 40))

@savefig plot_facet_iterator.png
plt.show()
plt.draw()

TODO: add an example of using the ``map`` method to plot dataset variables
(e.g., with ``plt.quiver``).
@@ -603,7 +610,7 @@ by faceting are accessible in the object returned by ``plot``:
ax.coastlines()
ax.gridlines()
@savefig plotting_maps_cartopy_facetting.png width=100%
plt.show();
plt.draw();


Details
@@ -634,7 +641,7 @@ These are provided for user convenience; they all call the same code.
xplt.line(da, ax=axes[1, 1])
plt.tight_layout()
@savefig plotting_ways_to_use.png width=6in
plt.show()
plt.draw()

Here the output is the same. Since the data is 1 dimensional the line plot
was used.
6 changes: 3 additions & 3 deletions doc/reshaping.rst
@@ -186,8 +186,8 @@ labels for one or several dimensions:
array
array['c'] = ('x', ['a', 'b', 'c'])
array.set_index(x='c')
array.set_index(x='c', inplace=True)
array.reset_index('x', drop=True)
array = array.set_index(x='c')
array = array.reset_index('x', drop=True)
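A hedged sketch of the non-inplace ``set_index``/``reset_index`` style the lines above switch to; the array is rebuilt from scratch so the snippet stands alone.

```python
import xarray as xr

array = xr.DataArray([1, 2, 3], dims="x")
array["c"] = ("x", ["a", "b", "c"])

array = array.set_index(x="c")             # use the 'c' labels to index 'x'
print(array.sel(x="b").item())             # 2
array = array.reset_index("x", drop=True)  # and drop that index again
print(array)
```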

.. _reshape.shift_and_roll:

@@ -201,7 +201,7 @@ To adjust coordinate labels, you can use the :py:meth:`~xarray.Dataset.shift` an

array = xr.DataArray([1, 2, 3, 4], dims='x')
array.shift(x=2)
array.roll(x=2)
array.roll(x=2, roll_coords=True)
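A hedged sketch contrasting the two ``roll_coords`` settings that the line above now spells out explicitly.

```python
import xarray as xr

array = xr.DataArray([1, 2, 3, 4], dims="x", coords={"x": [0, 1, 2, 3]})

print(array.roll(x=2, roll_coords=True))   # data and the x labels both rotate
print(array.roll(x=2, roll_coords=False))  # only the data rotates; labels stay put
```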

.. _reshape.sort:

4 changes: 4 additions & 0 deletions doc/time-series.rst
@@ -163,6 +163,7 @@ Datetime components couple particularly well with grouped operations (see
calculate the mean by time of day:

.. ipython:: python
:okwarning:

ds.groupby('time.hour').mean()

@@ -176,6 +177,7 @@ same api as ``resample`` `in pandas`_.
For example, we can downsample our dataset from hourly to 6-hourly:

.. ipython:: python
:okwarning:

ds.resample(time='6H')

@@ -184,6 +186,7 @@ necessary for resampling. All of the reduction methods which work with
``Resample`` objects can also be used for resampling:

.. ipython:: python
:okwarning:

ds.resample(time='6H').mean()
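A hedged, standalone version of the resampling example, rebuilding a small hourly dataset like the one created earlier in ``doc/time-series.rst``; frequency strings follow the PR-era pandas spelling.

```python
import numpy as np
import pandas as pd
import xarray as xr

time = pd.date_range("2000-01-01", freq="H", periods=48)
ds = xr.Dataset({"foo": ("time", np.arange(48)), "time": time})

print(ds.groupby("time.hour").mean())  # mean by hour of day
print(ds.resample(time="6H").mean())   # downsample from hourly to 6-hourly
```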

@@ -326,6 +329,7 @@ For data indexed by a :py:class:`~xarray.CFTimeIndex` xarray currently supports:
:py:meth:`~xarray.CFTimeIndex.to_datetimeindex` method:

.. ipython:: python
:okwarning:

modern_times = xr.cftime_range('2000', periods=24, freq='MS', calendar='noleap')
da = xr.DataArray(range(24), [('time', modern_times)])
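A hedged completion of the snippet above (requires the ``cftime`` package): converting the ``CFTimeIndex`` to a ``pandas.DatetimeIndex``, which may emit a calendar-conversion warning, hence the ``:okwarning:`` flag added by this PR.

```python
import xarray as xr

modern_times = xr.cftime_range("2000", periods=24, freq="MS", calendar="noleap")
da = xr.DataArray(range(24), [("time", modern_times)])

# convert the noleap CFTimeIndex to a standard-calendar DatetimeIndex
datetimeindex = da.indexes["time"].to_datetimeindex()
da = da.assign_coords(time=datetimeindex)
print(da.time.dtype)  # now a numpy datetime64 dtype
```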
8 changes: 7 additions & 1 deletion doc/whats-new.rst
@@ -174,7 +174,7 @@ Bug fixes
By `Spencer Clark <https://github.com/spencerkclark>`_.
- We now properly handle arrays of ``datetime.datetime`` and ``datetime.timedelta``
provided as coordinates. (:issue:`2512`)
By `Deepak Cherian <https://github.com/dcherian`_.
By `Deepak Cherian <https://github.com/dcherian>`_.
- ``xarray.DataArray.roll`` correctly handles multidimensional arrays.
(:issue:`2445`)
By `Keisuke Fujii <https://github.com/fujiisoup>`_.
@@ -2216,6 +2216,7 @@ Enhancements
for shifting/rotating datasets or arrays along a dimension:

.. ipython:: python
:okwarning:

array = xray.DataArray([5, 6, 7, 8], dims='x')
array.shift(x=2)
@@ -2723,6 +2724,7 @@ Enhancements
need to supply the time dimension explicitly:

.. ipython:: python
:verbatim:

time = pd.date_range('2000-01-01', freq='6H', periods=10)
array = xray.DataArray(np.arange(10), [('time', time)])
@@ -2732,26 +2734,30 @@ Enhancements
options such as ``closed`` and ``label`` let you control labeling:

.. ipython:: python
:verbatim:

array.resample('1D', dim='time', how='sum', label='right')

If the desired temporal resolution is higher than the original data
(upsampling), xray will insert missing values:

.. ipython:: python
:verbatim:

array.resample('3H', 'time')

- ``first`` and ``last`` methods on groupby objects let you take the first or
last examples from each group along the grouped axis:

.. ipython:: python
:verbatim:

array.groupby('time.day').first()

These methods combine well with ``resample``:

.. ipython:: python
:verbatim:

array.resample('1D', dim='time', how='first')

1 change: 1 addition & 0 deletions xarray/backends/api.py
@@ -487,6 +487,7 @@ def open_mfdataset(paths, chunks=None, concat_dim=_CONCAT_DIM_DEFAULT,
"""Open multiple files as a single dataset.
Requires dask to be installed. See documentation for details on dask [1].
Attributes from the first dataset file are used for the combined dataset.

Parameters
----------
paths : str or sequence
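Only a blank line is added to the ``open_mfdataset`` docstring here, so as context, a hedged usage sketch: the file names are invented and written out first to keep the example self-contained (requires dask plus a netCDF backend). ``combine='by_coords'`` is the spelling of newer xarray; at the time of this PR the files were combined via ``auto_combine`` internally.

```python
import numpy as np
import pandas as pd
import xarray as xr

# write two small yearly files to combine (illustrative file names)
for year in (2000, 2001):
    time = pd.date_range(f"{year}-01-01", periods=12, freq="MS")
    xr.Dataset({"foo": ("time", np.arange(12)), "time": time}).to_netcdf(f"foo_{year}.nc")

# open both files lazily as a single dataset, concatenated along 'time'
ds = xr.open_mfdataset("foo_*.nc", combine="by_coords")
print(ds)
```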
2 changes: 2 additions & 0 deletions xarray/core/combine.py
@@ -560,6 +560,7 @@ def auto_combine(datasets, concat_dim=_CONCAT_DIM_DEFAULT,
``auto_combine`` works well if you have N years of data and M data
variables, and each combination of a distinct time period and set of data
variables is saved its own dataset.

Parameters
----------
datasets : sequence of xarray.Dataset
@@ -589,6 +590,7 @@ def auto_combine(datasets, concat_dim=_CONCAT_DIM_DEFAULT,
Details are in the documentation of concat
coords : {'minimal', 'different', 'all' or list of str}, optional
Details are in the documentation of conca

Returns
-------
combined : xarray.Dataset
18 changes: 9 additions & 9 deletions xarray/core/dataarray.py
@@ -997,8 +997,8 @@ def interp(self, coords=None, method='linear', assume_sorted=False,
interpolated: xr.DataArray
New dataarray on the new coordinates.

Note
----
Notes
-----
scipy is required.

See Also
@@ -1053,8 +1053,8 @@ def interp_like(self, other, method='linear', assume_sorted=False,
Another dataarray by interpolating this dataarray's data along the
coordinates of the other object.

Note
----
Notes
-----
scipy is required.
If the dataarray has object-type coordinates, reindex is used for these
coordinates instead of the interpolation.
@@ -2291,13 +2291,13 @@ def quantile(self, q, dim=None, interpolation='linear', keep_attrs=None):
use when the desired quantile lies between two data points
``i < j``:

* linear: ``i + (j - i) * fraction``, where ``fraction`` is
- linear: ``i + (j - i) * fraction``, where ``fraction`` is
the fractional part of the index surrounded by ``i`` and
``j``.
* lower: ``i``.
* higher: ``j``.
* nearest: ``i`` or ``j``, whichever is nearest.
* midpoint: ``(i + j) / 2``.
- lower: ``i``.
- higher: ``j``.
- nearest: ``i`` or ``j``, whichever is nearest.
- midpoint: ``(i + j) / 2``.
keep_attrs : bool, optional
If True, the dataset's attributes (`attrs`) will be copied from
the original object to the new one. If False (default), the new
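To accompany the interpolation-option list reformatted above, a hedged sketch of ``DataArray.quantile``; newer xarray and NumPy rename the ``interpolation`` keyword to ``method``, so the sketch sticks to the default.

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.arange(10.0), dims="x")

print(da.quantile(0.5))                         # default 'linear' interpolation
print(da.quantile([0.25, 0.5, 0.75], dim="x"))  # several quantiles at once
```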
9 changes: 4 additions & 5 deletions xarray/core/dataset.py
@@ -1945,8 +1945,8 @@ def interp(self, coords=None, method='linear', assume_sorted=False,
interpolated: xr.Dataset
New dataset on the new coordinates.

Note
----
Notes
-----
scipy is required.

See Also
@@ -2037,8 +2037,8 @@ def interp_like(self, other, method='linear', assume_sorted=False,
Another dataset by interpolating this dataset's data along the
coordinates of the other object.

Note
----
Notes
-----
scipy is required.
If the dataset has object-type coordinates, reindex is used for these
coordinates instead of the interpolation.
@@ -2548,7 +2548,6 @@ def merge(self, other, inplace=None, overwrite_vars=frozenset(),
'no_conflicts'}, optional
String indicating how to compare variables of the same name for
potential conflicts:

- 'broadcast_equals': all values must be equal when variables are
broadcast against each other to ensure common dimensions.
- 'equals': all values and dimensions must be the same.
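The ``merge`` hunk above only trims a blank line from the ``compat`` description; for context, a hedged sketch of how ``compat`` is passed in practice (the dataset contents are invented).

```python
import xarray as xr

ds1 = xr.Dataset({"a": ("x", [1, 2, 3])}, coords={"x": [0, 1, 2]})
ds2 = xr.Dataset({"b": ("x", [10, 20, 30])}, coords={"x": [0, 1, 2]})

# compat controls how same-named variables are compared before merging
merged = ds1.merge(ds2, compat="no_conflicts")
print(merged)
```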