
Add deprecation warnings for lock kwarg #5237


Closed · wants to merge 6 commits
5 changes: 5 additions & 0 deletions doc/whats-new.rst
@@ -108,6 +108,11 @@ Breaking changes
Deprecations
~~~~~~~~~~~~

- The ``lock`` keyword argument to :py:func:`open_dataset` and :py:func:`open_dataarray` is now
  deprecated and will emit a warning if passed; from the next version it will raise an error.
  This is part of the refactor to support external backends (:issue:`5073`).
  By `Tom Nicholas <https://github.com/TomNicholas>`_.

Bug fixes
~~~~~~~~~
- Properly support :py:meth:`DataArray.ffill`, :py:meth:`DataArray.bfill`, :py:meth:`Dataset.ffill`, :py:meth:`Dataset.bfill` along chunked dimensions.
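As a quick illustration of the deprecation described in the changelog entry above, here is a sketch (not part of this PR; the file name, the tiny dataset, and the assumption that a netCDF backend such as netcdf4 or scipy is installed are all made up for the example) showing the warning being raised when `lock` is passed:

```python
# Sketch only: demonstrates the DeprecationWarning this PR adds.
import warnings

import xarray as xr

# Write a tiny dataset so open_dataset has something to read.
xr.Dataset({"x": ("dim", [1, 2, 3])}).to_netcdf("example.nc")

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    ds = xr.open_dataset("example.nc", lock=True)  # 'lock' is now ignored

assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```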
29 changes: 19 additions & 10 deletions xarray/backends/api.py
@@ -1,4 +1,5 @@
import os
import warnings
from glob import glob
from io import BytesIO
from numbers import Number
@@ -444,11 +445,6 @@ def open_dataset(

- 'group': path to the netCDF4 group in the given file to open given as
a str, supported by "netcdf4", "h5netcdf", "zarr".
- 'lock': resource lock to use when reading data from disk. Only
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
"pynio", "pseudonetcdf", "cfgrib".

See engine open function for kwargs accepted by each specific engine.

@@ -474,6 +470,15 @@ def open_dataset(
"all other options must be passed as keyword arguments"
)

    # TODO remove after v0.19
    if kwargs.pop("lock", None):
        warnings.warn(
            "The kwarg 'lock' has been deprecated, and is now "
            "ignored. In the future passing lock will "
            "raise an error.",
            DeprecationWarning,
        )
Comment on lines +473 to +480
Collaborator

should we include the target version in the warning and remove the comment?

Member Author

I did that originally but Max said above that it might be better to leave it out... I think this is fine to be honest.

Collaborator

hmm... in any case, I think it would be good to keep the error message for open_dataset and open_dataarray synchronized

Member Author

We could always add a specific date of deprecation later on, even if that would extend the cycle. It doesn't really matter if this check just sits here for a while I suppose.

Oh I didn't mean to make them different! Will fix now.
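
For reference, a sketch of the suggestion discussed above (not the code this PR merges): spell the target version out in the message itself rather than in a TODO comment. The exact wording and the version are assumptions based on the `# TODO remove after v0.19` comment in the diff.

```python
# Sketch of the reviewer's suggestion, not the code merged in this PR:
# name the removal version in the message instead of a TODO comment.
if kwargs.pop("lock", None):
    warnings.warn(
        "The kwarg 'lock' has been deprecated and is now ignored; "
        "passing it will raise an error after xarray v0.19.",
        DeprecationWarning,
    )
```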


    if cache is None:
        cache = chunks is None

@@ -628,11 +633,6 @@ def open_dataarray(

- 'group': path to the netCDF4 group in the given file to open given as
a str, supported by "netcdf4", "h5netcdf", "zarr".
- 'lock': resource lock to use when reading data from disk. Only
relevant when using dask or another form of parallelism. By default,
appropriate locks are chosen to safely read and write files with the
currently active dask scheduler. Supported by "netcdf4", "h5netcdf",
"pynio", "pseudonetcdf", "cfgrib".

See engine open function for kwargs accepted by each specific engine.

@@ -655,6 +655,15 @@ def open_dataarray(
"all other options must be passed as keyword arguments"
)

    # TODO remove after v0.19
    if kwargs.pop("lock", None):
        warnings.warn(
            "The kwarg 'lock' has been deprecated, and is now "
            "ignored. In the future passing lock will "
            "raise an error.",
            DeprecationWarning,
        )

    dataset = open_dataset(
        filename_or_obj,
        decode_cf=decode_cf,
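
Finally, a minimal sketch of how the new warnings could be exercised in a test. No such test appears in the diff shown here; the file name, dataset, and pytest fixture usage are illustrative, and a netCDF backend (e.g. netcdf4 or scipy) is assumed to be installed.

```python
# Sketch only, not part of this PR's diff.
import pytest
import xarray as xr


def test_lock_kwarg_warns(tmp_path):
    path = tmp_path / "example.nc"
    xr.Dataset({"x": ("dim", [1, 2, 3])}).to_netcdf(path)

    with pytest.warns(DeprecationWarning, match="lock"):
        xr.open_dataset(path, lock=True)

    with pytest.warns(DeprecationWarning, match="lock"):
        xr.open_dataarray(path, lock=True)
```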