
Fix NotImplementedError: xarray can't set arrays with multiple array indices to dask yet. #115

@tomvothecoder

Description

What versions of software are you using?

  • Package Version: latest master

What are the steps to reproduce this issue?

```python
import xcdat

# specify chunks to open with Dask
ds = xcdat.open_dataset("path/to/file.nc", var="tas", chunks={"time": 5})

# call spatial averaging and receive the NotImplementedError below
ts_n34 = ds.spatial.avg("tas", axis=["lat", "lon"], lat_bounds=(-5, 5), lon_bounds=(-170, -120))["tas"]
```

What happens? Any logs, error output, etc?

```python
NotImplementedError: xarray can't set arrays with multiple array indices to dask yet.
---------------------------------------------------------------------------
NotImplementedError                       Traceback (most recent call last)
~/Documents/Repositories/tomvothecoder/xcdat/qa/PR98/refactor.py in <module>
      24 # ---------------------
      25 # NotImplementedError: xarray can't set arrays with multiple array indices to dask yet.
----> 26 avg = ds.spatial.avg("tas", axis, lat_bounds=lat_bounds, lon_bounds=lon_bounds)

~/Documents/Repositories/tomvothecoder/xcdat/xcdat/spatial_avg.py in avg(self, data_var_name, axis, weights, lat_bounds, lon_bounds)
    137             if lon_bounds is not None:
    138                 self._validate_region_bounds("lon", lon_bounds)
--> 139             weights = self._get_weights(axis, lat_bounds, lon_bounds)
    140 
    141         self._validate_weights(data_var, axis, weights)

~/Documents/Repositories/tomvothecoder/xcdat/xcdat/spatial_avg.py in _get_weights(self, axis, lat_bounds, lon_bounds)
    270                             domain_bounds, reg_bounds
    271                         )
--> 272                     dom_bounds = self._scale_domain_to_region(dom_bounds, reg_bounds)
    273 
    274                 if dim == "lat":

~/Documents/Repositories/tomvothecoder/xcdat/xcdat/spatial_avg.py in _scale_domain_to_region(self, domain_bounds, region_bounds)
    420             # Case 1: not wrapping around prime meridian.
    421             # Adjustments for above / right of region.
--> 422             d_bounds[d_bounds[:, 0] > r_bounds[1], 0] = r_bounds[1]
    423             d_bounds[d_bounds[:, 1] > r_bounds[1], 1] = r_bounds[1]
    424             # Adjustments for below / left of region.

/opt/miniconda3/envs/xcdat_dev/lib/python3.8/site-packages/xarray/core/dataarray.py in __setitem__(self, key, value)
    765                 for k, v in self._item_key_to_dict(key).items()
    766             }
--> 767             self.variable[key] = value
    768 
    769     def __delitem__(self, key: Any) -> None:

/opt/miniconda3/envs/xcdat_dev/lib/python3.8/site-packages/xarray/core/variable.py in __setitem__(self, key, value)
    852 
    853         indexable = as_indexable(self._data)
--> 854         indexable[index_tuple] = value
    855 
    856     @property

/opt/miniconda3/envs/xcdat_dev/lib/python3.8/site-packages/xarray/core/indexing.py in __setitem__(self, key, value)
   1244                 )
   1245                 if num_non_slices > 1:
-> 1246                     raise NotImplementedError(
   1247                         "xarray can't set arrays with multiple "
   1248                         "array indices to dask yet."

NotImplementedError: xarray can't set arrays with multiple array indices to dask yet.
```
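
For context, the failure comes from xarray's `__setitem__` handling of dask-backed variables: assigning through more than one array index is not implemented (at least with the xarray/dask versions in the traceback above). A minimal, illustrative sketch of the pattern, not xcdat's actual bounds data:

```python
import dask.array as da
import numpy as np
import xarray as xr

# Illustrative (n, 2) domain bounds backed by dask chunks.
d_bounds = xr.DataArray(
    da.from_array(np.array([[0.0, 60.0], [60.0, 120.0], [120.0, 180.0]]), chunks=(1, 2)),
    dims=["lon", "bnds"],
)

# Same pattern as `_scale_domain_to_region`: a boolean array index plus a
# column index is two array indices, which dask-backed setitem rejects.
d_bounds[d_bounds[:, 0] > 60.0, 0] = 60.0  # NotImplementedError
```

The same assignment works when the data is numpy-backed, which is why the error only appears when `chunks` is passed to `open_dataset`.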

What were you expecting to happen?
Spatial averaging should operate on the chunked (dask-backed) data in parallel without raising an error.

Any other comments?
Figure out a way to perform the scaling with xarray operations that work on dask-backed arrays (i.e., avoid in-place assignment with multiple array indices), as sketched below.
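
One possible direction (a rough sketch, not a tested fix): replace the masked in-place assignments in `_scale_domain_to_region` with `xr.where`, which computes element-wise and stays lazy on dask arrays. The function name and signature below are illustrative:

```python
import xarray as xr

def scale_domain_to_region(d_bounds: xr.DataArray, r_bounds) -> xr.DataArray:
    """Clamp domain bounds to the region without multi-index assignment."""
    r_lower, r_upper = r_bounds
    # Adjustments for above / right of region (replaces the masked column
    # assignments with a single element-wise operation).
    d_bounds = xr.where(d_bounds > r_upper, r_upper, d_bounds)
    # Adjustments for below / left of region.
    d_bounds = xr.where(d_bounds < r_lower, r_lower, d_bounds)
    return d_bounds
```

This only covers the non-wrapping case shown in the traceback (Case 1); the prime-meridian wrap adjustments would need the same dask-friendly treatment.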

Labels

type: bug
