I think dask array has some utility functions for "unifying chunks" that we might be able to use inside our map_blocks() function.
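For reference, a minimal sketch of that utility, using dask.array.core.unify_chunks on two made-up arrays (the array names here are just for illustration):

```python
import dask.array as da
from dask.array.core import unify_chunks

# Two arrays sharing a dimension but chunked differently.
x = da.ones(20, chunks=5)    # chunks along "i": (5, 5, 5, 5)
y = da.ones(20, chunks=10)   # chunks along "i": (10, 10)

# unify_chunks rechunks every input onto a common chunking for shared indices.
chunkss, (x2, y2) = unify_chunks(x, "i", y, "i")
print(chunkss)                 # {'i': (5, 5, 5, 5)}
print(x2.chunks, y2.chunks)    # ((5, 5, 5, 5),) ((5, 5, 5, 5),)
```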
Potentially we could also make Dataset.chunks more robust, e.g., have it return None for dimensions with inconsistent chunk sizes rather than raising an error.
Alternatively, we could enforce matching chunksizes on all dask arrays inside a Dataset, as part of xarray's model of a Dataset as a collection of aligned arrays. But this seems unnecessarily limiting, and I am reluctant to add extra complexity to xarray's data model.
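A rough sketch of what the more forgiving Dataset.chunks could behave like, written as a standalone helper rather than a patch to xarray (lenient_chunks is a hypothetical name):

```python
from typing import Dict, Hashable, Optional, Tuple

import xarray as xr

def lenient_chunks(ds: xr.Dataset) -> Dict[Hashable, Optional[Tuple[int, ...]]]:
    """Like Dataset.chunks, but dimensions whose variables disagree on
    chunk sizes map to None instead of raising an error."""
    chunks: Dict[Hashable, Optional[Tuple[int, ...]]] = {}
    for var in ds.variables.values():
        if var.chunks is None:          # skip non-dask variables
            continue
        for dim, dim_chunks in zip(var.dims, var.chunks):
            if dim not in chunks:
                chunks[dim] = dim_chunks
            elif chunks[dim] not in (None, dim_chunks):
                chunks[dim] = None      # inconsistent along this dimension
    return chunks
```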
`(map_ds + map_ds.y)` gives `y` a chunksize of 20 for `c` and 5 for `a`. This seems reasonable, except that `(map_ds + map_ds.y).chunks` raises an "Inconsistent chunks" error. I ran into this while writing tests for `map_blocks`.
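For context, `map_ds` isn't defined in this issue (it comes from the tests being written); a hypothetical stand-in showing the same asymmetry (arithmetic succeeds, `.chunks` raises) might look like this:

```python
import numpy as np
import xarray as xr

# Hypothetical stand-in for map_ds: two variables share dimension "a"
# but are chunked differently, which xarray currently allows.
ds = xr.Dataset({"x": ("a", np.arange(20)), "y": ("a", np.arange(20))})
ds["x"] = ds.x.chunk({"a": 5})
ds["y"] = ds.y.chunk({"a": 10})

result = ds + ds.y   # works: dask unifies chunks within each arithmetic op
result.chunks        # raises the "inconsistent chunks" ValueError
```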