Feature request: xarray.Dataset.from_dask_dataframe #3929
I've been trying to implement this and have managed to create a `DataArray`. The modifications I've made so far are adding the following above line 400 in `dataarray.py`:

```python
shape = tuple(
    dim_size.compute() if hasattr(dim_size, "compute") else dim_size
    for dim_size in data.shape
)

coords = tuple(
    coord.compute() if hasattr(coord, "compute") else coord
    for coord in coords
)
```

and making a similar replacement on line 403. The issue I have is that when I then want to use the `DataArray` and do something like `da.sel(datetime='2020')` I get:

```
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
<ipython-input-23-5d739a721388> in <module>
----> 1 da.sel(datetime='2020')

~\anaconda3\envs\DataHub\lib\site-packages\xarray\core\dataarray.py in sel(self, indexers, method, tolerance, drop, **indexers_kwargs)
   1219
   1220         """
-> 1221         ds = self._to_temp_dataset().sel(
   1222             indexers=indexers,
   1223             drop=drop,

~\anaconda3\envs\DataHub\lib\site-packages\xarray\core\dataarray.py in _to_temp_dataset(self)
    499
    500     def _to_temp_dataset(self) -> Dataset:
--> 501         return self._to_dataset_whole(name=_THIS_ARRAY, shallow_copy=False)
    502
    503     def _from_temp_dataset(

~\anaconda3\envs\DataHub\lib\site-packages\xarray\core\dataarray.py in _to_dataset_whole(self, name, shallow_copy)
    551
    552         coord_names = set(self._coords)
--> 553         dataset = Dataset._construct_direct(variables, coord_names, indexes=indexes)
    554         return dataset
    555

~\anaconda3\envs\DataHub\lib\site-packages\xarray\core\dataset.py in _construct_direct(cls, variables, coord_names, dims, attrs, indexes, encoding, file_obj)
    959         """
    960         if dims is None:
--> 961             dims = calculate_dimensions(variables)
    962         obj = object.__new__(cls)
    963         obj._variables = variables

~\anaconda3\envs\DataHub\lib\site-packages\xarray\core\dataset.py in calculate_dimensions(variables)
    207                     "conflicting sizes for dimension %r: "
    208                     "length %s on %r and length %s on %r"
--> 209                     % (dim, size, k, dims[dim], last_used[dim])
    210                 )
    211             return dims

ValueError: conflicting sizes for dimension 'datetime': length nan on <this-array> and length 90386 on 'datetime'
```

This occurs because the underlying dask array's shape contains `nan` for the uncomputed `datetime` dimension, which conflicts with the concrete length of the `datetime` coordinate. I'm assuming there's an alternative way to construct the `DataArray` that avoids this, but I haven't found it yet.
For context, this is the function I'm using to convert the dask DataFrame to a DataArray:

```python
import xarray as xr


def from_dask_dataframe(df, index_name=None, columns_name=None):
    def extract_dim_name(df, dim="index"):
        # Fall back to a generic name ("index"/"columns") when none is set
        if getattr(df, dim).name is None:
            getattr(df, dim).name = dim
        return getattr(df, dim).name

    if index_name is None:
        index_name = extract_dim_name(df, "index")
    if columns_name is None:
        columns_name = extract_dim_name(df, "columns")

    da = xr.DataArray(df, coords=[df.index, df.columns], dims=[index_name, columns_name])
    return da


df.index.name = "datetime"
df.columns.name = "fueltypes"

da = from_dask_dataframe(df)
```

I'm also conscious that my question is different to @raybellwaves', as they were asking about Dataset creation and I'm interested in creating a DataArray, which requires different functionality. I'm assuming this is the correct place to post though, as @keewis closed my issue and linked to this one.
Thanks for investigating and working on this, @AyrtonB. I indeed think this is the correct place to discuss this: your use case can probably be implemented by converting to a `Dataset` first.
It sounds like making this work well would require xarray to support "unknown" dimension sizes throughout the codebase. This would be a nice feature to have, but it would likely require pervasive changes. The other option would be to explicitly compute the shape when converting from dask dataframes, by calling `compute()` on any lazy sizes. (xref data-apis/array-api#97)
One of the things I was hoping to include in my approach is the preservation of the column dimension names; however, this would be lost with that approach.

Thanks for the advice @shoyer, I reached a similar opinion and so have been working on the dim-compute route. The issue is that a dask array's shape uses `np.nan` for uncomputed dimensions, rather than a delayed object like the dask dataframe's shape. I looked into returning the dask dataframe rather than the dask array, but this didn't feel like it fit with the rest of the code, and it produced another issue, as dask dataframes don't have a `dtype` attribute. I'll continue to look into alternatives.
I've added a PR for the new feature, but it's currently failing tests as the test suite doesn't seem to have dask installed. Any advice on how to get this PR ready for merging would be appreciated.
I created a function for this which works pretty well.
The method `xarray.Dataset.to_dask_dataframe` exists, and I would like to make a feature request for the opposite: `xarray.Dataset.from_dask_dataframe`. The conversation started over at dask, but it was suggested by @shoyer to implement it here first: dask/dask#6058