.reset_index() / .reset_coords() maintain MultiIndex status #8743
Comments
Workaround for issues stemming from pydata/xarray#8743: replicate the source_grid dataset from scratch / from .values before exporting it.
I have the same issue as described above. It is still working with xarray v2023.12.0.
Same issue for me. Maybe related to #6946. Posting the same example here, with the modification of not dropping the reset MultiIndex:
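A minimal sketch of such an example (names are illustrative; it assumes the MultiIndex comes from .stack() and is reset without dropping the level coordinates):

```python
import numpy as np
import xarray as xr

# Small dataset with two dimensions stacked into a MultiIndex coordinate "z".
ds = xr.Dataset(
    {"var": (("x", "y"), np.arange(6.0).reshape(2, 3))},
    coords={"x": [1, 2], "y": [10, 20, 30]},
)
stacked = ds.stack(z=("x", "y"))

# Reset the MultiIndex but keep (do not drop) the former level coordinates.
reset = stacked.reset_index("z")

# On affected versions (e.g. xarray 2024.1.x) this write is still rejected.
reset.to_netcdf("test.nc")
```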
Raises the same error.
I think this was fixed in #8672 and should now work again in the newest version (v2024.02).
Yup, can confirm. Thanks.
Can also confirm this is fixed by #8672, thanks y'all!
What happened?
Trying to save a dataset to NetCDF using ds.to_netcdf() will fail when one of the coordinates is a MultiIndex. The error message suggests using .reset_index() to remove the MultiIndex. However, saving still fails after resetting the index, including after moving the offending coordinates to be data variables instead using .reset_coords().

What did you expect to happen?
After calling .reset_index(), and especially after calling .reset_coords(), the save should be successful. As shown in the example below, a dataset that asserts identical to the dataset that throws the error saves without a problem (this also points to a current workaround: recreate the Dataset from scratch).
Minimal Complete Verifiable Example
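A sketch of the reported workflow, assuming the MultiIndex comes from .stack() (the variable and coordinate names are illustrative, not taken from the original example):

```python
import numpy as np
import xarray as xr

# Dataset whose two dimensions are stacked into a single MultiIndex coordinate.
ds = xr.Dataset(
    {"temperature": (("lat", "lon"), np.zeros((2, 3)))},
    coords={"lat": [10.0, 20.0], "lon": [100.0, 110.0, 120.0]},
)
stacked = ds.stack(cell=("lat", "lon"))

# Remove the MultiIndex as the error message suggests, then also demote the
# former level coordinates to plain data variables.
reset = stacked.reset_index("cell").reset_coords(["lat", "lon"])

try:
    # On affected versions (e.g. xarray 2024.1.1) this still fails with the
    # MultiIndex error, even though no MultiIndex remains in reset.indexes.
    reset.to_netcdf("fails.nc")
except Exception as err:
    print(type(err).__name__, err)

# Rebuilding an identical-looking dataset from plain values saves fine.
rebuilt = xr.Dataset(
    {name: (v.dims, v.values) for name, v in reset.data_vars.items()},
    coords={name: (c.dims, c.values) for name, c in reset.coords.items()},
)
xr.testing.assert_identical(reset, rebuilt)
rebuilt.to_netcdf("works.nc")
```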
MVCE confirmation
Relevant log output
Anything else we need to know?
This is a recent error that came up in some automated tests; an older version of it is still working, so xarray v2023.1.0 does not have this issue. Given that saving works with a dataset that xr.testing.assert_identical() asserts is identical to the dataset that fails, and that ds.indexes() no longer shows a MultiIndex on the dataset that fails, perhaps the issue is in the error itself, i.e., in xarray.conventions.ensure_not_multiindex? Looks like it was added recently in f9f4c73 to address another bug.
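A short, illustrative check of the mismatch described above (all names are made up for the sketch; only the path xarray.conventions.ensure_not_multiindex comes from the report):

```python
import numpy as np
import pandas as pd
import xarray as xr

# Build, stack, and reset a toy dataset the same way as in the report.
ds = xr.Dataset(
    {"var": (("x", "y"), np.zeros((2, 2)))},
    coords={"x": [0, 1], "y": [10, 20]},
).stack(z=("x", "y"))
reset = ds.reset_index("z").reset_coords(["x", "y"])

# The public index listing no longer contains any pandas MultiIndex ...
print(any(isinstance(idx, pd.MultiIndex) for idx in reset.indexes.values()))  # False

# ... yet on affected versions reset.to_netcdf() is still rejected by the check
# in xarray.conventions.ensure_not_multiindex, which is the point made above.
```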
Environment
INSTALLED VERSIONS
commit: None
python: 3.12.1 | packaged by conda-forge | (main, Dec 23 2023, 08:05:03) [Clang 16.0.6 ]
python-bits: 64
OS: Darwin
OS-release: 22.6.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: None
LOCALE: (None, 'UTF-8')
libhdf5: None
libnetcdf: None
xarray: 2024.1.1
pandas: 2.2.0
numpy: 1.26.3
scipy: 1.12.0
netCDF4: None
pydap: None
h5netcdf: None
h5py: None
Nio: None
zarr: None
cftime: None
nc_time_axis: None
iris: None
bottleneck: None
dask: None
distributed: None
matplotlib: 3.8.2
cartopy: None
seaborn: None
numbagg: None
fsspec: None
cupy: None
pint: None
sparse: 0.15.1
flox: None
numpy_groupies: None
setuptools: 69.0.3
pip: 24.0
conda: None
pytest: 7.4.0
mypy: None
IPython: 8.21.0
sphinx: None