
Fix "Chunksize cannot exceed dimension size" #1707


Merged
merged 3 commits into pydata:master from encoding-bad-chunks on Nov 13, 2017

Conversation

@shoyer (Member) commented Nov 11, 2017

@shoyer force-pushed the encoding-bad-chunks branch from 589bb1a to f5c1b57 on November 11, 2017, 03:28
    chunks_too_big = any(
        c > d for c, d in zip(encoding['chunksizes'], variable.shape))
    changed_shape = encoding.get('original_shape') != variable.shape
    if chunks_too_big or changed_shape:
        del encoding['chunksizes']
A reviewer (Member) commented on this hunk:

This looks fine. Can you add a comment here that explains that we are dropping the encoding chunksizes so that netCDF4-python can write this dataset?
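
A minimal sketch of the requested change; the comment wording below is illustrative, not the PR's exact text:

    chunks_too_big = any(
        c > d for c, d in zip(encoding['chunksizes'], variable.shape))
    changed_shape = encoding.get('original_shape') != variable.shape
    if chunks_too_big or changed_shape:
        # Drop the stored chunksizes so that netCDF4-python can write this
        # dataset without raising "Chunksize cannot exceed dimension size".
        del encoding['chunksizes']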

    del encoding['chunksizes']

    if not raise_on_invalid and 'chunksizes' in encoding:
        chunks_too_big = any(
            c > d for c, d in zip(encoding['chunksizes'], variable.shape))
A reviewer (Member) commented on this hunk:

It looks like you'll need to make sure encoding['chunksizes'] is an iterable of length variable.ndim.
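
A hypothetical guard along those lines, using the names from the surrounding snippet; the exact check is an assumption, not necessarily the merged code:

    # Hypothetical: only trust 'chunksizes' when it supplies one entry per
    # dimension. zip() silently truncates, so a too-short sequence would
    # otherwise be compared against only some of the dimensions.
    chunksizes = encoding.get('chunksizes')
    if chunksizes is not None and len(chunksizes) == variable.ndim:
        chunks_too_big = any(
            c > d for c, d in zip(chunksizes, variable.shape))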

@shoyer mentioned this pull request on Nov 12, 2017.
@shoyer (Member, Author) commented Nov 12, 2017

I updated the logic to only drop chunksizes when the corresponding dimension is no longer unlimited, which is the case where the error comes up. This allows us to faithfully keep chunksizes for unlimited dimensions (e.g., if you directly read and re-write a complete netCDF file).
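
A sketch of the described check, assuming unlimited_dims is a set of the names of the file's unlimited dimensions (that variable name is an assumption here, not necessarily the merged code):

    # A chunk may legitimately exceed the *current* length of an unlimited
    # dimension, so only flag oversized chunks along fixed dimensions.
    chunks_too_big = any(
        c > d and dim not in unlimited_dims
        for c, d, dim in zip(
            encoding['chunksizes'], variable.shape, variable.dims))
    changed_shape = encoding.get('original_shape') != variable.shape
    if chunks_too_big or changed_shape:
        del encoding['chunksizes']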

@shoyer merged commit bea202d into pydata:master on Nov 13, 2017
@jklymak (Contributor) commented Nov 27, 2017

I had this error in 0.9.5. 0.10.0 definitely fixes it. Thanks!
