Bump xarray from 2024.3.0 to 2025.1.2 #378
Conversation
Bumps [xarray](https://github.com/pydata/xarray) from 2024.3.0 to 2025.1.2.
- [Release notes](https://github.com/pydata/xarray/releases)
- [Changelog](https://github.com/pydata/xarray/blob/main/HOW_TO_RELEASE.md)
- [Commits](pydata/xarray@v2024.03.0...v2025.01.2)

---
updated-dependencies:
- dependency-name: xarray
  dependency-type: direct:production
  update-type: version-update:semver-major
...

Signed-off-by: dependabot[bot] <[email protected]>
@@ -808,6 +809,8 @@ def prep(
        if damages[v].dtype == object:
            damages[v] = damages[v].astype("unicode")

+       damages["gcm"] = damages["gcm"].astype("U" + str(max_gcm_len))
It seems like xarray previously updated the datatype automatically when appending additional GCMs. With the update to xarray 2025.1.2 this broke, and the code threw an error when trying to append a GCM with a shorter name. This change ensures that all GCM labels use the same datatype.
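For context, here's a minimal NumPy sketch of the width mismatch being described and the explicit-width cast that resolves it; the GCM names are invented and this is illustrative only, not the code from this PR:

```python
import numpy as np

# Invented GCM label arrays whose longest names differ in length.
gcms_a = np.array(["ACCESS1-0", "CanESM2"])  # dtype "<U9"
gcms_b = np.array(["CCSM4"])                 # dtype "<U5"

# Appending labels with the narrower dtype to a store written with the wider
# one is the kind of mismatch described above. Casting everything to one
# explicit width removes it:
max_gcm_len = max(len(g) for g in np.concatenate([gcms_a, gcms_b]))
gcms_a = gcms_a.astype("U" + str(max_gcm_len))
gcms_b = gcms_b.astype("U" + str(max_gcm_len))
assert gcms_a.dtype == gcms_b.dtype == np.dtype("<U9")
```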
astype("unicode")
is pretty old (like, python 2) way of handling this because python strings at the time were not natively unicode. Would swapping all these over to astype("str")
fix this and hopefully not create trouble elsewhere? I think this would use a variable-length string representation -- meaning you don't need all this manual checking for max string length in a GCM name.
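For reference, a small sketch (again with made-up GCM names) of what these casts produce on a plain NumPy object array; it only illustrates the dtypes involved, not the project's code, and assumes a NumPy version that still accepts the `"unicode"` alias:

```python
import numpy as np

# Hypothetical GCM names, just to show what the casts produce.
names = np.array(["ACCESS1-0", "CCSM4"], dtype=object)

# Both spellings resolve to NumPy's fixed-width unicode dtype, sized to the
# longest element of this particular array.
print(names.astype("unicode").dtype)  # <U9 (older alias for the str_ dtype)
print(names.astype(str).dtype)        # <U9

# An explicit width, as in the max_gcm_len cast above, pins the dtype
# independently of which names happen to be in the array.
print(names.astype("U20").dtype)      # <U20
```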
Codecov Report
All modified and coverable lines are covered by tests ✅

Additional details and impacted files

@@           Coverage Diff            @@
##            main     #378     +/-   ##
========================================
+ Coverage   68.91%   68.94%   +0.03%
========================================
  Files          15       15
  Lines        1750     1752       +2
========================================
+ Hits         1206     1208       +2
  Misses        544      544

☔ View full report in Codecov by Sentry.
) # the first value only should be censored
assert as_share[0] != 5 and [as_share[1], as_share[2]] == [30, 60]
all_cons = menu_instance.weitzman_min(
-   [20, 50, 100], [5, 30, 60], all_cons
+   np.array([20, 50, 100]), np.array([5, 30, 60]), all_cons
A test was failing because of the plain Python lists. Nowhere in the code do we call `weitzman_min` with anything except an xarray array, so I thought it would be okay to "lose" that functionality. In fact, the docstring of `weitzman_min` says it expects an xarray array anyway.
That seems good enough, especially if that's how this is documented. A more involved workaround may be another issue/PR. What was it erroring on? Do you have a copy of the error anywhere?
Here's the error:
>       as_share = menu_instance.weitzman_min(
            [20, 50, 100], [5, 30, 60], as_share
        )  # the first value only should be censored
tests/test_main_recipe.py:260:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/dscim/menu/main_recipe.py:910: in weitzman_min
clipped_cons = xr.where(
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/xarray/computation/computation.py:737: in where
result = apply_ufunc(
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/xarray/computation/apply_ufunc.py:1280: in apply_ufunc
return apply_array_ufunc(func, *args, dask=dask)
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/xarray/computation/apply_ufunc.py:891: in apply_array_ufunc
return func(*args)
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/xarray/core/duck_array_ops.py:387: in where
return xp.where(condition, *as_shared_dtype([x, y], xp=xp))
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/xarray/core/duck_array_ops.py:281: in as_shared_dtype
dtype = dtypes.result_type(*scalars_or_arrays, xp=xp)
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/xarray/core/dtypes.py:267: in result_type
array_api_compat.result_type(preprocess_types(t), xp=xp)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
xp = <module 'numpy' from '/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/numpy/__init__.py'>
arrays_and_dtypes = ([5, 30, 60],)
def result_type(*arrays_and_dtypes, xp) -> np.dtype:
    if xp is np or any(
        isinstance(getattr(t, "dtype", t), np.dtype) for t in arrays_and_dtypes
    ):
>       return xp.result_type(*arrays_and_dtypes)
E       TypeError: Field elements must be 2- or 3-tuples, got '5'
/home/jonahmgilbert/anaconda3/envs/dscim-testing/lib/python3.12/site-packages/xarray/compat/array_api_compat.py:44: TypeError
Cool. Thanks.
I think the original fix is fine.
Another, more aggressive approach would be to cast `no_cc_consumption` and `cc_consumption` to `xr.DataArray` at the start of the function, like:
no_cc_consumption = xr.DataArray(no_cc_consumption)
cc_consumption = xr.DataArray(cc_consumption)
That might have side effects in edge cases, and it feels a bit magical for an internal-ish function. It could be a fallback if this PR's fix accidentally breaks things.
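A rough sketch of that alternative, using a simplified stand-in for `weitzman_min`; the clipping logic below is invented for illustration and is not the actual dscim implementation:

```python
import numpy as np
import xarray as xr

def weitzman_min_sketch(no_cc_consumption, cc_consumption, share=0.5):
    """Illustrative stand-in only: the body is made up, the point is the
    up-front casting of the inputs."""
    # Plain lists, NumPy arrays, and existing DataArrays all become
    # DataArrays here, so the xr.where below never sees bare lists.
    no_cc_consumption = xr.DataArray(no_cc_consumption)
    cc_consumption = xr.DataArray(cc_consumption)

    floor = share * no_cc_consumption
    return xr.where(cc_consumption < floor, floor, cc_consumption)

# Both call styles would then go through the same code path:
print(weitzman_min_sketch([20, 50, 100], [5, 30, 60]).values)
print(weitzman_min_sketch(np.array([20, 50, 100]), np.array([5, 30, 60])).values)
```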
Thanks for this fix!
I raised some points in response to your inline comments, but I don't think any of that discussion should hold back this PR just for my sake.
I was also worried about breaking compatibility with older versions of xarray, given how far behind this is, but these changes seem innocuous.
Thanks again for the PR.
Bumps xarray from 2024.3.0 to 2025.1.2.
Release notes
Sourced from xarray's releases.
... (truncated)
Commits
- d8d1d9e release notes: 2025.01.2 (#10007)
- 326dbe7 Add FAQ answer about API stability & backwards compatibility (#9855)
- 5b3f127 Preserve order of variables in combine_by_coords (#9070)
- 97a4a71 fix typing (#10006)
- f91306a cast numpy scalars to arrays in NamedArray.from_array (#10008)
- e84e421 Use freq='D' instead of freq='d' in pd.timedelta_range (#10004)
- d7ac79a Fix the push method when the limit parameter is bigger than the chunk… (#9940)
- 5fdceff Migrate Zarr region="auto" tests to a class (#9990)
- 018b241 Fix infer_freq, check for subdtype "datetime64"/"timedelta64" (#9999)
- 8ccf1cb Revert "Use flox for grouped first, last (#9986)" (#10001)

You can trigger a rebase of this PR by commenting `@dependabot rebase`.

Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR:
- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)