
combine_by_coords can succeed when it shouldn't #4824


Open
mathause opened this issue Jan 18, 2021 · 15 comments
Labels
bug topic-combine combine/concat/merge

Comments

@mathause
Collaborator

What happened:

combine_by_coords can succeed when it should not - depending on the name of the dimensions (which determines the order of operations in combine_by_coords).

What you expected to happen:

  • I think it should throw an error in both cases.

Minimal Complete Verifiable Example:

import numpy as np
import xarray as xr


data = np.arange(5).reshape(1, 5)
x = np.arange(5)
x_name = "lat"

da0 = xr.DataArray(data, dims=("t", x_name), coords={"t": [1], x_name: x}).to_dataset(name="a")
x = x + 1e-6
da1 = xr.DataArray(data, dims=("t", x_name), coords={"t": [2], x_name: x}).to_dataset(name="a")
ds = xr.combine_by_coords((da0, da1))

ds

returns:

<xarray.Dataset>
Dimensions:  (lat: 10, t: 2)
Coordinates:
  * lat      (lat) float64 0.0 1e-06 1.0 1.0 2.0 2.0 3.0 3.0 4.0 4.0
  * t        (t) int64 1 2
Data variables:
    a        (t, lat) float64 0.0 nan 1.0 nan 2.0 nan ... 2.0 nan 3.0 nan 4.0

Thus lat is interlaced - I don't think combine_by_coords should do this. If you set

x_name = "x"

and run the example again, it returns:

ValueError: Resulting object does not have monotonic global indexes along dimension x

Anything else we need to know?:

if not all(index.equals(indexes[0]) for index in indexes[1:]):

cc @dcherian @TomNicholas

Environment:

Output of xr.show_versions()
@TomNicholas
Member

Thanks @mathause. I'm not sure exactly how it ends up with that erroneous result, but I think it should be caught by adding the same check that would fix #4077 - i.e. when it realises that 4 is not < 1e-06 it would throw an error.

@mathause
Collaborator Author

mathause commented Feb 1, 2021

combine_by_coords does a merge and a concat (in the _combine_1d function). The order of these operations makes a difference to the result. See details.

merge leads to interlaced coords

In [2]: xr.merge([da0, da1])
Out[2]: 
<xarray.Dataset>
Dimensions:  (lat: 10, t: 2)
Coordinates:
  * lat      (lat) float64 0.0 1e-06 1.0 1.0 2.0 2.0 3.0 3.0 4.0 4.0
  * t        (t) int64 1 2
Data variables:
    a        (t, lat) float64 0.0 nan 1.0 nan 2.0 nan ... 2.0 nan 3.0 nan 4.0

concat leads to - well - concatenated coords

In [5]: xr.concat([da0, da1], dim="lat")
Out[5]: 
<xarray.Dataset>
Dimensions:  (lat: 10, t: 2)
Coordinates:
  * t        (t) int64 1 2
  * lat      (lat) float64 0.0 1.0 2.0 3.0 4.0 1e-06 1.0 2.0 3.0 4.0
Data variables:
    a        (t, lat) float64 0.0 1.0 2.0 3.0 4.0 nan ... 0.0 1.0 2.0 3.0 4.0

Yes, I think I'd be interested in fixing this...


One question on the scope of combine_by_coords - consider the following example with 4 arrays ("a" through "d") arranged as follows:

aabbb
cccdd

Or as Dataset:

ds_a = xr.Dataset({"data": (("y", "x"), [[1, 2]]), "x": [0, 1], "y": [0]})
ds_b = xr.Dataset({"data": (("y", "x"), [[3, 4, 5]]), "x": [2, 3, 4], "y": [0]})

ds_c = xr.Dataset({"data": (("y", "x"), [[11, 12, 14]]), "x": [0, 1, 2], "y": [1]})
ds_d = xr.Dataset({"data": (("y", "x"), [[14, 15]]), "x": [3, 4], "y": [1]})

xr.combine_by_coords([ds_a, ds_b, ds_c, ds_d]).data

Should that work or not?

Current behaviour

Currently it yields the following:

<xarray.DataArray 'data' (y: 2, x: 5)>
array([[ 1,  2,  3,  4,  5],
       [11, 12, 14, 14, 15]])
Coordinates:
  * x        (x) int64 0 1 2 3 4
  * y        (y) int64 0 1

However, if "x" is renamed to "z" it fails ("resulting coords not monotonic").

@TomNicholas
Member

Thanks for these examples @mathause, they're useful.

combine_by_coords does a merge and a concat (in the _combine_1d function). The order of these operations makes a difference to the result. See details.

Not sure if this is what you meant, but to be clear: _combine_1d is used by both combine_by_coords and combine_nested - whether it does a merge or a concat is dictated by the concat_dim argument passed. In both combine_by_coords and combine_nested a hypercube (or multiple hypercubes) of datasets is assembled, before each hypercube is collapsed into a single dataset by _combine_nd.

Within combine_by_coords, _combine_1d will only ever do concatenation, because _infer_concat_order_from_coords will only ever return a list of real dimensions as the concat_dims (as opposed to a None, which combine_nested can accept from the user - a None means "please merge along this axis of the hypercube I gave you"). combine_by_coords then finishes off by merging the possibly multiple collapsed hypercubes (following the same logic as the original 1D auto_combine that it replaced).
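
To illustrate the None case (a minimal sketch with made-up two-variable datasets):

import xarray as xr

ds_u = xr.Dataset({"u": ("x", [1, 2])}, coords={"x": [0, 1]})
ds_v = xr.Dataset({"v": ("x", [3, 4])}, coords={"x": [0, 1]})

# concat_dim=[None] means "merge along this axis of the hypercube"
xr.combine_nested([ds_u, ds_v], concat_dim=[None])

# a real dimension name makes _combine_1d concatenate instead
ds_lo = xr.Dataset({"u": ("x", [1, 2])}, coords={"x": [0, 1]})
ds_hi = xr.Dataset({"u": ("x", [3, 4])}, coords={"x": [2, 3]})
xr.combine_nested([ds_lo, ds_hi], concat_dim=["x"])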

merge leads to interlaced coords
concat leads to - well - concatenated coords

As far as I can see, concat is behaving perfectly sensibly here, but merge is behaving in a way that conflicts with its documentation, which says "xarray.MergeError is raised if you attempt to merge two variables with the same name but different values". In da0 and da1, lat has the same name but slightly different values - a MergeError should have been thrown, and combine_* would then have raised it too.
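
For reference, forcing an exact index comparison does surface the conflict (da0/da1 from the first example; the exact error wording varies between versions):

xr.merge([da0, da1], join="exact")
# raises a ValueError because the 'lat' indexes are not equal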

One question on the scope of combine_by_coords ...
aabbb
cccdd

This is a good question, but I'm 99% sure I didn't intend for either combine function to handle this case. The problem is that it's not order-invariant: you could concatenate aa and bbb along x fine, and ccc and dd fine, and then the two results can be concatenated along y (i.e. concat_dims=['x', 'y'], x then y). But if you tried y first, then a and c have different sizes along the x direction, so you can't use concat on them (or you shouldn't be able to, without introducing NaNs). It's possible that the checks are not sufficient to catch this case though.

Try comparing it to the behaviour of combine_nested: I think you'll find that it's only possible to arrange those datasets into the expected list-of-lists hypercube structure when doing x first then y, not when doing y first then x.
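
Concretely, with ds_a through ds_d from above (the first element of concat_dim corresponds to the outermost nesting level):

# x first, then y: each inner list concatenates cleanly along x,
# then the two results concatenate along y
xr.combine_nested([[ds_a, ds_b], [ds_c, ds_d]], concat_dim=["y", "x"])

# y first, then x is impossible: ds_a and ds_c have different lengths
# along x, so no valid list-of-lists arrangement exists for ["x", "y"]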

@TomNicholas
Member

tl;dr, I currently think there are two issues:

  1. xr.merge has just allowed something through when it should have thrown an error,
  2. Passing a sort of ragged hypercube can slip past the checks in combine_by_coords, though it was never intended to handle this case. This limitation also isn't very clearly documented.

@mathause
Collaborator Author

mathause commented Feb 2, 2021

Thanks for the thorough answer! So we need to figure out whether there is a "bug" in merge or whether this has to be solved in combine_by_coords.

Disallowing "ragged hypercubes" certainly makes the problem much easier. Then we can say: "if two coords have the same start they need to be equal" (I think).

@TomNicholas
Member

TomNicholas commented Feb 2, 2021

we need to figure out if there is a "bug" in merge

I don't actually know - this behaviour of merge seems wrong to me but it might be allowed given the changes to compatibility checks since I wrote combine_by_coords. @dcherian do you know?

Then we can say: "if two coords have the same start they need to be equal" (I think).

Yes exactly. Well, more specifically: "if they have the same start they need to be equal in length, otherwise it's a ragged hypercube; and if they have the same start and equal length but different values in the middle, then it's a valid hypercube but an inconsistent dataset". It is currently assumed that the user passes a valid hypercube with consistent data - see this comment in _infer_concat_order_from_coords:

# Assume that any two datasets whose coord along dim starts
# with the same value have the same coord values throughout.

Though I realise now that I don't think this assumption is made explicit anywhere in the docs; they just talk about coords being monotonic.

If people pass "dodgy" hypercubes then this can currently fail in multiple ways (including silently), but the reason we didn't simply check that the coords are completely equal throughout is that you would then have to load all the actual values from the files, which could incur a significant performance cost. Adding a check of just the last value of each coord would help considerably (it should solve #4077), but unless we check every value there will always be a way to silently produce a nonsense result by feeding in inconsistent data. We might consider some kind of flag for whether or not these checks are done, defaulting to on, which users can turn off if they trust their data but want more speed.
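
A rough sketch of what such a cheap-but-not-exhaustive check could look like (a hypothetical helper, not the current code):

def check_coord_consistency(indexes, check_all_values=False):
    # indexes: one pandas Index per dataset, all for the same dimension coord
    first = indexes[0]
    for index in indexes[1:]:
        if index[0] != first[0]:
            continue  # different start: a different position in the hypercube
        if len(index) != len(first):
            raise ValueError("ragged hypercube: same start but different lengths")
        if index[-1] != first[-1]:
            raise ValueError("inconsistent coords: same start but different end")
        if check_all_values and not index.equals(first):
            # the expensive check - requires loading all the values
            raise ValueError("inconsistent coords: values differ in the middle")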

@dcherian
Contributor

dcherian commented Feb 2, 2021

This is the result of join="outer", which expands both datasets, and compat="no_conflicts", which picks the non-NaN value. These are both default values.

You can avoid this by setting join="override", which will skip the check.
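
I.e. the interlaced result above is exactly what the defaults spell out (da0/da1 from the first example):

xr.merge([da0, da1], join="outer", compat="no_conflicts")  # same interlaced result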

@TomNicholas
Member

Thanks @dcherian.

So passing join='override' to combine_by_coords gives this

<xarray.Dataset>
Dimensions:  (lat: 5, t: 2)
Coordinates:
  * lat      (lat) int64 0 1 2 3 4
  * t        (t) int64 1 2
Data variables:
    a        (t, lat) int64 0 1 2 3 4 0 1 2 3 4

which seems fine - that solves issue (1) doesn't it?

Your other example though, @mathause, of allowing certain ragged arrays to pass through - presumably we still need a new check of some kind to disallow that?

@mathause
Collaborator Author

But then it also overrides when it should concatenate...

import numpy as np
import xarray as xr

data = np.arange(5).reshape(1, 5)
x = np.arange(5)
x_name = "lat"

da0 = xr.DataArray(data, dims=("t", x_name), coords={"t": [1], x_name: x}).to_dataset(name="a")
x = x + 5
da1 = xr.DataArray(data, dims=("t", x_name), coords={"t": [2], x_name: x}).to_dataset(name="a")
ds = xr.combine_by_coords((da0, da1), join="override")

returns:

<xarray.Dataset>
Dimensions:  (lat: 5, t: 2)
Coordinates:
  * lat      (lat) int64 0 1 2 3 4
  * t        (t) int64 1 2
Data variables:
    a        (t, lat) int64 0 1 2 3 4 0 1 2 3 4

Again if we set x_name = "y" the following is returned:

<xarray.Dataset>
Dimensions:  (t: 1, y: 10)
Coordinates:
  * t        (t) int64 1
  * y        (y) int64 0 1 2 3 4 5 6 7 8 9
Data variables:
    a        (t, y) int64 0 1 2 3 4 0 1 2 3 4



@max-sixty
Collaborator

Is there a reasonable way of preventing the ambiguity in override? Do we need to make the concat_dim explicit?

@TomNicholas
Member

I honestly don't understand override well enough to reason about this at the moment. What would help me, at least, is a set of unit tests for this case. @mathause, what exactly do we want it to do that it does not currently do?

@dcherian
Contributor

dcherian commented May 13, 2021

For concat, join applies to all indexes that are not being concatenated (e.g. concat over time but override floating point differences in latitude). This is easy to understand for combine_nested.
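
For example (hypothetical ds_t0/ds_t1, two time slices whose lat coords differ only by float noise):

xr.combine_nested([ds_t0, ds_t1], concat_dim="time", join="override")
# concatenates along time; the lat labels are taken from ds_t0 without comparison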

I don't think it makes sense at all for combine_by_coords. If you want to override coordinate label values for some dimensions, you should have to do it manually. It seems easy to end up in weird edge cases when you combine that with auto-guessing behaviour.

@TomNicholas
Member

Thanks @dcherian, that helps. It does seem silly to be like "I'm going to use the coordinates to decide how everything should be concatenated, but I'm also happy to change some of those coordinate values whilst I'm combining". Does that mean we should just not allow join='override' for combine_by_coords? I don't think that solves @mathause's original problem though.

@dcherian
Contributor

Does that mean we should just not allow join='override' for combine_by_coords?

I think so.

I suspect the original problem can only be solved by allowing approximate alignment within a specified tolerance.
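
Something along the lines of the following, sketched with numpy (not an existing xarray option):

import numpy as np

# accept da1's lat as "the same" as da0's if within an absolute tolerance
np.allclose(da0.lat, da1.lat, rtol=0, atol=1e-5)  # True for the 1e-6 offset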

@mathause
Collaborator Author

I'll add the example from #4753 (comment) here. I am not 100% sure whether that is the same problem or not:

def combine_by_coords_problem(name, join="override"):

    ds0 = xr.Dataset(coords={"x1": [1, 2, 3], name: [10, 20, 30]})
    ds1 = xr.Dataset(coords={"x1": [4, 5, 6], name: [40, 50, 60]})

    return xr.combine_by_coords([ds0, ds1], join=join)

combine_by_coords_problem("x0") # concatenates 1, 2, 3, 4, 5, 6
combine_by_coords_problem("x2") # concatenates 10, 20, 30, 40, 50, 60

with join=

  • "exact": error
  • "left", "right", "inner", "override": ambiguous result
  • "outer": sensible result
