pydata / xarray

N-D labeled arrays and datasets in Python
https://xarray.dev
Apache License 2.0

Groupby reduction with nd labels and a subset of dims returns factorized labels #9803

Open max-sixty opened 1 day ago

max-sixty commented 1 day ago

What happened?

When grouping by a coordinate that spans multiple dimensions and then reducing over only a subset of those dimensions, the returned group coordinate contains the factorized integer codes (0, 1) rather than the actual group labels.

See the MCVE below.

What did you expect to happen?

No response

Minimal Complete Verifiable Example

import xarray as xr

d = xr.DataArray(
    [[1.0, 1.0], [1.0, 1.0]], dims=["a", "b"], coords=[[1, 2], [11, 12]]
).assign_coords(g=(("a", "b"), [[52, 42], [52, 42]]))

d

<xarray.DataArray (a: 2, b: 2)> Size: 32B
array([[1., 1.],
       [1., 1.]])
Coordinates:
  * a        (a) int64 16B 1 2
  * b        (b) int64 16B 11 12
    g        (a, b) int64 32B 52 42 52 42

This is as expected:

d.groupby('g').sum()

<xarray.DataArray (g: 2)> Size: 16B
array([2., 2.])
Coordinates:
  * g        (g) int64 16B 42 52   

But then we get g (g) int64 16B 0 1 if we do .sum('a'):

d.groupby('g').sum('a')

<xarray.DataArray (b: 2, g: 2)> Size: 32B
array([[nan,  2.],
       [ 2., nan]])
Coordinates:
  * b        (b) int64 16B 11 12
  * g        (g) int64 16B 0 1    # HERE: these are the factorized labels, just `0` & `1`!

Notably, removing either of the conditions makes it work fine:

# single dimension coord

d = xr.DataArray(
    [[1.0, 1.0], [1.0, 1.0]], dims=["a", "b"], coords=[[1, 2], [11, 12]]
).assign_coords(g=(("a",), [52, 42]))

d.groupby('g').sum('a')

<xarray.DataArray (g: 2, b: 2)> Size: 32B
array([[1., 1.],
       [1., 1.]])
Coordinates:
  * b        (b) int64 16B 11 12
  * g        (g) int64 16B 42 52

# or with the original but summing over all dims:

d.groupby('g').sum(...)

<xarray.DataArray (g: 2)> Size: 16B
array([2., 2.])
Coordinates:
  * g        (g) int64 16B 42 52

MVCE confirmation

Relevant log output

No response

Anything else we need to know?

Happens on both current and earlier versions of xarray, so it doesn't seem to be a regression from the recent groupby changes.

Environment

INSTALLED VERSIONS
------------------
commit: 339ed932eac7f123410382a2037f844dd852f251
python: 3.11.10 (main, Sep 7 2024, 01:03:31) [Clang 15.0.0 (clang-1500.3.9.4)]
python-bits: 64
OS: Darwin
OS-release: 23.6.0
machine: arm64
processor: arm
byteorder: little
LC_ALL: en_US.UTF-8
LANG: None
LOCALE: ('en_US', 'UTF-8')
libhdf5: 1.14.3
libnetcdf: 4.9.2

xarray: 2024.9.1.dev32+gece582dd
pandas: 2.2.2
numpy: 2.0.2
scipy: 1.14.1
netCDF4: 1.7.1.post2
pydap: None
h5netcdf: 1.3.0
h5py: 3.11.0
zarr: 2.18.3
cftime: 1.6.4
nc_time_axis: 1.4.1
iris: None
bottleneck: 1.4.0
dask: 2024.8.2
distributed: 2024.8.2
matplotlib: 3.9.2
cartopy: None
seaborn: 0.13.2
numbagg: 0.8.1
fsspec: 2024.9.0
cupy: None
pint: None
sparse: None
flox: 0.9.12
numpy_groupies: 0.11.2
setuptools: 69.2.0
pip: 24.0
conda: None
pytest: 8.3.3
mypy: 1.11.2
IPython: 8.24.0
sphinx: None
dcherian commented 1 day ago

Hmmm... we've lost an error message somewhere in all the groupby PRs I've been pushing. Note it doesn't work at all without flox:

ValueError: cannot reduce over dimensions ['a']. expected either '...' to reduce over all dimensions or one or more of ('stacked_a_b',).
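
For reference, a rough sketch of how that error can be reproduced without flox, assuming the `d` from the MCVE above:

import xarray as xr

with xr.set_options(use_flox=False):
    d.groupby('g').sum('a')  # raises the ValueError quoted above
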
max-sixty commented 1 day ago

we've lost an error message somewhere in all the groupby PRs I've been pushing

As in, we should fail but we don't?

Note it doesn't work at all without flox

Good point...

For completeness, stacking is a reasonable workaround (with or without flox)

with xr.set_options(use_flox=False):
    print(d.groupby('g').sum('stacked_a_b'))

<xarray.DataArray (g: 2)> Size: 16B
array([2., 2.])
Coordinates:
  * g        (g) int64 16B 42 52
dcherian commented 1 day ago

Yes, we should be erroring with the same message in both cases.

For completeness, stacking is a reasonable workaround (with or without flox)

This isn't the same! You can't apply a subset of dims without .map(lambda g: g.unstack().mean(dim))
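
For illustration, a rough sketch of that .map-based workaround (using the `d` and `g` from the MCVE above; the exact output shape may differ from the flox path):

# unstack each group back to (a, b) before reducing over only 'a'
d.groupby('g').map(lambda grp: grp.unstack().sum('a'))
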

max-sixty commented 1 day ago

This isn't the same! You can't apply a subset of dims without .map(lambda g: g.unstack().mean(dim))

Ah yes, ofc

Yes we should be erroring with same message in both cases.

Am I right in thinking we're pretty close to the correct result though? We're just missing putting the labels on?

(Obviously it would still be work, and erroring in the meantime is reasonable; but is my assessment correct that the difficult piece already works correctly?)
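
For what it's worth, a rough sketch of a post-hoc relabelling, assuming the factorized codes 0/1 follow the sorted order of the unique group values (as the outputs above suggest):

import numpy as np

res = d.groupby('g').sum('a')
# assumption: code 0 -> 42, code 1 -> 52, i.e. sorted unique values of d.g
res = res.assign_coords(g=np.unique(d.g.values))
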

dcherian commented 1 day ago

Am I right in thinking we're pretty close to the correct result though? We're just missing putting the labels on?

Yes correct, flox supports this. Though I think .mean('a') fails ...

I wanted consistent behaviour in both code paths, that's all...

dcherian commented 1 day ago

Note this is a dupe of #1013, though this one is clearer about the end goal.

max-sixty commented 6 hours ago

Yes correct, flox supports this. Though I think .mean('a') fails ...

You're correct!

Note this is a dupe of #1013 though this one is clearer about the end-goal.

Ah OK, now I understand #1013 more clearly :) Shall we close this and link from that?