What happened?
Some common aggregations (at least `min`, `max`, `var`, `std`) are broken with complex dtypes if numbagg is installed with the default `skipna=True`. It looks like this has been the case since #8624.
What did you expect to happen?
We should either route to `bottleneck` or `numpy` here if these aggregations aren't supported in numbagg, or get them working with numbagg.
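One possible shape for that routing guard, as a sketch only (the function name and the exact dtype checks are assumptions, not xarray's actual internals): test the array's dtype before choosing the numbagg path and fall back otherwise.

```python
import numpy as np

def can_use_numbagg(values: np.ndarray) -> bool:
    """Hypothetical guard: numbagg's compiled kernels handle real
    numeric data, so fall back to numpy/bottleneck for complex
    (and object) dtypes."""
    return not (
        np.issubdtype(values.dtype, np.complexfloating)
        or values.dtype == object
    )

print(can_use_numbagg(np.ones(2, dtype=np.complex128)))  # False -> use the fallback
print(can_use_numbagg(np.ones(2, dtype=np.float64)))     # True  -> numbagg is fine
```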
Minimal Complete Verifiable Example
```python
import xarray as xr
import numpy as np

# note: np.complex_ was removed in NumPy 2.0; np.complex128 is the same dtype
da = xr.DataArray(np.ones((2,), dtype=np.complex128), dims=["x"])
da.min(skipna=False)  # works
da.min()  # fails
```
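Until this is fixed, one workaround is to disable the numbagg path for the affected reductions. This assumes your xarray version exposes the `use_numbagg` option in `xr.set_options` (available in recent releases):

```python
import numpy as np
import xarray as xr

da = xr.DataArray(np.ones((2,), dtype=np.complex128), dims=["x"])

# Disabling numbagg makes xarray fall back to its numpy-based
# skipna machinery, which handles complex dtypes.
with xr.set_options(use_numbagg=False):
    result = da.min()  # no longer raises
```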
MVCE confirmation
[X] Minimal example — the example is as focused as reasonably possible to demonstrate the underlying issue in xarray.
[X] Complete example — the example is self-contained, including all data and the text of any traceback.
[X] Verifiable example — the example copy & pastes into an IPython prompt or Binder notebook, returning the result.
[X] New issue — a search of GitHub Issues suggests this is not a duplicate.
[X] Recent environment — the issue occurs with the latest version of xarray and its dependencies.
Relevant log output
```
TypeError: ufunc '__numbagg_transformed_func' not supported for the input types, and the inputs could not be safely coerced to any supported types according to the casting rule ''safe''
```
Anything else we need to know?
cc @max-sixty
Environment