pydata / xarray

N-D labeled arrays and datasets in Python
https://xarray.dev
Apache License 2.0

Lazy concatenation of arrays #4628

Open nbren12 opened 3 years ago

nbren12 commented 3 years ago

Is your feature request related to a problem? Please describe. Concatenating xarray objects forces the data to load. I recently learned about xarray's internal lazy indexing machinery, which allows lazy indexing into DataArrays/Datasets without using dask. Concatenation along a single dimension is the inverse operation of slicing, so it seems natural to support it as well. Moreover, concatenating along a new dimension (e.g. "run"/"simulation"/"ensemble") is a common merging workflow.

Describe the solution you'd like

xr.concat([a, b], dim=...) does not load any data in a or b.

Describe alternatives you've considered One could rename the variables in a and b to allow them to be merged (e.g. a['air_temperature'] -> "air_temperature_a"), but it's more natural to make a new dimension.

Additional context

This is useful when not using dask for performance reasons (e.g. using another parallelism engine like Apache Beam).

shoyer commented 3 years ago

If you write something like xarray.concat(..., data_vars='minimal', coords='minimal'), dask should be entirely lazy -- the non-laziness only happens with the default value of coords='different'.
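For illustration, a minimal sketch of that workaround (the dataset contents, variable name foo, and shapes here are invented for the example; with dask-backed inputs the same call avoids the eager comparison that coords='different' triggers):

```python
import numpy as np
import xarray as xr

# Two datasets that agree on every dimension except the concat dimension "time".
ds1 = xr.Dataset(
    {"foo": (("time", "x"), np.zeros((2, 3)))},
    coords={"time": [0, 1], "x": [10, 20, 30]},
)
ds2 = xr.Dataset(
    {"foo": (("time", "x"), np.ones((2, 3)))},
    coords={"time": [2, 3], "x": [10, 20, 30]},
)

# 'minimal' skips the eager equality check that the default
# coords='different' performs, so dask-backed variables stay lazy.
combined = xr.concat([ds1, ds2], dim="time", data_vars="minimal", coords="minimal")
```

With chunked (dask) inputs, no values are computed by this call; here the in-memory arrays just make the example self-contained.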

But I agree, it would be nice if Xarray's internal lazy indexing machinery supported concatenation. It currently does not.

LunarLanding commented 2 years ago

Any pointers regarding where to start / modules involved to implement this? I would like to have a try.

dcherian commented 2 years ago

From @rabernat in #6588:

Right now, if I want to concatenate multiple datasets (e.g. as in open_mfdataset), I have two options:

In pseudocode:

ds1 = xr.open_dataset("some_big_lazy_source_1.nc")
ds2 = xr.open_dataset("some_big_lazy_source_2.nc")
item1 = ds1.foo[0, 0, 0]  # lazily access a single item
ds = xr.concat([ds1.chunk(), ds2.chunk()], "time")  # only way to lazily concat
# trying to access the same item will now trigger loading of all of ds1
item1 = ds.foo[0, 0, 0]
# yes I could use different chunks, but the point is that I should not have to 
# arbitrarily choose chunks to make this work

However, I am increasingly encountering scenarios where I would like to lazily concatenate datasets (without loading into memory), but also without the requirement of using dask. This would be useful, for example, for creating composite datasets that point back to an OPeNDAP server, preserving the possibility of granular lazy access to any array element without the requirement of arbitrary chunking at an intermediate stage.

Describe the solution you'd like

I propose to extend our LazilyIndexedArray classes to support simple concatenation and stacking. The result of applying concat to such arrays will be a new LazilyIndexedArray that wraps the underlying arrays into a single object.

The main difficulty in implementing this will probably be with indexing: the concatenated array will need to understand how to map global indexes to the underlying individual array indexes. That is a little tricky but eminently solvable.
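The index mapping described above can be sketched in a few lines of plain Python (the function name and signature are invented for illustration): a cumulative-offset table over the piece lengths plus a binary search maps a global index along the concat axis to a (piece, local index) pair.

```python
from bisect import bisect_right
from itertools import accumulate

def map_global_index(i, lengths):
    """Map global index i along the concat axis to (piece_number, local_index).

    lengths: sizes of the concatenated pieces along that axis.
    """
    offsets = list(accumulate(lengths))       # e.g. [4, 7, 12] for lengths [4, 3, 5]
    k = bisect_right(offsets, i)              # which piece owns index i
    local = i - (offsets[k - 1] if k else 0)  # index within that piece
    return k, local

# Global index 5 of the concatenated array lands in piece 1 at local index 1.
print(map_global_index(5, [4, 3, 5]))  # (1, 1)
```

Slices and fancy indexers would need the same translation applied per piece, which is where most of the real implementation effort would go.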

Describe alternatives you've considered

The alternative is to structure your code in a way that avoids needing to lazily concatenate arrays. That is what we do now. It is not optimal.

nbren12 commented 2 years ago

@rabernat It seems that great minds think alike ;)

rabernat commented 2 years ago

Any pointers regarding where to start / modules involved to implement this? I would like to have a try.

The starting point would be to look at the code in indexing.py and try to understand how lazy indexing works.

In particular, look at

https://github.com/pydata/xarray/blob/3920c48d61d1f213a849bae51faa473b9c471946/xarray/core/indexing.py#L465-L470

Then you may want to try writing a class that looks like

class LazilyConcatenatedArray:  # have to decide what to inherit from

    def __init__(self, *arrays: LazilyIndexedArray, concat_axis=0):
        # figure out what you need to keep track of
        ...

    @property
    def shape(self):
        # figure out how to determine the total shape
        ...

    def __getitem__(self, indexer) -> LazilyIndexedArray:
        # figure out how to map an indexer to the right piece of data
        ...
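To make the shape of that skeleton concrete, here is a dependency-light sketch (the class name ConcatenatedView is invented; it wraps any array-likes exposing .shape and integer indexing, here plain NumPy arrays rather than xarray's LazilyIndexedArray, and only handles integer indexers along axis 0 to stay short):

```python
from bisect import bisect_right

import numpy as np

class ConcatenatedView:
    """Sketch of a lazily concatenated view along axis 0.

    Wraps the pieces without copying; indexing is forwarded to the
    piece that owns the requested row.
    """

    def __init__(self, *arrays):
        self._arrays = arrays
        # cumulative end-offsets of each piece along the concat axis
        self._offsets, total = [], 0
        for a in arrays:
            total += a.shape[0]
            self._offsets.append(total)

    @property
    def shape(self):
        return (self._offsets[-1],) + self._arrays[0].shape[1:]

    def __getitem__(self, key):
        i, *rest = key if isinstance(key, tuple) else (key,)
        k = bisect_right(self._offsets, i)             # piece owning row i
        local = i - (self._offsets[k - 1] if k else 0)
        return self._arrays[k][(local, *rest)] if rest else self._arrays[k][local]

# No data is copied: the view simply forwards each lookup.
a = np.arange(6).reshape(2, 3)       # rows 0-1 of the combined array
b = np.arange(6, 15).reshape(3, 3)   # rows 2-4
v = ConcatenatedView(a, b)
print(v.shape)   # (5, 3)
print(v[2, 0])   # 6 -> served from b without touching a
```

A real implementation inside xarray would also need to translate slices and vectorized indexers piece by piece, and to interoperate with the existing ExplicitIndexer classes in indexing.py.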

ilan-gold commented 4 weeks ago

I would +1 your great proposal @dcherian. My use case would be extension arrays: being able to concat them without dask is basically a must for us.

Can't do this without virtual concat machinery (https://github.com/pydata/xarray/issues/4628) which someone decided to implement elsewhere 🙄 ;)

From https://github.com/pydata/xarray/issues/9038#issuecomment-2123533539 - what is this referring to?

In short, this is not implemented here: https://github.com/scverse/anndata/pull/1247#discussion_r1766811545 . It is implemented for non-lazy categoricals, but not lazy ones.