willow-ahrens / Finch.jl

Sparse tensors in Julia and more! Data-structure-driven array programming language.
http://willowahrens.io/Finch.jl/

`reshape` function #558

Open · mtsokol opened 4 months ago

mtsokol commented 4 months ago

Hi @willow-ahrens,

I wanted to discuss the feasibility of an eager-only (and simplest possible) `reshape` function in Finch.

It looks like most of the Array API test suites require `reshape` to run (the `test_signatures.py` suite, which I ran to count the number of supported functions, is an exception).

To run them locally I used a crude, test-only implementation:

def reshape(x: Tensor, shape: tuple[int, ...]) -> Tensor:
    # Densify, reshape via NumPy, then wrap back into a Finch tensor.
    arr = x.todense()
    arr = arr.reshape(shape)
    return Tensor(arr)

Do you think it would be feasible to have an eager-only `reshape(::SwizzleArray, ...)`? Even if it were as simple as copying to a dense format ([EDIT] or COO?), reshaping, and then copying back to the original format?
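For reference, the coordinate arithmetic a COO-style reshape needs is just a round-trip through the column-major linear index. A minimal sketch in plain Julia (`coo_reshape` is a hypothetical helper for illustration, not Finch's API, and Finch's actual level formats store coordinates differently):

function coo_reshape(coords, vals, old_dims, new_dims)
    # Map each stored coordinate through the column-major linear index:
    # old multi-index -> linear position -> new multi-index.
    lin = LinearIndices(old_dims)
    cart = CartesianIndices(new_dims)
    new_coords = [Tuple(cart[lin[c...]]) for c in coords]
    return new_coords, vals
end

# e.g. coo_reshape([(1, 2), (3, 1)], [1.0, 2.0], (3, 2), (2, 3))
# yields the remapped coordinates [(2, 2), (1, 2)] with the same values.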

hameerabbasi commented 4 months ago

copy to a dense format

This is exactly what I'd like to avoid -- COO should be doable though.

willow-ahrens commented 4 months ago

I believe we can do this. The main challenge, which I have a hard time designing for, is the selection of an appropriate output format. If we know the appropriate output format, I don't think it would be so hard to do a direct copy. We would need to use one of the randomly accessible sparse formats if the swizzles don't match up. I think we could implement something in the style of our `getindex` kernel, where we use a generated function to write some Finch code and call it.
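As a toy illustration of that generated-function pattern (plain Julia arrays standing in for Finch level formats; `reshape_copy!` is a hypothetical name): since a reshape preserves column-major linear order, the kernel reduces to a linear-index copy, and the `@generated` body is where format-specific Finch code would actually be emitted:

@generated function reshape_copy!(out, x)
    # A real Finch kernel would inspect typeof(out) and typeof(x) here
    # and emit a Finch program specialized to both formats; this toy
    # version emits a generic linear-index copy, which is exactly
    # reshape semantics for column-major arrays.
    quote
        @inbounds for k in eachindex(x)
            out[k] = x[k]
        end
        out
    end
end

# e.g. reshape_copy!(zeros(2, 3), collect(1.0:6.0))
# fills a 2x3 matrix with 1..6 in column-major order.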

willow-ahrens commented 4 months ago

@mtsokol Is this a blocker for you?

hameerabbasi commented 4 months ago

@willow-ahrens It's needed for doing https://github.com/willow-ahrens/finch-tensor/pull/48 cleanly.

willow-ahrens commented 4 months ago

Hmm, okay! I'll do it today then.