willow-ahrens / Finch.jl

Sparse tensors in Julia and more! Data structure-driven array programming language.
http://willowahrens.io/Finch.jl/
MIT License
158 stars · 15 forks

`DOK` constructor issues #451

Closed: mtsokol closed this 6 months ago

mtsokol commented 6 months ago

Hi @willow-ahrens,

I started working on adding a DOK constructor to finch-tensor and came across some issues. It isn't a priority; I just wanted to keep track of them.

Adding a new non-zero element corrupts the stored indices (existing values shift, and the new value lands at the wrong index):

julia> tns = Tensor(SparseHash{2}(Element(0.0)), [10 0 20; 30 0 0; 0 0 40])
SparseHash{2} (0.0) [:,1:3]
├─ [1, 1]: 10.0
├─ [2, 1]: 30.0
├─ [1, 3]: 20.0
└─ [3, 3]: 40.0

julia> tns[1, 2] = 100
100

julia> tns
SparseHash{2} (0.0) [:,1:3]
├─ [1, 1]: 10.0
├─ [2, 1]: 30.0
├─ [1, 2]: 20.0
├─ [1, 3]: 40.0
└─ [3, 3]: 100.0
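For reference, the expected DOK (dictionary-of-keys) semantics are that assignment stores a value under its own index without disturbing any existing entry, and reading an absent index returns the fill value. A minimal Python sketch of those semantics (illustrative only, not Finch code; the `DOKMatrix` name is hypothetical):

```python
# Minimal dictionary-of-keys (DOK) sparse matrix sketch illustrating the
# expected behavior: setting tns[1, 2] = 100 should add a new entry at
# (1, 2) and leave every existing entry untouched.
class DOKMatrix:
    def __init__(self, shape, fill=0.0):
        self.shape = shape
        self.fill = fill
        self.data = {}  # maps (i, j) -> stored value

    def __getitem__(self, idx):
        # Absent keys read back as the fill value (no segfault).
        return self.data.get(idx, self.fill)

    def __setitem__(self, idx, value):
        if value == self.fill:
            # Storing the fill value removes the explicit entry.
            self.data.pop(idx, None)
        else:
            self.data[idx] = value

tns = DOKMatrix((3, 3))
for idx, v in {(1, 1): 10.0, (2, 1): 30.0, (1, 3): 20.0, (3, 3): 40.0}.items():
    tns[idx] = v
tns[(1, 2)] = 100.0
print(sorted(tns.data.items()))
# Existing entries keep their indices; (1, 2) is simply added.
```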

Segmentation fault when reading a DOK element that holds the fill value:

julia> tns = Tensor(SparseHash{2}(Element(0.0), (5, 5)))
SparseHash{2} (0.0) [:,1:5]

julia> tns[1,1]

[70362] signal (11.1): Segmentation fault: 11
in expression starting at REPL[4]:1
getindex at ./essentials.jl:13 [inlined]
macro expansion at /Users/mateusz/JuliaProjects/Finch.jl/src/tensors/levels/sparsehashlevels.jl:337 [inlined]
willow-ahrens commented 6 months ago

what if we didn't support DOK? Could we get away with using a tensor format that has similar properties?

willow-ahrens commented 6 months ago

i.e. what if we used the SparseDict level instead, which is one-dimensional, and stacked it to the desired number of dimensions?
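The stacking idea can be pictured as nesting one-dimensional dictionaries, one per dimension, so an N-D lookup walks N dicts and falls back to the fill value whenever a subtree is missing. A rough Python sketch of that layout (an assumption about the general shape; SparseDict's actual representation in Finch differs):

```python
# Sketch of stacking 1-D dictionary levels to reach N dimensions: each
# level maps one index to the next level, and the innermost level maps
# an index to the stored element.
def make_tensor(ndim, fill=0.0):
    return {"ndim": ndim, "fill": fill, "children": {}}

def setindex(tensor, idx, value):
    node = tensor
    for i in idx[:-1]:
        # Create intermediate 1-D levels on demand.
        node = node["children"].setdefault(i, {"children": {}})
    node["children"][idx[-1]] = value

def getindex(tensor, idx):
    node = tensor
    for i in idx[:-1]:
        node = node["children"].get(i)
        if node is None:
            return tensor["fill"]  # missing subtree reads as the fill value
    return node["children"].get(idx[-1], tensor["fill"])

t = make_tensor(2)
setindex(t, (1, 2), 100.0)
print(getindex(t, (1, 2)))  # 100.0
print(getindex(t, (4, 4)))  # 0.0 (fill value, no error)
```

Reads of untouched coordinates terminate at the fill value rather than chasing an uninitialized slot, which is exactly the case that segfaults in the SparseHash repro above.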

willow-ahrens commented 6 months ago

I'm very very tempted to remove the SparseHashLevel because it's a maintenance burden.

mtsokol commented 6 months ago

Sure! I think we're not tied to DOK in any way.