willow-ahrens / Finch.jl

Sparse tensors in Julia and more! Data-structure-driven array programming language.
http://willowahrens.io/Finch.jl/
MIT License

Support `tensordot` and `reduce` for `SwizzleArray` #475

Closed mtsokol closed 5 months ago

mtsokol commented 5 months ago

Issue #474

Hi @willow-ahrens,

This PR contains support for tensordot and reduce operations for SwizzleArrays.

Should FiberOrBroadcast be renamed to also indicate that it considers SwizzleArrays?

willow-ahrens commented 5 months ago

I suspect the test failure is due to Finch deciding to drop dims and Julia deciding not to drop dims. I like dropping dims better, idk, but we may want to fix stuff in the future to be Julia-compliant. For now, your fix is good.
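For reference, the semantic difference in question can be illustrated with NumPy (used here as an analogy, not Finch's API): NumPy drops a reduced axis by default, while Julia's `sum(A; dims=1)` keeps it as a singleton dimension, the way `keepdims=True` does.

```python
import numpy as np

a = np.arange(6).reshape(2, 3)

# NumPy drops the reduced axis by default: shape (3,)
dropped = a.sum(axis=0)

# keepdims=True retains the axis as a singleton, analogous to
# Julia's sum(A; dims=1), which yields a 1x3 result: shape (1, 3)
kept = a.sum(axis=0, keepdims=True)

print(dropped.shape)  # (3,)
print(kept.shape)     # (1, 3)
```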

mtsokol commented 5 months ago

Exactly - CI should be green now. Could you make a new release once we merge it? Then in finch-tensor I should have elem-wise and reduction ops running for eager and lazy mode.

Note to myself, and what you mentioned last week - I need to revisit handling scalars. Should there be a Tensor(scalar) or should we keep it as just Julia/numpy scalars?
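For context on the scalar question: on the NumPy side the two candidate representations already differ, a NumPy scalar versus a 0-d array. The sketch below (plain NumPy, not finch-tensor; `Tensor(scalar)` above is the hypothetical wrapper under discussion) shows that both are 0-dimensional but only the 0-d array is an array container.

```python
import numpy as np

s = np.float64(3.0)   # a NumPy scalar: 0-dimensional, immutable
z = np.array(3.0)     # a 0-d array: an array container wrapping one value

print(s.ndim, z.ndim)  # 0 0
print(type(s).__name__, type(z).__name__)  # float64 ndarray

# Arithmetic works on both; a 0-d array stays an array
print((z + 1).ndim)  # 0
```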

codecov[bot] commented 5 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 76.27%. Comparing base (20d4276) to head (fba82b4).

Additional details and impacted files

```diff
@@            Coverage Diff             @@
##             main     #475      +/-   ##
==========================================
+ Coverage   76.23%   76.27%   +0.04%
==========================================
  Files          92       92
  Lines        8839     8842       +3
==========================================
+ Hits         6738     6744       +6
+ Misses       2101     2098       -3
```


willow-ahrens commented 5 months ago

Could you bump the version?