Closed SimeonEhrig closed 2 years ago
I wonder if Vikunja memory access could profit from LLAMA...
For this PR, it is not relevant, because the current implementation of the memory access strategies requires 1D memory without padding. But in general, yes. I also want to support multidimensional memory (#61). I think std::mdspan is a good fit for the memory access, as @bernhardmgruber already suggested for llama and alpaka.
After a discussion with @bernhardmgruber, we found out that std::mdspan does not support padding when the padding is not a multiple of the size of a single element. Therefore I will use alpaka::accessors, which support this.
As a side note, std::mdspan also supports other nice features, such as slicing. But I don't need this feature for vikunja.
The PR is ready for merging. At the moment, some tests and documentation are missing; they will follow in another PR. This PR needs to be merged because it blocks PRs #62, #68, #71, and #72.