arteymix opened 7 years ago
Some QAing then merge?
We can handle broadcasting later if you prefer, since it will require some design decisions. I guess we can only work out shape compatibility for now.
I wonder if the MultiIterator should simply be a shape-compatible iterator, so that we would only need to do something like:
var iterator = new MultiIterator ({a, b.broadcast_to (a.shape)});
Then we would only need to focus on providing extra information for iterating more efficiently (e.g. the current stride and the number of scalars remaining in that setup).
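To make the idea concrete, here is a minimal sketch in Python (the project is Vala, but the stride trick is language-agnostic) of how broadcast_to can produce a shape-compatible view by setting the stride of each broadcast dimension to 0, the same way NumPy does internally. The Array class and function names here are illustrative, not this project's API.

```python
class Array:
    """Illustrative strided array: flat data plus shape and strides."""
    def __init__(self, data, shape, strides):
        self.data, self.shape, self.strides = data, shape, strides

def broadcast_to(arr, target_shape):
    # Prepend size-1 dimensions, then zero the stride wherever the
    # source dimension is 1 and the target dimension is larger.
    pad = len(target_shape) - len(arr.shape)
    shape = (1,) * pad + tuple(arr.shape)
    strides = (0,) * pad + tuple(arr.strides)
    new_strides = []
    for dim, tdim, stride in zip(shape, target_shape, strides):
        if dim == tdim:
            new_strides.append(stride)
        elif dim == 1:
            new_strides.append(0)  # repeat this element along the axis
        else:
            raise ValueError("shapes are not broadcast-compatible")
    return Array(arr.data, tuple(target_shape), tuple(new_strides))

def iterate(arr):
    # Walk the array in row-major order using its shape and strides;
    # a stride of 0 revisits the same element, which is the whole trick.
    def walk(offset, dims, strides):
        if not dims:
            yield arr.data[offset]
            return
        for i in range(dims[0]):
            yield from walk(offset + i * strides[0], dims[1:], strides[1:])
    yield from walk(0, arr.shape, arr.strides)

row = Array([10, 20, 30], (3,), (1,))  # shape (3,)
b = broadcast_to(row, (2, 3))          # shape (2, 3), stride 0 on axis 0
print(list(iterate(b)))                # [10, 20, 30, 10, 20, 30]
```

With this in place, a shape-compatible MultiIterator only has to advance several such iterators in lockstep; no per-element shape logic is needed.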
I moved them out of array because they really only need two pieces of information describing the type of the array's elements, not the full array information (shape, strides, etc.).
I don't see us ever using move and reset. We can always create a new iterator instead of resetting, and the only move operation we will ever need is next(). If we want a particular element by index, we can use get_pointer on the array. A move method may become necessary once we walk buffers, but I'd say we look at it then: when the use case is actually there, it will be clear how it can be done or avoided.
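A quick Python sketch of the "next()-only" contract described above: no reset(), no move(), and resetting is done by simply constructing a fresh iterator. The FlatIter name and boolean-returning next() (in the GLib style) are illustrative assumptions, not this project's API.

```python
class FlatIter:
    """Illustrative flat iterator exposing only next() and get()."""
    def __init__(self, data):
        self._data = data
        self._pos = -1

    def next(self):
        # Advance; return False once the data is exhausted.
        self._pos += 1
        return self._pos < len(self._data)

    def get(self):
        return self._data[self._pos]

it = FlatIter([1, 2, 3])
total = 0
while it.next():
    total += it.get()
print(total)  # 6

# "reset" is just a fresh iterator over the same data:
it = FlatIter([1, 2, 3])
```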
flatten doesn't always return a view. But I agree the renaming is not needed, since by definition iterators are flat.
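To illustrate why flatten can't always be a view: a flat view is only possible when the data is already contiguous in row-major order; for e.g. a transposed array, the elements are not adjacent in memory and a copy is required. A minimal contiguity check, sketched in Python with assumed element-count strides (names are illustrative):

```python
def is_contiguous(shape, strides):
    # C-contiguous iff each stride equals the product of all
    # faster-varying dimensions (size-1 axes are ignored).
    expected = 1
    for dim, stride in zip(reversed(shape), reversed(strides)):
        if dim != 1 and stride != expected:
            return False
        expected *= dim
    return True

print(is_contiguous((2, 3), (3, 1)))  # True: flatten can return a view
print(is_contiguous((3, 2), (1, 3)))  # False (transposed): must copy
```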
Yes. We can have a _new function that does the broadcasting, so that the GObject constructor of MultiIterator itself can require shape-matched inputs.
Once we add buffering to Iterator, we will add another _new function that calculates the buffer size and transposes the input arrays properly.
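The _new-style factory described above would first have to compute the common broadcast shape of its inputs before handing shape-matched arrays to the strict constructor. A sketch of that shape computation in Python (the function name and the NumPy-style right-aligned broadcasting rule are assumptions, not this project's API):

```python
def common_shape(*shapes):
    """Compute the common broadcast shape of several input shapes.

    Shapes are right-aligned; along each axis the sizes must agree,
    except that a size of 1 (or a missing axis) broadcasts.
    """
    ndim = max(len(s) for s in shapes)
    result = []
    for axis in range(ndim):
        sizes = {s[len(s) - ndim + axis] for s in shapes
                 if len(s) - ndim + axis >= 0}
        sizes.discard(1)  # size-1 axes always broadcast
        if len(sizes) > 1:
            raise ValueError("incompatible shapes for broadcasting")
        result.append(sizes.pop() if sizes else 1)
    return tuple(result)

print(common_shape((3,), (2, 1), (2, 3)))  # (2, 3)
```

The _new function would then broadcast every input to this shape and invoke the GObject constructor, which only accepts shape-matched arrays.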
It seems that we need to finish improving Iterator first before attacking this one. I really like the latest work :+1:
As said over the line, we will use this for broadcasting n-ary operators (e.g. element-wise operations over n inputs) and for implementing tensor products. For contracting dimensions, we will need a more specialized iterator or a local implementation.
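For instance, an outer (tensor) product falls out of broadcasting directly: viewing a as shape (3, 1) and b as shape (1, 2), a broadcast element-wise multiply yields the 3×2 product. The explicit loop below computes what that broadcast multiply would produce (plain Python, purely illustrative):

```python
a = [1, 2, 3]
b = [10, 20]
# Equivalent to multiplying a viewed as (3, 1) by b viewed as (1, 2)
# under broadcasting: every pairing of an element of a with one of b.
outer = [[x * y for y in b] for x in a]
print(outer)  # [[10, 20], [20, 40], [30, 60]]
```

A contraction (e.g. a dot product) additionally sums over a shared axis, which is why plain lockstep iteration is not enough there.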