
Dimension reducing variants of bitwise operations (bitwise_or, bitwise_and, bitwise_xor) #35641

Open Ajk4 opened 4 years ago

Ajk4 commented 4 years ago

🚀 Feature

A function that reduces a tensor along a specified dim with bitwise operations.

Motivation

In my project I need to reduce tensors along some dimensions with bitwise operations.

Pitch

In PyTorch I can reduce a tensor along a dim in multiple ways (e.g. t.min(dim=0), t.sum(dim=0), t.any(dim=0), t.all(dim=0)). Unfortunately, it's not yet possible to reduce a dimension with a bitwise operation such as bitwise_or, bitwise_xor, or bitwise_and.

Possible method headers could look like this:

def bitwise_or(self, dim=None, keepdim=False)
def bitwise_and(self, dim=None, keepdim=False)
def bitwise_xor(self, dim=None, keepdim=False)

Currently BoolTensor has two special methods, any(dim) and all(dim), which implement logical or/and reductions. bitwise_or/bitwise_and could be a generalization of those two to other tensor types (similar to the & operator, which is bitwise for non-Bool tensors and logical for BoolTensor).
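
For concreteness, the existing behavior looks like this:

import torch

b = torch.tensor([[True, False], [True, True]])
b.any(dim=0)   # tensor([True, True])  - logical-or reduction
b.all(dim=0)   # tensor([True, False]) - logical-and reduction

i = torch.tensor([6, 3])
i & torch.tensor([3, 3])  # tensor([2, 3]) - & is bitwise for integer tensors
b[0] & b[1]               # tensor([True, False]) - & is logical for BoolTensor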

Possibly loosely connected to https://github.com/pytorch/pytorch/pull/26824, though that appears to be a torch.distributed reduction op, not a tensor API one.

Alternatives

I implemented these bitwise reducing operations in Python using built-in PyTorch functions with a loop along dimension dim. I imagine that implementing them directly in C++/CUDA could yield a performance boost.
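
For reference, a minimal sketch of such a loop-based workaround (the helper name bitwise_or_reduce is hypothetical; only public torch functions are used):

import torch

def bitwise_or_reduce(t, dim=0, keepdim=False):
    # Fold torch.bitwise_or over the slices of t along `dim`.
    slices = torch.unbind(t, dim=dim)
    out = slices[0]
    for s in slices[1:]:
        out = torch.bitwise_or(out, s)
    return out.unsqueeze(dim) if keepdim else out

t = torch.tensor([[1, 2], [4, 8]])
bitwise_or_reduce(t, dim=0)  # tensor([ 5, 10])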

ailzhang commented 4 years ago

IIRC these ops were implemented by @xuhdev; please feel free to comment on whether you think this makes sense. Thanks!

xuhdev commented 4 years ago

It makes perfect sense to me! In fact, it would be great if there were a way to automatically turn any associative operator into a dimension-reducing operator.
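
As a rough illustration of that idea (a sketch only; make_reduction is a hypothetical name), any associative binary op could be lifted into a dim-reducing function:

import functools
import torch

def make_reduction(op):
    # Lift an associative binary op into a dimension-reducing function.
    def reduce_dim(t, dim=0, keepdim=False):
        out = functools.reduce(op, torch.unbind(t, dim=dim))
        return out.unsqueeze(dim) if keepdim else out
    return reduce_dim

bitwise_xor_reduce = make_reduction(torch.bitwise_xor)
bitwise_xor_reduce(torch.tensor([[1, 2], [4, 8]]), dim=0)  # tensor([ 5, 10])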

Ajk4 commented 4 years ago

> It makes perfect sense to me! In fact, it would be great if there were a way to automatically turn any associative operator into a dimension-reducing operator.

Hi @xuhdev

Do you perhaps have those op implementations available somewhere? I would be very grateful if I could use those in my code!

Also is there any chance of merging it to pytorch in the future?

Cheers!

xuhdev commented 4 years ago

@Ajk4 I'm not aware of any such implementation. I was simply suggesting that this might be a favorable structural change in the future :) This issue looks perfectly reasonable to me.

lezwon commented 2 years ago

It would be nice to have these operators. Is it being prioritized?

function2-llx commented 2 years ago

How about looking at NumPy's implementation? E.g., to reduce with multiplication, one can write something like:

np.multiply.reduce([2,3,5])
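
For the bitwise case specifically, NumPy's bitwise ufuncs already support this pattern directly via ufunc.reduce with an axis argument:

import numpy as np

a = np.array([[1, 2], [4, 8]])
np.bitwise_or.reduce(a, axis=0)   # array([ 5, 10])
np.bitwise_xor.reduce(a, axis=0)  # array([ 5, 10])
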
marwanj commented 1 year ago

Also looking for an efficient way to do this.