ShreyasKhandekar opened this issue 1 month ago
`maxPool` currently only takes a single integral parameter, which creates a square kernel.

It should be simple enough to also accept a two-tuple, supporting arbitrary (rectangular) kernel sizes.

See https://pytorch.org/docs/stable/generated/torch.nn.MaxPool1d.html#maxpool1d: `kernel_size` can be an `int` or a `Tuple[int]`.
Ex:

```
x = x.maxPool(3); // Square 2D pooling
```

instead of

```
x = x.maxPool((3,2));
```
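For concreteness, here is a minimal standalone sketch (assuming Chapel, and working over a plain 2D array rather than whatever tensor type `maxPool` lives on) of what a two-tuple kernel could compute; it also carries the extra parameters mentioned below. The `maxPool2d` name, signature, and defaults are illustrative only, not the actual API:

```chapel
// Standalone sketch of rectangular max pooling over a plain 2D Chapel array,
// following PyTorch-style semantics. Assumes a 0-based input domain; the
// name, signature, and defaults here are illustrative, not the library's API.
proc maxPool2d(arr: [?D] real, kernel: 2*int,
               stride: 2*int = (1, 1),
               padding: 2*int = (0, 0),
               dilation: 2*int = (1, 1)) {
  const (h, w) = D.shape;
  const (kh, kw) = kernel;
  const (sh, sw) = stride;
  const (ph, pw) = padding;
  const (dh, dw) = dilation;

  // Standard pooling output-size formula (integer division).
  const outH = (h + 2*ph - dh*(kh - 1) - 1) / sh + 1;
  const outW = (w + 2*pw - dw*(kw - 1) - 1) / sw + 1;

  var result: [0..#outH, 0..#outW] real;

  forall (i, j) in result.domain {
    var best = min(real);
    for ki in 0..#kh {
      for kj in 0..#kw {
        const r = i*sh - ph + ki*dh;
        const c = j*sw - pw + kj*dw;
        // Padded positions fall outside the input and are simply skipped.
        if D.contains((r, c)) then
          best = max(best, arr[r, c]);
      }
    }
    result[i, j] = best;
  }
  return result;
}

// A square kernel is then just the two-tuple case with equal sides.
proc maxPool2d(arr: [] real, poolSize: int) {
  return maxPool2d(arr, (poolSize, poolSize));
}
```

Note that PyTorch defaults `stride` to `kernel_size`; the sketch defaults it to `(1, 1)` only to keep the signature simple.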
We should also add the other parameters that PyTorch supports, like `stride`, `padding`, `dilation`, etc.
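Hypothetical call shapes using the `maxPool2d` sketch above (again, illustrative only), showing `stride`, `padding`, and `dilation` being passed alongside a rectangular kernel:

```chapel
var img: [0..#5, 0..#7] real;
img = 1.0;                          // whole-array scalar assignment
const square = maxPool2d(img, 3);   // square 3 x 3 windows, as today
const rect = maxPool2d(img, kernel=(3, 2), stride=(2, 2),
                       padding=(1, 0), dilation=(1, 1));
writeln(rect.domain);               // {0..2, 0..2} for these sizes
```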
I didn't add rectangular pooling yet, but I added more features to `maxPool` that helped me, like padding and dilation, which unblocked my progress for now.