jacobkimmel / pytorch_convgru

Convolutional Gated Recurrent Units implemented in PyTorch
MIT License

Sequence length? #4

Closed: wuchlei closed this issue 5 years ago

wuchlei commented 5 years ago

The input is actually a sequence of items, so it seems that your implementation isn't handling that correctly?

TethysSun commented 5 years ago

It's not only the sequence length. Even accounting for the sequence length, this code makes cells at the same depth layer not share weight parameters...
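
As a side note on weight sharing: in PyTorch, parameters are shared across timesteps exactly when the same module instance is called at each step, so unrolling over time adds no parameters. A small repo-independent illustration, with nn.Conv2d standing in for a recurrent cell:

import torch
import torch.nn as nn

# Calling one module instance at every timestep reuses its weights,
# so unrolling over time adds no new parameters.
cell = nn.Conv2d(8, 8, kernel_size=3, padding=1)  # stand-in for a ConvGRU cell
x = torch.randn(1, 8, 16, 16)

h = x
for t in range(3):   # unroll three "timesteps"
    h = cell(h)      # same instance -> same weight tensor each step

h.sum().backward()
# The one shared weight tensor accumulated gradients from all three steps.
print(cell.weight.grad.shape)  # torch.Size([8, 8, 3, 3])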

Melika-Ayoughi commented 5 years ago

I have the same question: is this implementation a stacked ConvGRU? So n_layers is actually the number of cells stacked on top of each other? Then how are you taking the sequence length into account? I only see one for loop over n_layers, so I'm curious to know. Cheers

jacobkimmel commented 5 years ago

> Is this implementation a stacked ConvGRU? So n_layers is actually the number of cells stacked on top of each other?

Yes.
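
In other words, n_layers cells are stacked, and at each timestep the input runs up through them, each layer's new hidden state feeding the layer above. A rough sketch of that per-timestep loop (illustrative, not a verbatim excerpt of convgru.py):

# One timestep through a stack of ConvGRU cells: each layer's new
# hidden state becomes the input to the layer above it.
def step(cells, x_t, hidden):
    input_ = x_t
    new_hidden = []
    for cell, h_prev in zip(cells, hidden):
        h = cell(input_, h_prev)  # one recurrent update for this layer
        new_hidden.append(h)
        input_ = h                # feed upward to the next layer
    return new_hidden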

> Then how are you taking the sequence length into account? I only see one for loop over n_layers, so I'm curious to know.

I pass in elements of the sequence in the training loop. Something like

import torch
from convgru import ConvGRU

# Channel depths per layer; height and width are preserved by
# padding inside the cells.
model = ConvGRU(input_size=8, hidden_sizes=[32, 64, 16],
                kernel_sizes=[3, 5, 3], n_layers=3)

# Batch, Time, Channels, Height, Width
x = torch.randn(1, 3, 8, 64, 64)

outputs = []
hidden = None  # list of per-layer hidden states, carried across timesteps
for t in range(x.size(1)):
    hidden = model(x[:, t, :, :, :], hidden)
    outputs.append(hidden[-1])  # hidden state of the top layer
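
If you want the time loop packaged with the model, one option is a small wrapper module that unrolls a ConvGRU over a (batch, time, channels, height, width) tensor. This is a minimal sketch, not part of this repo; ConvGRUSequence is a hypothetical name, and it assumes the model(x_t, hidden) call pattern shown above:

import torch
import torch.nn as nn
from convgru import ConvGRU

class ConvGRUSequence(nn.Module):
    # Hypothetical convenience wrapper (not part of this repo):
    # unrolls a ConvGRU over the time axis of a 5D input.
    def __init__(self, *args, **kwargs):
        super().__init__()
        self.convgru = ConvGRU(*args, **kwargs)

    def forward(self, x, hidden=None):
        # x: (batch, time, channels, height, width)
        top = []
        for t in range(x.size(1)):
            hidden = self.convgru(x[:, t], hidden)  # list of per-layer states
            top.append(hidden[-1])                  # keep the top layer's state
        return torch.stack(top, dim=1), hidden

seq_model = ConvGRUSequence(input_size=8, hidden_sizes=[32, 64, 16],
                            kernel_sizes=[3, 5, 3], n_layers=3)
y, h = seq_model(torch.randn(1, 3, 8, 64, 64))  # y: (1, 3, 16, 64, 64)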