jihunchoi / recurrent-batch-normalization-pytorch

PyTorch implementation of recurrent batch normalization

Boolean to Float Tensor #4

Closed herleeyandi closed 6 years ago

herleeyandi commented 6 years ago

What do you mean in line 262, `mask = (time < length).float().unsqueeze(1).expand_as(h_next)`? I just got the error `AttributeError: 'bool' object has no attribute 'float'`, since `(time < length)` returns `True`.

jihunchoi commented 6 years ago

Hi, `length` should be a variable containing the length of each sequence in a batch. Note that line 262 is equivalent to the following statement: `torch.lt(time, length).float().unsqueeze(1).expand_as(h_next)`. The last `expand_as` seems to be unnecessary, though, since recent versions of PyTorch support broadcasting.
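To illustrate the point about broadcasting, here is a minimal sketch (with made-up batch size, hidden size, and values) of how the mask line works in a recent PyTorch version, where the explicit `expand_as` can be dropped:

```python
import torch

# Hypothetical example: batch of 3 sequences, hidden size 4.
lengths = torch.tensor([2, 3, 4])   # length of each sequence in the batch
h_next = torch.ones(3, 4)           # next hidden state (placeholder values)

time = 3                            # current timestep
# (time < lengths) is a bool tensor of shape (3,); unsqueeze makes it (3, 1),
# which broadcasts against h_next's (3, 4) without an explicit expand_as.
mask = (time < lengths).float().unsqueeze(1)
h_masked = mask * h_next            # zeros out sequences that have ended
```

Here sequences 0 and 1 (lengths 2 and 3) have already ended at `time = 3`, so their rows of `h_masked` are zeroed, while sequence 2 is kept.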

herleeyandi commented 6 years ago

[screenshot of debug output] Is `time` a scalar variable? Here is my debug result; it still gives me that error. I am using PyTorch v0.2. Do you know how to do it in v0.2?

herleeyandi commented 6 years ago

Here is the result when I use `torch.lt(time, length).float().unsqueeze(1).expand_as(h_next)`: [screenshot of error output]

jihunchoi commented 6 years ago

Why is `length` a tuple of multiple variables? It should be a 1-D integer variable containing the lengths. In PyTorch v0.2, I get the following result:

In [19]: length = Variable(torch.LongTensor([2, 3, 4]))

In [20]: time = 3

In [21]: mask = (time < length).float().unsqueeze(1)

In [22]: mask
Out[22]: 
Variable containing:
 0
 0
 1
[torch.FloatTensor of size 3x1]

And the `torch.lt` statement does seem to emit an error; that was my oversight.

herleeyandi commented 6 years ago

Oh, my bad. I forgot to comment out one line in my code. Thank you so much. One star for this repository.

gabrer commented 6 years ago

I have got a similar problem. May I suggest changing the name of the variable `length`? By itself, it is not very clear to me that it is an array with the length of each sequence. Or we might just add a comment. Anyway, I enjoyed your implementation! :)
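Following up on the naming suggestion, a hedged sketch of what a rename plus comment might look like (`seq_lengths` is a hypothetical name, not the one used in the repository):

```python
import torch

# Hypothetical rename: "seq_lengths" signals that this is a 1-D tensor of
# per-sequence lengths for the batch, not a single scalar length.
seq_lengths = torch.tensor([2, 3, 4])  # length of each sequence in the batch

time = 2
# mask has shape (batch, 1): 1.0 where the sequence is still active at `time`.
mask = (time < seq_lengths).float().unsqueeze(1)
```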