Enny1991 / PLSTM


Time in input sequence #11

Open kush789 opened 7 years ago

kush789 commented 7 years ago

Hey!

I am confused about one thing: the time should be in the first position of the input tensor, right? For example, if I have an input tensor of shape (?, seq_len, nb_features), [:, :, 0] would be the times.

Enny1991 commented 7 years ago

Hi @kush789,

Actually no, the times are the last feature. In the code:

```python
i_size = input_size.value - 1  # -1 to extract time
times = array_ops.slice(inputs, [0, i_size], [-1, 1])
filtered_inputs = array_ops.slice(inputs, [0, 0], [-1, i_size])
```

so if your input tensor is (?, seq_len, nb_features), then [:, :, -1] would be your times.
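To make the layout concrete, here is a small NumPy sketch (my own illustration, not code from this repo) of building such an input tensor with the timestamps appended as the last feature, and slicing them back out the way described above:

```python
import numpy as np

# Hypothetical example dimensions, just for illustration.
batch, seq_len, nb_features = 4, 10, 3

# Feature values and one timestamp per step.
features = np.random.randn(batch, seq_len, nb_features).astype(np.float32)
times = np.tile(np.arange(seq_len, dtype=np.float32), (batch, 1))[..., None]

# Times go LAST, so the cell can split them off with a slice.
inputs = np.concatenate([features, times], axis=-1)  # (batch, seq_len, nb_features + 1)

# Recovering them, as in the discussion: [:, :, -1] are the times.
recovered_times = inputs[:, :, -1]       # (batch, seq_len)
recovered_features = inputs[:, :, :-1]   # (batch, seq_len, nb_features)

assert recovered_times.shape == (batch, seq_len)
assert np.array_equal(recovered_features, features)
```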

+Enea

julj commented 7 years ago

I am confused, too: the doc specifies that

inputs: input Tensor, 2D, batch x num_units

What is "num_units" in that case ?

Enny1991 commented 7 years ago

Sorry for the delay. You are right, I should update the docs: 'num_units' should actually be the number of features in the input, which is why the time input is 'hidden' at [:, :, -1].
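As a NumPy sketch of what this means per time step (my own illustration, assuming the cell receives a 2-D slice of shape batch x (num_features + 1), with the timestamp in the last column):

```python
import numpy as np

# Hypothetical dimensions for one time step as the cell sees it.
batch, num_features = 4, 3
step_inputs = np.random.randn(batch, num_features + 1).astype(np.float32)

# Mirrors the slicing in the cell code quoted above:
i_size = step_inputs.shape[1] - 1       # num_features; -1 to extract time
times = step_inputs[:, i_size:]         # like slice(inputs, [0, i_size], [-1, 1])
filtered = step_inputs[:, :i_size]      # like slice(inputs, [0, 0], [-1, i_size])

assert times.shape == (batch, 1)
assert filtered.shape == (batch, num_features)
```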

+Enea

shalberd commented 4 years ago

Is there a Keras implementation of this somewhere, especially with regard to async=True? https://github.com/fferroni/PhasedLSTM-Keras