Closed ElVictorious closed 2 years ago
I could reproduce the error on Colab.
This happens with lstm_layers > 1. Can you try with lstm_layers=1?
I assume the non-contiguity comes from expanding the hidden state tensors for the LSTM layers.
I will test whether we can fix this with contiguous().
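As a minimal illustration of the suspected cause (this is a hypothetical sketch, not the repository's actual code): expanding a hidden-state tensor across LSTM layers with expand() produces a non-contiguous view, and some CUDA ops reject non-contiguous inputs. Calling .contiguous() materializes the view and should resolve it:

```python
import torch

# A hidden state shaped (num_layers=1, batch, hidden_size).
h0 = torch.zeros(1, 4, 8)

# Broadcasting it to lstm_layers=2 via expand() returns a view with a
# zero stride on dim 0 -- no copy is made, so it is not contiguous.
h_expanded = h0.expand(2, 4, 8)
print(h_expanded.is_contiguous())  # False

# contiguous() copies the data into a dense layout, which GPU kernels accept.
h_fixed = h_expanded.contiguous()
print(h_fixed.is_contiguous())  # True
```

This would explain why the error only appears with lstm_layers > 1: with a single layer no expansion is needed.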
Hi,
First, congrats for the amazing job with this repo.
I found that the Temporal Fusion Transformer does not work on GPU, although it works well on CPU. I tried 3 different computers and had the same problem, even on Colab. Same issue every time.
Thanks in advance for your help
model = TFTModel(
    input_chunk_length=INLEN,
    output_chunk_length=N_FC,
    hidden_size=HIDDEN,
    lstm_layers=LSTMLAYERS,
    num_attention_heads=ATTHEADS,
    dropout=DROPOUT,
    batch_size=BATCH,
    n_epochs=EPOCHS,
    likelihood=QuantileRegression(quantiles=QUANTILES),
    loss_fn=MSELoss(),
)

model.fit(ts_ttrain, future_covariates=tcov, verbose=True)
RuntimeError Traceback (most recent call last)