Closed SokolOFFF closed 1 month ago
If you're doing time series of continuous numbers, you should remove the embedding layer. Embedding layers are meant to take discrete tokens (for example a token id like 3345) and convert them to a floating-point tensor with n_features.
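The distinction can be sketched framework-free in NumPy (all names here are illustrative, not part of the xlstm API): an embedding layer is just a lookup table indexed by discrete token ids, whereas continuous features are typically mapped to the model dimension with a linear projection instead.

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, n_features, embedding_dim = 5000, 4, 8

# Embedding layer: a lookup table mapping discrete token ids to vectors.
embedding_table = rng.standard_normal((vocab_size, embedding_dim))
token_ids = np.array([3345, 12, 7])          # discrete inputs
token_vectors = embedding_table[token_ids]   # shape (3, embedding_dim)

# Continuous time series: no lookup; project the raw features
# linearly to the model dimension (what an nn.Linear layer would do).
projection = rng.standard_normal((n_features, embedding_dim))
series = rng.standard_normal((10, n_features))  # 10 timesteps, 4 features each
projected = series @ projection                 # shape (10, embedding_dim)
```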
Hello! Have you solved this problem?
Have you solved this problem?
The embedding_dim should be seen as a synonym for hidden_dim when you are not using embeddings. The xLSTMBlockStack does not contain embedding/un-embedding layers.
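In other words, the stack expects an input tensor of shape (batch, seq_len, embedding_dim) and returns the same shape, so for time series you project your features to embedding_dim yourself. A minimal sketch along the lines of the repo's README (mLSTM blocks only; exact config fields may differ across versions, and the Linear projection is the user's own addition, not part of xlstm):

```python
import torch
from xlstm import (
    xLSTMBlockStack,
    xLSTMBlockStackConfig,
    mLSTMBlockConfig,
    mLSTMLayerConfig,
)

n_features = 4      # features per timestep in your time series
hidden_dim = 128    # embedding_dim plays the role of hidden_dim here

cfg = xLSTMBlockStackConfig(
    mlstm_block=mLSTMBlockConfig(
        mlstm=mLSTMLayerConfig(
            conv1d_kernel_size=4, qkv_proj_blocksize=4, num_heads=4
        )
    ),
    context_length=256,
    num_blocks=4,
    embedding_dim=hidden_dim,
)
stack = xLSTMBlockStack(cfg)

# Your own input projection replaces the embedding layer.
in_proj = torch.nn.Linear(n_features, hidden_dim)

x = torch.randn(8, 256, n_features)   # (batch, seq_len, n_features)
y = stack(in_proj(x))                 # (batch, seq_len, hidden_dim)
```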
Hello! I have read the following point in your README.md: "For non language applications or for integrating in other architectures you can use the xLSTMBlockStack". I now have an issue: I want to use your xLSTM implementation for time series prediction, but I ran into problems with "embeddings" while using xLSTMBlockStack. On what basis should I define the embedding_dim variable in the config? Could you please elaborate on this topic? Thank you in advance.