tsai — State-of-the-art Deep Learning library for Time Series and Sequences in PyTorch / fastai
Originally posted by **ChrystleMyrnaLobo** November 14, 2023
Hi! I'm using the [MiniRocket PyTorch implementation](tutorial_nbs/10_Time_Series_Classification_and_Regression_with_MiniRocket.ipynb) for multivariate time series classification. Thanks for the GPU-compatible implementation! :D
I want to use it with variable-length input sequences. I _do not_ want to pad the time series to a fixed length; instead, I'd like to treat each series as variable-length and leverage the fully convolutional nature of MiniRocket. The input shape for the MiniRocket feature extractor is `n_samples x n_variables x sequence_length` and the output is `n_samples x 9996 x 1`, which is independent of the input sequence_length.
While the forward pass of MiniRocket raises no errors, the output contains nan values for the last few features/filter outputs when the input sequence_length is less than the `seq_len` passed to the MiniRocketFeatures constructor. Could you please help me understand why some features are nan? And what should `seq_len` be in the MiniRocketFeatures constructor if X_train has variable sequence lengths?
Minimal code to reproduce the issue:
```python
import numpy as np
from tsai.all import MiniRocketFeatures, get_minirocket_features, default_device

# Create the feature extractor, fitted for seq_len=1000
X_train = np.random.rand(5, 4, 1000)
model_mrf = MiniRocketFeatures(c_in=4, seq_len=1000).to(default_device())
model_mrf.fit(X_train.astype('float32'))

# Forward pass on a shorter sequence (800 < 1000)
X_test = np.random.rand(1, 4, 800)
print(X_test.shape)
print(np.linalg.norm(X_test))
X_test_feat = get_minirocket_features(X_test.astype('float32'), model_mrf, to_np=True)
print(X_test_feat.shape)
print(np.linalg.norm(X_test_feat))
print("Count of nan features", np.count_nonzero(np.isnan(X_test_feat)))
```
Output:
```
(1, 4, 800)
32.77988802149151
(1, 9996, 1)
nan
Count of nan features 252
```
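A plausible mechanism for the nans (an assumption about the internals, not confirmed from the tsai source): MiniRocket derives its set of kernel dilations from the `seq_len` given to the constructor, with the largest dilation sized so that a 9-tap kernel still fits a series of that length. When a shorter series is passed in, the kernels with the largest dilations have a receptive field longer than the input, leaving zero valid convolution positions; pooling a statistic such as PPV over an empty set of positions then reduces an empty array, which yields nan, just as NumPy's mean of an empty array does. A minimal sketch of that arithmetic (`n_valid_positions` is a hypothetical helper, not a tsai function):

```python
import numpy as np

def n_valid_positions(seq_len: int, dilation: int, kernel_size: int = 9) -> int:
    """Number of positions where a dilated kernel fits entirely inside the input.

    A kernel with `kernel_size` taps and dilation d spans 1 + (kernel_size-1)*d
    time steps, so it has seq_len - span + 1 valid positions (0 if it doesn't fit).
    """
    span = 1 + (kernel_size - 1) * dilation
    return max(seq_len - span + 1, 0)

# Dilation 124 is roughly the largest that fits a length-1000 series
# ((1000 - 1) // 8 = 124): it has valid positions at length 1000,
# but none at length 800.
print(n_valid_positions(1000, 124))  # -> 8
print(n_valid_positions(800, 124))   # -> 0

# Reducing a pooled statistic over zero valid positions means reducing
# an empty array, which produces nan (with a RuntimeWarning):
print(np.mean(np.empty(0)))  # nan
```

If this is what's happening, the nan count (252 of 9996 features) would correspond exactly to the features built from the dilations that no longer fit the shorter input, which matches the observation that only the *last* few features are nan.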
Related discussion: [this](https://github.com/timeseriesAI/tsai/discussions/149)
Discussed in https://github.com/timeseriesAI/tsai/discussions/857