Closed: nejox closed this issue 5 months ago
I think this may not be a bug: not casting the static covariates would lead to issues when using the PyTorch models.
I faced the same issue. Mapping the IDs to a smaller range works for me:
```python
import numpy as np

def map_large_ids(ids):
    # Map arbitrary (large) IDs onto a compact 0..n-1 range
    unique_ids = np.unique(ids)
    id_dict = {uid: i for i, uid in enumerate(unique_ids)}
    return id_dict
```
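A quick usage sketch (the ID values here are illustrative): the remapped IDs are small enough to survive a float32 cast unchanged.

```python
ids = np.array([100100037, 100100038, 100100037])
id_dict = map_large_ids(ids)
mapped = np.array([id_dict[i] for i in ids])  # -> array([0, 1, 0])
```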
Describe the bug
I'm working with historical sales data of multiple stores and products. I convert a DataFrame indexed by id_store, id_product, and target_date into multiple TimeSeries objects, marking the IDs as static covariates. The IDs are originally integers, but they automatically get cast to float64. Due to performance problems of the TemporalFusionTransformer with my number of time series, I attempted mixed-precision training via TimeSeries.astype("float32"). This also casts the static covariates to float32, which silently alters the ID values: e.g., id_product 100100037 becomes 100100040.
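For reference, this is ordinary float32 rounding rather than data corruption: float32 has a 24-bit significand, so it represents every integer exactly only up to 2**24, and around 1e8 the spacing between representable values is 8, which is exactly why 100100037 lands on 100100040. A quick check, independent of darts:

```python
import numpy as np

print(np.float32(100100037))  # 100100040.0 -> nearest representable float32
print(np.float64(100100037))  # 100100037.0 -> float64 is exact up to 2**53
```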
Questions:
To Reproduce
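A minimal sketch of the reproduction (assuming darts' TimeSeries.from_values and with_static_covariates APIs; not necessarily the reporter's exact code):

```python
import numpy as np
import pandas as pd
from darts import TimeSeries

# Attach an integer product ID as a static covariate
series = TimeSeries.from_values(np.arange(10, dtype=np.float64))
series = series.with_static_covariates(pd.Series({"id_product": 100100037}))

print(series.static_covariates)                      # stored as float64
print(series.astype(np.float32).static_covariates)  # ID rounded by the cast
```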
output:
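With the sketch above, the first print shows id_product as 100100037.0 (already float64), while after astype(np.float32) it shows 100100040.0; the exact layout of the printed static_covariates frame may differ by darts version.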
Expected behavior
Static covariates keep their original dtype (e.g., integer) instead of being cast to float.
System:
Python version: 3.9
darts version: 0.27.2
lightning version: 2.1.3
torch version: 2.1.0
OS: macOS 14.2.1 (23C71)