WenjieDu / Awesome_Imputation

Awesome Deep Learning for Time-Series Imputation: a must-read paper list on applying neural networks to impute incomplete time series containing NaN missing values.
BSD 3-Clause "New" or "Revised" License

An error occurred during hyperparameter optimization #23

Closed RainioTop closed 2 weeks ago

RainioTop commented 3 weeks ago

When performing hyperparameter optimization using nnictl on the Windows platform, a TypeError occurred: data should be an instance of list/np.ndarray/torch.Tensor, but got <class 'bytes'>.

SAITS_searching_config.yml:

# nnictl create -c SAITS_searching_config.yml --port 8081 --debug --foreground
experimentName: SAITS hyper-param searching
authorName: WenjieDu
trialConcurrency: 1
trainingServicePlatform: local
searchSpacePath: SAITS_PhysioNet2012_tuning_space.json
#searchSpacePath: SAITS_BeijingAir_tuning_space.json
#searchSpacePath: SAITS_ETTh1_tuning_space.json
#searchSpacePath: SAITS_Pedestrian_tuning_space.json
multiThread: true
useAnnotation: false
tuner:
    builtinTunerName: Random

trial:
    command: set enable_tuning=1 & pypots-cli tuning --model pypots.imputation.SAITS --train_set ../../data/generated_datasets/physionet_2012_rate01_point/train.h5 --val_set ../../data/generated_datasets/physionet_2012_rate01_point/val.h5

    codeDir: .
    gpuNum: 1

localConfig:
    useActiveGpu: true
    maxTrialNumPerGpu: 1
    gpuIndices: 0

[screenshot: PixPin_2024-11-06_22-08-52]

stderr file:

cmd.exe : 2024-11-06 22:00:43 [INFO]: Have set the random seed as 2022 for numpy and pytorch.
At C:\Users\rainio\nni-experiments\dn4hsyft\trials\pq24Z\run.ps1:12 char:1
+ cmd.exe /c 'set enable_tuning=1 & pypots-cli tuning --model pypots.im ...
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (2024-11-06 22:0...py and pytorch.:String) [], RemoteException
    + FullyQualifiedErrorId : NativeCommandError

2024-11-06 22:00:43 [INFO]: The tunner assigns a new group of params: {'n_steps': 48, 'n_features': 35, 'epochs': 100, 'patience': 10, 'n_layers': 1, 'd_model': 64, 'd_ffn': 256, 'n_heads': 1, 'd_k': 128, 'd_v': 256, 'dropout': 0.5, 'attn_dropout': 0.3, 'lr': 0.0003889487885125863}
2024-11-06 22:00:43 [INFO]: No given device, using default device: cuda
2024-11-06 22:00:43 [WARNING]: ‼️ saving_path not given. Model files and tensorboard file will not be saved.
2024-11-06 22:00:43 [WARNING]: ‼️ d_model must = n_heads * d_k, it should be divisible by n_heads and the result should be equal to d_k, but got d_model=64, n_heads=1, d_k=128
2024-11-06 22:00:43 [WARNING]: ⚠️ d_model is reset to 128 = n_heads (1) * d_k (128)
2024-11-06 22:00:43 [INFO]: SAITS initialized with the given hyperparameters, the number of trainable parameters: 360,878
2024-11-06 22:00:44 [INFO]: Option lazy_load is set as False, hence loading all data from file...
Traceback (most recent call last):
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\Scripts\pypots-cli.exe\__main__.py", line 7, in <module>
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\site-packages\pypots\cli\pypots_cli.py", line 35, in main
    service.run()
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\site-packages\pypots\cli\tuning.py", line 281, in run
    model.fit(train_set=train_set, val_set=val_set)
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\site-packages\pypots\imputation\saits\model.py", line 246, in fit
    training_set = DatasetForSAITS(train_set, return_X_ori=False, return_y=False, file_type=file_type)
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\site-packages\pypots\imputation\saits\data.py", line 61, in __init__
    super().__init__(
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\site-packages\pypots\data\dataset\base.py", line 150, in __init__
    self.X, self.X_ori, self.X_pred, self.y = self._check_array_input(X, X_ori, X_pred, y, "tensor")
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\site-packages\pypots\data\dataset\base.py", line 289, in _check_array_input
    X_ori = turn_data_into_specified_dtype(X_ori, out_dtype)
  File "C:\Users\rainio\.conda\envs\Awesome_Imputation\lib\site-packages\pypots\data\utils.py", line 28, in turn_data_into_specified_dtype
    raise TypeError(f"data should be an instance of list/np.ndarray/torch.Tensor, but got {type(data)}")
TypeError: data should be an instance of list/np.ndarray/torch.Tensor, but got <class 'bytes'>
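The failing frame is the type check in `pypots/data/utils.py`: the dataset loaded from the HDF5 file arrived as raw `bytes` instead of an array. A simplified, dependency-light mimic of that check (the `torch.Tensor` branch is omitted here, and `check_array_input` is a hypothetical name, not the PyPOTS function) reproduces the same error message:

```python
import numpy as np

def check_array_input(data):
    # Simplified mimic of turn_data_into_specified_dtype in
    # pypots/data/utils.py: only array-like inputs are accepted.
    if isinstance(data, (list, np.ndarray)):
        return np.asarray(data, dtype=np.float32)
    raise TypeError(
        f"data should be an instance of list/np.ndarray/torch.Tensor, "
        f"but got {type(data)}"
    )

# A numeric array passes through...
print(check_array_input([[1.0, 2.0], [3.0, np.nan]]).dtype)  # float32

# ...but bytes (e.g. a string dataset read back from an HDF5 file)
# triggers exactly the reported TypeError:
try:
    check_array_input(b"X_ori")
except TypeError as err:
    print(err)
```

This suggests the problem is upstream of the model: something in the HDF5 loading path on this machine hands back `bytes` where the Ubuntu run yields an array.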
github-actions[bot] commented 3 weeks ago

Hi there 👋,

Thank you so much for your attention to PyPOTS! You can follow me on GitHub to receive the latest news of PyPOTS. If you find our research helpful to your work, please star ⭐️ this repository. Your star is your recognition, which can help more people notice PyPOTS and grow the PyPOTS community. It matters and is definitely a kind of contribution to the community.

I have received your message and will respond ASAP. Thank you for your patience! 😃

Best, Wenjie

WenjieDu commented 2 weeks ago

The user reports that the same setup works fine on Ubuntu, so the error is probably caused by the development environment rather than by PyPOTS itself.
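One way to narrow down such a platform difference is to inspect the dataset file directly: h5py returns string-typed HDF5 datasets as `bytes`, so any non-numeric entry in `train.h5`/`val.h5` would explain the error. A diagnostic sketch (assumes h5py is installed; `inspect_h5` is a hypothetical helper, not part of PyPOTS, and the demo file stands in for the real dataset):

```python
import os
import tempfile

import h5py
import numpy as np

def inspect_h5(path):
    """Return {dataset name: dtype string} so bytes/object entries stand out."""
    report = {}
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                report[name] = str(obj.dtype)
        f.visititems(visit)
    return report

# Build a tiny file mixing numeric and string data; h5py hands the
# string dataset back as bytes, which is what PyPOTS rejects.
path = os.path.join(tempfile.mkdtemp(), "demo.h5")
with h5py.File(path, "w") as f:
    f.create_dataset("X", data=np.zeros((2, 3), dtype=np.float32))
    f.create_dataset("note", data="created on Windows")

print(inspect_h5(path))
```

Running the same inspection on the Windows-generated and Ubuntu-generated dataset files would show whether the files themselves differ or whether the loading path is at fault.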