Closed sungsy closed 7 months ago
It looks like an issue with the data dimensionality or format. What is the source of the data?
I downloaded the HAR dataset from here and placed it inside `data/HAR/` (`test.pt`, `train.pt`, `val.pt`). When I run

```
python main.py --experiment_description exp1 --run_description run_1 --seed 123 --training_mode self_supervised --selected_dataset HAR
```

I get the error above.
After extracting from the dictionary, the training data has shapes `torch.Size([5881, 9, 128])` for X and `torch.Size([5881])` for y.
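For anyone hitting a shape mismatch, it may help to sanity-check the tensors before training. A small sketch; the dict keys `samples`/`labels` are an assumption here, so substitute whatever `train.pt` actually stores:

```python
import torch

# Stand-in for torch.load("data/HAR/train.pt"); the real file holds
# the tensors described above, replaced here by zeros of that shape.
data = {"samples": torch.zeros(5881, 9, 128), "labels": torch.zeros(5881)}

X, y = data["samples"], data["labels"]
print(X.shape, y.shape)  # torch.Size([5881, 9, 128]) torch.Size([5881])
```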
I'm not sure, but I suspect it could be due to a different NumPy version. Unfortunately, I haven't recorded which version I used. However, you can try downgrading and see whether the problem persists.
The issue could be solved by downgrading to numpy 1.21.0. Not sure whether a newer version would also work.
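One way to pin it (assuming a pip-based environment; the repo may or may not ship a requirements file) is a `requirements.txt` entry:

```
numpy==1.21.0
```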
The following warning is shown now (also for numpy 1.20.3):
```
/home/TS-TCC/dataloader/augmentations.py:42: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
  warp = np.concatenate(np.random.permutation(splits)).ravel()
<__array_function__ internals>:5: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
```
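The warning comes from permuting a ragged list of segments. A minimal sketch of the cause and one possible fix (assuming `splits` holds unequal-length segments, e.g. from `np.array_split`, as in the repo's `augmentations.py`):

```python
import numpy as np

x = np.arange(10)

# np.array_split can return segments of unequal length
# (here lengths 4, 3, 3), which makes `splits` a ragged list.
splits = np.array_split(x, 3)

# np.random.permutation(splits) first coerces the ragged list into an
# ndarray, which triggers the VisibleDeprecationWarning (and raises an
# error on newer NumPy). Permuting indices instead avoids the coercion:
order = np.random.permutation(len(splits))
warp = np.concatenate([splits[i] for i in order]).ravel()
```

The result contains the same elements with the segment order shuffled, which is what the permutation-based augmentation appears to intend.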
```
Data loaded ...
Training started ....
/home/TS-TCC/models/TC.py:52: UserWarning: Implicit dimension choice for log_softmax has been deprecated. Change the call to include dim=X as an argument.
  nce += torch.sum(torch.diag(self.lsoftmax(total)))
```
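The `log_softmax` warning is only a deprecation notice, not an error; it should go away once the `nn.LogSoftmax` module in `TC.py` is constructed with an explicit `dim`. A minimal sketch (the choice of `dim=-1` is an assumption; pick whichever axis the NCE score matrix is normalized over):

```python
import torch
import torch.nn as nn

total = torch.randn(4, 4)  # stand-in for the NCE similarity matrix

# nn.LogSoftmax() with no argument triggers the UserWarning; passing
# dim explicitly silences it. For a 2-D matrix the legacy implicit
# choice was the last dim, so dim=-1 keeps the old behavior here.
lsoftmax = nn.LogSoftmax(dim=-1)
nce = torch.sum(torch.diag(lsoftmax(total)))
```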
I am currently trying to get the repository running. As soon as I try to run the self-supervised part, I run into the following error. I downloaded the datasets from the dataverse... I understand the error message and can follow it, but I am wondering why it does not work for me out of the box. Is there anything I am missing here?