YGZWQZD / LAMDA-SSL

30 Semi-Supervised Learning Algorithms
MIT License

I cannot reproduce some of the Examples with deep algorithms (maybe because of pytorch 2.X?) #9

Open · jwainer opened this issue 1 year ago

jwainer commented 1 year ago

I just installed LAMDA-SSL from GitHub. It installed the newest versions of all packages, including torch==2.0.1 (pip freeze below).

I cannot reproduce the Examples that use deep learning.

Assemble and other non-deep algorithms work fine:

(luan) Atlas:LAMDA-SSL wainer$ python Examples/Assemble_BreastCancer.py  
(luan) Atlas:LAMDA-SSL wainer$

but

(luan) Atlas:LAMDA-SSL wainer$ python Examples/FixMatch_BreastCancer.py 
Traceback (most recent call last):
  File "/Users/wainer/Dropbox/alunos/luan/LAMDA-SSL/Examples/FixMatch_BreastCancer.py", line 64, in <module>
    model.fit(X=labeled_X,y=labeled_y,unlabeled_X=unlabeled_X)
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Base/DeepModelMixin.py", line 326, in fit
    self.init_train_dataloader()
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Base/DeepModelMixin.py", line 243, in init_train_dataloader
    self._labeled_dataloader, self._unlabeled_dataloader = self._train_dataloader.init_dataloader(
                                                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Dataloader/TrainDataloader.py", line 344, in init_dataloader
    self.labeled_dataloader = self.labeled_dataloader.init_dataloader(dataset=self.labeled_dataset,
                              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Dataloader/LabeledDataloader.py", line 86, in init_dataloader
    self.dataloader= DataLoader(dataset=self.dataset,
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/torch/utils/data/dataloader.py", line 245, in __init__
    raise ValueError('prefetch_factor option could only be specified in multiprocessing.'
ValueError: prefetch_factor option could only be specified in multiprocessing.let num_workers > 0 to enable multiprocessing, otherwise set prefetch_factor to None.
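
For what it's worth, this first error reproduces with plain torch 2.x and no LAMDA-SSL at all. My reading of the traceback (an assumption on my part, not something I verified in the library source) is that the LAMDA-SSL dataloaders pass prefetch_factor=2 together with num_workers=0, which torch 1.x accepted because 2 was its default, while torch 2.x now rejects it:

import torch
from torch.utils.data import DataLoader, TensorDataset

# A dummy dataset, just to construct DataLoaders.
dataset = TensorDataset(torch.randn(8, 4), torch.zeros(8))

# Fine on both torch 1.x and 2.x: prefetch_factor is left at its default.
loader_ok = DataLoader(dataset, batch_size=4, num_workers=0)

# On torch 2.x this raises the same ValueError as above; on torch 1.x it was
# accepted, because 2 happened to be the default value of prefetch_factor.
loader_bad = DataLoader(dataset, batch_size=4, num_workers=0, prefetch_factor=2)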

I have been changing the obvious things, such as the default prefetch_factor and num_workers, but after two hours of this I still hit a problem somewhere. Below is my last attempt, in which I create the dataloaders for the FixMatch_BreastCancer.py code with what seem to be appropriate num_workers and prefetch_factor values, but I am not sure my modifications are correct. Someone more familiar with the codebase is probably better placed to make these changes...

(luan) Atlas:progs wainer$ python a2.py
Traceback (most recent call last):
  File "/Users/wainer/Dropbox/alunos/luan/progs/a2.py", line 82, in <module>
    model.fit(X=labeled_X,y=labeled_y,unlabeled_X=unlabeled_X)
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Base/DeepModelMixin.py", line 335, in fit
    self.fit_epoch_loop(valid_X,valid_y)
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Base/DeepModelMixin.py", line 311, in fit_epoch_loop
    self.fit_batch_loop(valid_X,valid_y)
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Base/DeepModelMixin.py", line 280, in fit_batch_loop
    for (lb_idx, lb_X, lb_y), (ulb_idx, ulb_X, _) in zip(self._labeled_dataloader, self._unlabeled_dataloader):
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/torch/utils/data/dataloader.py", line 633, in __next__
    data = self._next_data()
           ^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/torch/utils/data/dataloader.py", line 677, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/torch/utils/data/_utils/fetch.py", line 51, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/torch/utils/data/_utils/fetch.py", line 51, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
            ~~~~~~~~~~~~^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Dataset/LabeledDataset.py", line 217, in __getitem__
    Xi, yi = self.apply_transform(Xi, yi)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Dataset/LabeledDataset.py", line 185, in apply_transform
    _X = self._transform(X, item)
         ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Dataset/LabeledDataset.py", line 130, in _transform
    X=self._transform(X,item)
      ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Dataset/LabeledDataset.py", line 132, in _transform
    X = transform(X)
        ^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Base/Transformer.py", line 18, in __call__
    return self.fit_transform(X,y,fit_params=fit_params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/sklearn/utils/_set_output.py", line 140, in wrapped
    data_to_wrap = f(self, X, *args, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Base/Transformer.py", line 30, in fit_transform
    return self.fit(X=X,y=y,fit_params=fit_params).transform(X)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/sklearn/utils/_set_output.py", line 140, in wrapped
    data_to_wrap = f(self, X, *args, **kwargs)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/wainer/miniconda3/envs/luan/lib/python3.11/site-packages/LAMDA_SSL/Transform/ToTensor.py", line 51, in transform
    X=torch.Tensor(X)
      ^^^^^^^^^^^^^^^
TypeError: new(): data must be a sequence (got Image)

I would guess that the problem is with the torch 2.X version, but I am not sure.
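
In case it helps to localize the second failure: torch.Tensor() refuses a PIL Image directly, so if the sample reaching ToTensor is still an Image at that point, it would fail exactly like this. A small sketch of the behaviour (my own illustration with a dummy image, not LAMDA-SSL code):

import numpy as np
import torch
from PIL import Image

img = Image.new("L", (4, 4))  # dummy PIL Image standing in for the sample

# torch.Tensor(img) raises: TypeError: new(): data must be a sequence (got Image)

# Going through numpy (or torchvision.transforms.functional.to_tensor) works:
x = torch.from_numpy(np.array(img)).float()
print(x.shape)  # torch.Size([4, 4])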

pip freeze:

(luan) Atlas:progs wainer$ pip freeze
certifi==2023.5.7
charset-normalizer==3.1.0
contourpy==1.0.7
cycler==0.11.0
filelock==3.12.0
flake8==6.0.0
fonttools==4.39.4
idna==3.4
Jinja2==3.1.2
joblib==1.2.0
kiwisolver==1.4.4
LAMDA-SSL @ file:///Users/wainer/Dropbox/alunos/luan/LAMDA-SSL
MarkupSafe==2.1.3
matplotlib==3.7.1
mccabe==0.7.0
mpmath==1.3.0
networkx==3.1
numpy==1.24.3
packaging==23.1
pandas==2.0.2
Pillow==9.5.0
psutil==5.9.5
pycodestyle==2.10.0
pyflakes==3.0.1
pyparsing==3.0.9
python-dateutil==2.8.2
pytz==2023.3
requests==2.31.0
scikit-learn==1.2.2
scipy==1.10.1
six==1.16.0
sympy==1.12
threadpoolctl==3.1.0
torch==2.0.1
torch-geometric==2.3.1
torchdata==0.6.1
torchtext==0.15.2
torchvision==0.15.2
tqdm==4.65.0
typing_extensions==4.6.3
tzdata==2023.3
urllib3==2.0.3