YangtaoWANG95 / TokenCut

(CVPR 2022) PyTorch implementation of "Self-supervised transformers for unsupervised object discovery using normalized cut"
MIT License

Re-implementation of unsupervised saliency detection #15

Closed: Jun-Pu closed this issue 2 years ago

Jun-Pu commented 2 years ago

Hi, thanks for the interesting work!

I cannot reproduce the "TokenCut + BS" results reported in the paper.

For example, on ECSSD I get IoU 0.621, Acc 0.891, F-max 0.751, which is much lower than the reported numbers.

As for "TokenCut", I have gained the same results, so I guess the problems may occur at "bilateral solver". Could you please check and update the codes?

Thanks in advance!

Best,

YangtaoWANG95 commented 2 years ago

Hi Jun Pu,

Thank you for your interest. I have rechecked the current version; here are the results I get:

```
(tokencut)TokenCut/unsupervised_saliency_detection$ python get_saliency.py --out-dir ECSSD --sigma-spatial 16 --sigma-luma 16 --sigma-chroma 8 --nb-vis 1 --vit-arch small --patch-size 16 --dataset ECSSD
Namespace(dataset='ECSSD', img_path=None, nb_vis=1, out_dir='ECSSD', patch_size=16, sigma_chroma=8.0, sigma_luma=16.0, sigma_spatial=16.0, tau=0.2, vit_arch='small', vit_feat='k')
Loading weight from /dino/dino_deitsmall16_pretrain/dino_deitsmall16_pretrain.pth
Load small pre-trained feature...

TokenCut evaluation:
{'IoU': 0.7119671522730496, 'accuracy': 0.9178118275292217, 'F_max': 0.8034312725067139}
TokenCut + bilateral solver evaluation:
{'IoU': 0.7720177292864828, 'accuracy': 0.9341852449579164, 'F_max': 0.8735781311988831}
```
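
For reference, the numbers above follow the standard saliency-evaluation definitions. Here is a minimal sketch of those definitions (the helper name `mask_metrics`, the 0.5 binarization threshold, and the 255-step F-max sweep are assumptions for illustration, not necessarily what `get_saliency.py` implements):

```python
import numpy as np

def mask_metrics(pred, gt):
    """Standard saliency metrics on one image.
    pred: soft saliency map in [0, 1]; gt: binary ground-truth mask.
    Hypothetical helper, not the repo's evaluation code."""
    gt = gt.astype(bool)
    pred_bin = pred > 0.5                       # assumed binarization threshold
    inter = np.logical_and(pred_bin, gt).sum()
    union = np.logical_or(pred_bin, gt).sum()
    iou = inter / max(union, 1)
    acc = (pred_bin == gt).mean()
    # F_max: best F-measure over a sweep of thresholds on the soft map
    f_max = 0.0
    for t in np.linspace(0, 1, 255):
        p = pred > t
        tp = np.logical_and(p, gt).sum()
        prec = tp / max(p.sum(), 1)
        rec = tp / max(gt.sum(), 1)
        if prec + rec > 0:
            f_max = max(f_max, 2 * prec * rec / (prec + rec))
    return {'IoU': iou, 'accuracy': acc, 'F_max': f_max}
```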

Perhaps there is an incompatible library version in your environment? Can you share your conda environment?

Best, Yangtao

Jun-Pu commented 2 years ago

Ohlala~ Thanks so much for the quick reply and for the reminder about the environment! I upgraded the "scipy" package from version 1.7.3 to version 1.9.2, and the problem is solved! Note that I am using Anaconda3 (a small version-guard sketch follows the logs below).

Originally, I got:

```
/home/yzhang1/anaconda3/bin/python3.8 /home/yzhang1/PythonProjects/TokenCut_bilateral/get_saliency.py
Namespace(dataset='ECSSD', img_path=None, nb_vis=1, out_dir='ECSSD', patch_size=16, sigma_chroma=8, sigma_luma=16, sigma_spatial=16, tau=0.2, vit_arch='small', vit_feat='k')
Loading weight from /home/yzhang1/PythonProjects/TokenCut_bilateral/dino_pretrains/dino_deitsmall16_pretrain.pth
Load small pre-trained feature...
ECSSD
args.out_dir: ECSSD, img_name: 0001.jpg

100%|██████████| 1000/1000 [05:27<00:00,  3.05it/s]
TokenCut evaluation:
{'IoU': 0.7129746048355009, 'accuracy': 0.918353023160249, 'F_max': 0.8046749234199524}

TokenCut + bilateral solver evaluation:
{'IoU': 0.6214015735024441, 'accuracy': 0.891019322630018, 'F_max': 0.7509733438491821}
Process finished with exit code 0
```

After upgrading the "scipy" package to the newest version, I got:

```
TokenCut evaluation:
{'IoU': 0.7119671522730496, 'accuracy': 0.9178118275292217, 'F_max': 0.8034312725067139}

TokenCut + bilateral solver evaluation:
{'IoU': 0.7720177292864828, 'accuracy': 0.9341852449579164, 'F_max': 0.8735781311988831}
```
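
For anyone hitting the same mismatch, here is a minimal guard one could add before running. This is a hypothetical snippet, not part of the repo; it only encodes the versions observed in this thread:

```python
import scipy

# Hypothetical guard (not in the TokenCut repo): fail fast on scipy
# versions that were observed here to degrade the TokenCut + BS scores.
major, minor = (int(x) for x in scipy.__version__.split(".")[:2])
if (major, minor) < (1, 9):
    raise RuntimeError(
        f"scipy {scipy.__version__} found; please upgrade to >= 1.9 "
        "(1.7.3 gave much lower bilateral-solver results in this thread)"
    )
```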

Thanks so much for the work!

Best, Jun-Pu

Jun-Pu commented 2 years ago

To supplement, I also made some small adjustments following your pseudocode (see the sketch after the results below).

The results on ECSSD are as follows:

```
TokenCut evaluation:
{'IoU': 0.7125319285520818, 'accuracy': 0.9178285121507943, 'F_max': 0.804131031036377}

TokenCut + bilateral solver evaluation:
{'IoU': 0.7725453542884497, 'accuracy': 0.934181008453481, 'F_max': 0.8743699193000793}
```
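
For context, here is a minimal sketch of the core normalized-cut step as I understand it from the paper's pseudocode. The helper name `ncut_bipartition` and the mean-thresholded bipartition are assumptions for illustration (tau=0.2 matches the logs above), not the repo's exact code:

```python
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(feats, tau=0.2, eps=1e-5):
    """Hypothetical helper sketching the paper's normalized-cut step.
    feats: (N, d) ViT patch features, e.g. DINO keys."""
    feats = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    sim = feats @ feats.T                    # cosine similarity between patches
    W = np.where(sim > tau, 1.0, eps)        # thresholded affinity graph
    D = np.diag(W.sum(axis=1))               # degree matrix
    # Second-smallest generalized eigenvector of (D - W) x = lambda D x
    _, vecs = eigh(D - W, D)
    fiedler = vecs[:, 1]
    return fiedler > fiedler.mean()          # foreground/background bipartition
```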

Again, thanks so much for the interesting & insightful work!

Best, Jun-Pu