
PsyNet: Self-supervised Approach to Object Localization Using Point Symmetric Transformation

Official PyTorch implementation of "PsyNet: Self-supervised Approach to Object Localization Using Point Symmetric Transformation"

This implementation is based on these repos.

Pre-trained checkpoints are now available

Joint work with

PAPER

This paper was accepted at AAAI 2020. The PDF is available at https://aaai.org/ojs/index.php/AAAI/article/view/6615/6469

NOTE

Network structure (figure)

Abstract

Existing co-localization techniques fall significantly behind weakly or fully supervised methods in both accuracy and inference time. In this paper, we overcome common drawbacks of co-localization techniques by utilizing a self-supervised learning approach. The major technical contributions of the proposed method are two-fold. 1) We devise a new geometric transformation, namely the point symmetric transformation, and utilize its parameters as an artificial label for self-supervised learning. This new transformation can also play the role of region-drop based regularization. 2) We suggest a method, namely class-agnostic activation mapping, for extracting the heat map from the network trained by self-supervision. It is done by computing the spatial attention map. Based on extensive evaluations, we observe that the proposed method records new state-of-the-art performance on three fine-grained datasets for unsupervised object localization. Moreover, we show that the idea of the proposed method can be adopted in a modified manner to solve the weakly supervised object localization task. As a result, we outperform the current state-of-the-art technique in weakly supervised object localization by a significant margin.
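
The two ideas in the abstract can be summarized with a short sketch. The code below is an illustrative approximation and not the implementation in this repository: the function names (point_symmetric_transform, make_self_supervised_batch, class_agnostic_activation_map), the two-class identity-vs-transform labeling, the zeroing of pixels whose symmetric source falls outside the image, and the channel-wise mean used to aggregate feature maps are all assumptions made for the sake of the example; the actual method uses its own set of transformation types and aggregation.

import torch

def point_symmetric_transform(img, center):
    # Reflect every pixel of img (C, H, W) through center = (cy, cx).
    # Each output pixel p is sampled from its point-symmetric source 2*center - p;
    # sources that fall outside the image are zeroed, which is what lets the
    # transformation double as a region-drop style regularizer (assumption).
    _, H, W = img.shape
    cy, cx = center
    ys, xs = torch.meshgrid(torch.arange(H), torch.arange(W), indexing="ij")
    src_y, src_x = 2 * cy - ys, 2 * cx - xs
    valid = (src_y >= 0) & (src_y < H) & (src_x >= 0) & (src_x < W)
    out = torch.zeros_like(img)
    out[:, valid] = img[:, src_y.clamp(0, H - 1), src_x.clamp(0, W - 1)][:, valid]
    return out

def make_self_supervised_batch(images, centers):
    # Simplified labeling scheme (assumption): each image either stays as-is
    # (label 0) or is transformed about a given center (label 1). The index of
    # the applied transformation is the artificial classification target that
    # the network is trained to predict.
    transformed, labels = [], []
    for img, center in zip(images, centers):
        if torch.rand(1).item() < 0.5:
            transformed.append(img)
            labels.append(0)
        else:
            transformed.append(point_symmetric_transform(img, center))
            labels.append(1)
    return torch.stack(transformed), torch.tensor(labels)

def class_agnostic_activation_map(features):
    # features: (B, C, h, w) from the last convolutional block of the network
    # trained on the self-supervised task. Aggregating channels (here a simple
    # mean of absolute activations, an assumption) gives one class-agnostic
    # heat map per image, normalized to [0, 1]; thresholding it and taking the
    # tightest box around the surviving region yields the predicted localization.
    heat = features.abs().mean(dim=1, keepdim=True)
    heat = heat - heat.amin(dim=(2, 3), keepdim=True)
    heat = heat / (heat.amax(dim=(2, 3), keepdim=True) + 1e-8)
    return heat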

Todo

Requirement

Data Preparation

How to Run

Arguments


Train

python -W ignore main.py --dataset CUB --network vggcam16bn --tftypes OR

Test

References
