haeusser / learning_by_association

This repository contains code for the paper Learning by Association - A versatile semi-supervised training method for neural networks (CVPR 2017) and the follow-up work Associative Domain Adaptation (ICCV 2017).
https://vision.in.tum.de/members/haeusser
Apache License 2.0

New experiment MNIST → SVHN #11

Closed engharat closed 6 years ago

engharat commented 6 years ago

Hi, I'm trying to reproduce the experiment "MNIST → SVHN", since the paper only reports the "SVHN → MNIST" direction. Running it is not a problem, as it only requires swapping the source and target dataset names. My issue is that, no matter which hyperparameters I choose, the loss always exhibits the same behaviour: it starts from a small value around 0.5, jumps to 6.0–7.0 after a few iterations, and then settles around 6.5 until training finishes. The final accuracy is 32%. I have tried several hyperparameter values taken from the other experiments' settings, but so far with no better success. Do you have specific walker / visit / logit hyperparameters to suggest for this experiment?

Edit: I found that the sudden jump in the loss is caused by the activation of walker_weight, which is controlled by the variable walker_weight_envelope_delay. Now I'll try walker_weight_envelope_delay: 2000 instead of 500, or not activating the envelope at all.
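For readers unfamiliar with the envelope mechanism discussed above: the walker loss is multiplied by a weight that stays at zero for a number of delay steps and then ramps up, which explains the observed jump in the total loss once the delay expires. The exact schedule used by the repository is not shown here; the following is a minimal sketch assuming a simple linear ramp-up (the function name `envelope_weight` and the `rampup_steps` parameter are hypothetical, chosen only to illustrate the effect of `walker_weight_envelope_delay`).

```python
def envelope_weight(step, target_weight, delay, rampup_steps):
    """Hypothetical linear envelope schedule.

    The weight is 0 until `delay` training steps have passed, then
    ramps linearly up to `target_weight` over `rampup_steps` steps.
    This mimics the delayed activation controlled by
    walker_weight_envelope_delay described in the issue.
    """
    if step < delay:
        return 0.0
    progress = min(1.0, (step - delay) / float(rampup_steps))
    return target_weight * progress


# Before the delay the walker loss contributes nothing; once the
# envelope activates, its contribution grows and the total loss jumps.
for step in (0, 499, 500, 1000, 1500):
    print(step, envelope_weight(step, target_weight=1.0,
                                delay=500, rampup_steps=1000))
```

With a larger delay (e.g. 2000 instead of 500), the jump simply happens later in training; disabling the envelope entirely corresponds to using `target_weight` from step 0.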

haeusser commented 6 years ago

I uploaded the logs for you, including the hyperparameters and TFEvents files, so you can visualize the graphs with TensorBoard: https://vision.in.tum.de/~haeusser/da_svhn_mnist.zip

Cheers, Philip

engharat commented 6 years ago

Thank you very much!