omegafragger / DDU

Code for "Deterministic Neural Networks with Appropriate Inductive Biases Capture Epistemic and Aleatoric Uncertainty"
https://arxiv.org/abs/2102.11582
MIT License

Feature Densities Always Zero #2

Open JacobOaks opened 2 years ago

JacobOaks commented 2 years ago

Hello,

I'm having an issue with the active learning script. I'm running:

CUDA_VISIBLE_DEVICES=7 python active_learning_script.py --seed 1 --model resnet18 -sn -mod --al-type gmm

and I've set a breakpoint right before the acquisition step in active_learning_script.py, here:

[Screenshot: breakpoint placed just before the acquisition step in active_learning_script.py]

Whenever I inspect the result of compute_density(logits, class_prob), the density for every instance is zero, like so:

[Screenshot: output of compute_density(logits, class_prob) showing all-zero densities]

Digging deeper, I noticed that the feature Gaussians always assign extremely low probability to every class, which produces all of these zeros. This happens at every acquisition step, regardless of dataset size and seed (I've tried two seeds). I even added a programmatic check over entire training runs to flag whether the densities are ever nonzero, and they never are. As a result, the subsequent call to torch.topk amounts to random selection at every acquisition step. Is this an issue you've experienced?

Thanks, Jacob

JacobOaks commented 2 years ago

Hello,

Just checking whether you've had a chance to look at this, given its potentially significant implications for the validity of the associated paper.

Jacob

LinyeLi60 commented 10 months ago

I've run into the same problem!