NVlabs / DIODE

Official PyTorch implementation of Data-free Knowledge Distillation for Object Detection, WACV 2021.
https://openaccess.thecvf.com/content/WACV2021/html/Chawla_Data-Free_Knowledge_Distillation_for_Object_Detection_WACV_2021_paper.html

Same content in label files on bdd100k dataset #5

Closed abse4411 closed 2 years ago

abse4411 commented 2 years ago

I downloaded the bdd100k dataset from the provided link and found that the contents of all the label files are identical (at least the 20 label files I have checked), e.g.: cat bdd100k\labels\train2014\0a0a0b1a-7c39d841.txt

16 0.711640 0.774731 0.102000 0.068660
0 0.057500 0.594460 0.045600 0.122158
13 0.307800 0.763254 0.109320 0.191058
25 0.400000 0.774302 0.097640 0.155459

cat bdd100k\labels\train2014\0a0a0b1a-27d9fc44.txt

16 0.711640 0.774731 0.102000 0.068660
0 0.057500 0.594460 0.045600 0.122158
13 0.307800 0.763254 0.109320 0.191058
25 0.400000 0.774302 0.097640 0.155459

cat bdd100k\labels\train2014\0a0b16e2-93f8c456.txt

16 0.711640 0.774731 0.102000 0.068660
0 0.057500 0.594460 0.045600 0.122158
13 0.307800 0.763254 0.109320 0.191058
25 0.400000 0.774302 0.097640 0.155459
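For context, these entries match the darknet/YOLO label convention (this format is inferred from the values shown, not stated in the thread): each object is a class id followed by the normalized box center x, center y, width, and height, all in [0, 1]. A minimal parser sketch:

```python
def parse_label_line(line):
    """Parse one darknet/YOLO-style label entry.

    Format (assumed from the values shown in this issue):
        class_id  center_x  center_y  width  height
    where the four coordinates are normalized to [0, 1].
    """
    fields = line.split()
    cls = int(fields[0])
    cx, cy, w, h = (float(f) for f in fields[1:5])
    return cls, cx, cy, w, h

# Example: the first entry shown above
cls, cx, cy, w, h = parse_label_line("16 0.711640 0.774731 0.102000 0.068660")
```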

Should I extract the label files from the official bdd100k dataset instead?
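One quick way to confirm the observation across the whole directory, rather than spot-checking 20 files, is to hash every label file and count how many share a digest (a sketch; the directory path is just the layout shown above):

```python
import hashlib
from collections import Counter
from pathlib import Path

def label_digests(label_dir):
    """Hash the contents of every .txt label file under label_dir.

    Returns a Counter mapping content digest -> number of files with
    that exact content. If one digest accounts for (nearly) all files,
    the labels are duplicated dummy values.
    """
    digests = Counter()
    for path in sorted(Path(label_dir).glob("*.txt")):
        digests[hashlib.md5(path.read_bytes()).hexdigest()] += 1
    return digests

# Usage (path from the issue above):
# counts = label_digests(r"bdd100k/labels/train2014")
# print(counts.most_common(3))
```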

akshaychawla commented 2 years ago

Thanks for your interest in our work, @abse4411. Yes, you are correct: the label files for the bdd100k dataset contain dummy values. To get labels for bdd100k, please download the dataset from the official bdd100k website https://www.bdd100k.com/

The labels have no impact on our experiments because we use bdd100k for data-free (label-free) knowledge distillation, where we only use the images from bdd100k, not the bounding-box labels.

For reference, we took 2 crops from each image in the training (70k) and validation (10k) sets of bdd100k, which gave us the 160k images used for distillation.
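The 2-crops-per-image step could be sketched as below. This is only an illustration of the arithmetic (80k images × 2 crops = 160k); the crop size and the uniform random placement are assumptions, since the comment above does not state the exact cropping parameters:

```python
import random

def two_crop_boxes(width, height, crop_size=256, seed=0):
    """Return two random (left, top, right, bottom) crop boxes for one image.

    Assumptions (not specified in the thread): square crops of a fixed
    crop_size, placed uniformly at random within the image bounds.
    bdd100k frames are 1280x720, so crop_size must be <= 720 here.
    """
    rng = random.Random(seed)
    boxes = []
    for _ in range(2):  # 2 crops per image -> 2 * (70k + 10k) = 160k crops
        left = rng.randint(0, max(0, width - crop_size))
        top = rng.randint(0, max(0, height - crop_size))
        boxes.append((left, top, left + crop_size, top + crop_size))
    return boxes

# Usage: boxes = two_crop_boxes(1280, 720)  # one bdd100k frame -> 2 boxes
```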

abse4411 commented 2 years ago

Thank you for your reply and pioneering work! Have a nice day! :-)