facebookresearch / jepa

PyTorch code and models for V-JEPA self-supervised learning from video.

Question on number of heads in Attentive Pooler #60

Open orrzohar opened 2 months ago

orrzohar commented 2 months ago

Dear authors, thank you for your great work. I was wondering: why do you set the number of heads in the attentive pooler equal to that of the JEPA encoder?

https://github.com/facebookresearch/jepa/blob/13fbba844b3ce3d16a1c1b633a52fff6f88e9876/evals/video_classification_frozen/eval.py#L184
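For context, here is a minimal sketch of how I read the construction at that line: the pooler's width and head count are copied from the frozen encoder. The dimensions below (roughly ViT-L on Kinetics-400) are only illustrative placeholders, and the exact signature of `AttentiveClassifier` may differ slightly from my paraphrase.

```python
import torch

from src.models.attentive_pooler import AttentiveClassifier  # class from this repo

# Illustrative placeholder dimensions (roughly a ViT-L/16 video encoder, 400 classes).
embed_dim, num_heads, num_classes = 1024, 16, 400

classifier = AttentiveClassifier(
    embed_dim=embed_dim,      # pooler width matches the encoder's embedding dim
    num_heads=num_heads,      # <-- head count copied from the encoder; this is my question
    depth=1,
    num_classes=num_classes,
)

# Frozen-encoder output: (batch, num video tokens, embed_dim).
tokens = torch.randn(2, 1568, embed_dim)
logits = classifier(tokens)   # cross-attention pooling over the tokens, then a linear head
print(logits.shape)           # torch.Size([2, 400])
```

Since the pooler only cross-attends from a single learned query to the encoder's output tokens, it is not obvious to me that its head count needs to match the encoder's, as long as `embed_dim` is divisible by it.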

Is this a principled design decision, or just a hyperparameter choice?

Best, Orr