huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

DinoV2 is incorrectly documented as having a default patch size of 16 instead of 14 #34292

Open OFSkean opened 3 days ago

OFSkean commented 3 days ago

System Info

transformers version 4.40.2, Python version 3.10

Who can help?

@amyeroberts @NielsRogge

Reproduction

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("facebook/dinov2-base")
print(config.patch_size)  # prints 14
```
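
For contrast, a minimal sketch of what the class default yields (Dinov2Config is the config class named in this report, and 16 is the default value it describes):

```python
from transformers import Dinov2Config

# Instantiating the config with no arguments uses the library defaults,
# which is where the documented (incorrect) value of 16 comes from.
default_config = Dinov2Config()
print(default_config.patch_size)  # prints 16, the current Dinov2Config default
```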

Expected behavior

The DinoV2 model was trained with a patch size of 14 and released with that patch size on GitHub. Some time later, @NielsRogge ported it into Hugging Face. The config.json files for the DinoV2 models (base, large) correctly specify a patch size of 14, but the default in Dinov2Config is 16 (docs, code).

This error was a bit confusing at first, as I was trying to hunt down why my sequence length was 257 instead of 197: with the standard 224x224 input, a patch size of 14 yields (224/14)^2 + 1 = 257 tokens, while the documented patch size of 16 would yield (224/16)^2 + 1 = 197. It makes sense to correct the documentation to show 14 as the default.
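
To make the arithmetic concrete, a small sketch of the token-count calculation (assuming the standard 224x224 ViT input resolution; seq_len is a hypothetical helper for illustration, not a library function):

```python
def seq_len(image_size: int, patch_size: int) -> int:
    # ViT-style token count: one token per patch, plus one [CLS] token.
    return (image_size // patch_size) ** 2 + 1

print(seq_len(224, 14))  # 257 -- the actual DINOv2 patch size
print(seq_len(224, 16))  # 197 -- the incorrectly documented default
```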

LysandreJik commented 2 days ago

cc @qubvel as well