[ ] An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
[ ] My own task or dataset (give details below)
Reproduction
from transformers import AutoConfig
config = AutoConfig.from_pretrained("facebook/dinov2-base")
print(config.patch_size) # prints 14
Expected behavior
The DINOv2 model was trained with a patch size of 14 and released with that patch size on GitHub. Some time later, @NielsRogge ported it to Hugging Face. The config.json files for the DINOv2 models (base, large) correctly specify a patch size of 14, but the default for Dinov2Config is 16 (docs, code).
This mismatch was a bit confusing at first while I was hunting down why my sequence length was 257 instead of 197. It would make sense to correct the documentation to show 14 as the default.
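For context, the 197 vs. 257 discrepancy follows directly from the standard ViT sequence-length formula (number of patches plus the [CLS] token). A minimal sketch in plain Python; `vit_seq_len` is a hypothetical helper for illustration, not a transformers API:

```python
def vit_seq_len(image_size: int, patch_size: int) -> int:
    # Patches per side squared, plus one [CLS] token.
    return (image_size // patch_size) ** 2 + 1

# With a 224x224 input:
print(vit_seq_len(224, 16))  # 197 -- the documented default patch size
print(vit_seq_len(224, 14))  # 257 -- the patch size DINOv2 was actually trained with
```

So a config silently falling back to patch_size=16 produces the 197 sequence length instead of the expected 257.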
System Info
transformers version 4.40.2, Python version 3.10
Who can help?
@amyeroberts @NielsRogge