huggingface / huggingface_hub

The official Python client for the Hugging Face Hub.
https://huggingface.co/docs/huggingface_hub
Apache License 2.0

Any `AutoConfig.from_pretrained` call results in `FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0.` #2275

Closed tomaarsen closed 4 months ago

tomaarsen commented 4 months ago

Describe the bug

Any AutoConfig.from_pretrained call results in "FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0." This is caused by an internal call in AutoConfig that passes resume_download=False as a default option:

https://github.com/huggingface/transformers/blob/508c0bfe555936fc772cd000e2e8da739f777a4f/src/transformers/configuration_utils.py#L650

Feel free to transfer this issue to transformers if you believe the fix should be applied there instead. Either way, we shouldn't get deprecation warnings when using packages normally.

Reproduction

from transformers import AutoConfig

config = AutoConfig.from_pretrained("bert-base-cased")
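As a stopgap while waiting for a fix, the warning can be silenced selectively with the standard `warnings` module. This is not a workaround suggested in the thread, just a sketch; the message regex is an assumption based on the warning text in the logs below. The snippet simulates the warning locally so it runs without a network call:

```python
import warnings

# Hypothetical stopgap (not from the issue thread): ignore only this
# specific FutureWarning until a fixed transformers release ships.
# Other FutureWarnings remain visible.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")  # record everything by default...
    warnings.filterwarnings(         # ...except the targeted message
        "ignore",
        message=r".*resume_download.*",  # assumed match for the log text
        category=FutureWarning,
    )
    # Simulate the warning emitted by huggingface_hub/file_download.py:
    warnings.warn(
        "`resume_download` is deprecated and will be removed in version "
        "1.0.0. Downloads always resume when possible.",
        FutureWarning,
    )
    print(len(caught))  # 0: the targeted warning was suppressed
```

In real use the `warnings.filterwarnings(...)` call would go at the top of the script, before the `AutoConfig.from_pretrained` call.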

Logs

[sic]\envs\sentence-transformers\Lib\site-packages\huggingface_hub\file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(
config.json: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 570/570 [00:00<?, ?B/s]

System info

- huggingface_hub version: 0.23.0
- Platform: Windows-10-10.0.22631-SP0
- Python version: 3.11.6
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: C:\Users\tom\.cache\huggingface\token
- Has saved token ?: True
- Who am I ?: tomaarsen
- Configured git credential helpers: manager
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.3.0+cu121
- Jinja2: 3.1.2
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 10.1.0
- hf_transfer: 0.1.6
- gradio: N/A
- tensorboard: N/A
- numpy: 1.26.1
- pydantic: 2.4.2
- aiohttp: 3.8.5
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: C:\Users\tom\.cache\huggingface\hub
- HF_ASSETS_CACHE: C:\Users\tom\.cache\huggingface\assets
- HF_TOKEN_PATH: C:\Users\tom\.cache\huggingface\token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: True
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
aymenkrifa commented 4 months ago

@tomaarsen, https://github.com/huggingface/transformers/pull/30620 fixes the issue for me, but a new release hasn't been rolled out yet! You can install transformers from source to check whether it fixes your issue, but be careful: the main branch is not always stable. Use it only to verify that the change solves the problem, and otherwise wait for a new release.
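For reference, installing transformers from the main branch as suggested above can be done with pip's standard VCS support (these commands are not quoted from the thread, just the usual invocation):

```shell
# Install transformers from the current main branch (includes unreleased fixes).
pip install --upgrade "git+https://github.com/huggingface/transformers.git"

# Once a fixed version is published, switch back to the latest release:
pip install --upgrade transformers
```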

tomaarsen commented 4 months ago

Excellent! Then I think this is all set. Thanks!