huggingface / huggingface_hub

The official Python client for the Hugging Face Hub.
https://huggingface.co/docs/huggingface_hub
Apache License 2.0

ModuleNotFoundError: No module named 'huggingface_hub' #2307

Closed hannaDataScientist closed 1 month ago

hannaDataScientist commented 1 month ago

Describe the bug

Hi

I have installed, uninstalled, and reinstalled huggingface_hub numerous times, but I cannot get it to work.

When I run conda list in cmd, huggingface_hub is listed; see the output below.

Below is the output from running the suggested code to check that the installation was successful.

ModuleNotFoundError                       Traceback (most recent call last)
Cell In[1], line 1
----> 1 from huggingface_hub import InferenceClient
      2 client = InferenceClient()
      4 image = client.text_to_image("An astronaut riding a horse on the moon.")

ModuleNotFoundError: No module named 'huggingface_hub'
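When an import fails inside a notebook even though conda list shows the package, the notebook kernel is usually running a different Python than the conda environment. A minimal diagnostic sketch, run in a notebook cell:

```python
import sys

# The interpreter this notebook kernel actually runs. If this path is not
# inside the conda env (e.g. ...\anaconda3\envs\myenv\python.exe), the kernel
# is using a different Python than the one conda installed huggingface_hub into.
print(sys.executable)

# The module search path; the env's site-packages directory must appear here
# for the import to succeed.
print(sys.path)
```

Comparing this path against the environment path shown by conda list pinpoints the mismatch.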

conda list

# packages in environment at C:\Users\AppData\Local\anaconda3\envs\myenv:
#
# Name                    Version         Build            Channel
brotli-python             1.1.0           py311h12c1d0e_1  conda-forge
bzip2                     1.0.8           h2bbff1b_6
ca-certificates           2024.6.2        h56e8100_0       conda-forge
certifi                   2024.2.2        pyhd8ed1ab_0     conda-forge
charset-normalizer        3.3.2           pyhd8ed1ab_0     conda-forge
colorama                  0.4.6           pyhd8ed1ab_0     conda-forge
filelock                  3.14.0          pyhd8ed1ab_0     conda-forge
fsspec                    2024.5.0        pyhff2d567_0     conda-forge
git                       2.45.1          h57928b3_0       conda-forge
huggingface_hub           0.23.2          pyhd8ed1ab_0     conda-forge
idna                      3.7             pyhd8ed1ab_0     conda-forge
intel-openmp              2021.4.0        pypi_0           pypi
jinja2                    3.1.4           pypi_0           pypi
joblib                    1.4.2           pypi_0           pypi
libffi                    3.4.4           hd77b12b_1
markupsafe                2.1.5           pypi_0           pypi
mkl                       2021.4.0        pypi_0           pypi
mpmath                    1.3.0           pypi_0           pypi
networkx                  3.3             pypi_0           pypi
numpy                     1.26.4          pypi_0           pypi
openssl                   3.3.0           h2466b09_3       conda-forge
packaging                 24.0            pyhd8ed1ab_0     conda-forge
pillow                    10.3.0          pypi_0           pypi
pip                       24.0            py311haa95532_0
pysocks                   1.7.1           pyh0701188_6     conda-forge
python                    3.11.7          he1021f5_0
python_abi                3.11            2_cp311          conda-forge
pyyaml                    6.0.1           py311ha68e1ae_1  conda-forge
regex                     2024.5.15       pypi_0           pypi
requests                  2.32.3          pyhd8ed1ab_0     conda-forge
safetensors               0.4.3           pypi_0           pypi
scikit-learn              1.5.0           pypi_0           pypi
scipy                     1.13.1          pypi_0           pypi
sentence-transformers     3.0.0           pypi_0           pypi
setuptools                69.5.1          py311haa95532_0
sqlite                    3.45.3          h2bbff1b_0
sympy                     1.12.1          pypi_0           pypi
tbb                       2021.12.0       pypi_0           pypi
threadpoolctl             3.5.0           pypi_0           pypi
tk                        8.6.14          h0416ee5_0
tokenizers                0.19.1          pypi_0           pypi
torch                     2.3.0           pypi_0           pypi
tqdm                      4.66.4          pyhd8ed1ab_0     conda-forge
transformers              4.41.2          pypi_0           pypi
typing-extensions         4.12.1          hd8ed1ab_0       conda-forge
typing_extensions         4.12.1          pyha770c72_0     conda-forge
tzdata                    2024a           h04d1e81_0
ucrt                      10.0.22621.0    h57928b3_0       conda-forge
urllib3                   2.2.1           pyhd8ed1ab_0     conda-forge
vc                        14.2            h2eaa2aa_1
vc14_runtime              14.38.33135     h835141b_20      conda-forge
vs2015_runtime            14.38.33135     h22015db_20      conda-forge
wheel                     0.43.0          py311haa95532_0
win_inet_pton             1.1.0           pyhd8ed1ab_6     conda-forge
xz                        5.4.6           h8cc25b3_1
yaml                      0.2.5           h8ffe710_2       conda-forge
zlib                      1.2.13          h8cc25b3_1

python -c "from huggingface_hub import model_info; print(model_info('gpt2'))"

ModelInfo(id='openai-community/gpt2', author='openai-community', sha='607a30d783dfa663caf39e06633721c8d4cfcd7e', created_at=datetime.datetime(2022, 3, 2, 23, 29, 4, tzinfo=datetime.timezone.utc), last_modified=datetime.datetime(2024, 2, 19, 10, 57, 45, tzinfo=datetime.timezone.utc), private=False, gated=False, disabled=False, downloads=10803905, likes=1974, library_name='transformers', tags=['transformers', 'pytorch', 'tf', 'jax', 'tflite', 'rust', 'onnx', 'safetensors', 'gpt2', 'text-generation', 'exbert', 'en', 'doi:10.57967/hf/0039', 'license:mit', 'autotrain_compatible', 'endpoints_compatible', 'text-generation-inference', 'region:us'], pipeline_tag='text-generation', mask_token=None, card_data={'language': 'en', 'license': 'mit', 'library_name': None, 'tags': ['exbert'], 'base_model': None, 'datasets': None, 'metrics': None, 'eval_results': None, 'model_name': None}, widget_data=[{'text': 'My name is Julien and I like to'}, {'text': 'My name is Thomas and my main'}, {'text': 'My name is Mariama, my favorite'}, {'text': 'My name is Clara and I am'}, {'text': 'My name is Lewis and I like to'}, {'text': 'My name is Merve and my favorite'}, {'text': 'My name is Teven and I am'}, {'text': 'Once upon a time,'}], model_index=None, config={'architectures': ['GPT2LMHeadModel'], 'model_type': 'gpt2', 'tokenizer_config': {}}, transformers_info=TransformersInfo(auto_model='AutoModelForCausalLM', custom_class=None, pipeline_tag='text-generation', processor='AutoTokenizer'), siblings=[RepoSibling(rfilename='.gitattributes', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='64-8bits.tflite', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='64-fp16.tflite', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='64.tflite', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='README.md', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='config.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='flax_model.msgpack', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='generation_config.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='merges.txt', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='model.safetensors', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/config.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/decoder_model.onnx', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/decoder_model_merged.onnx', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/decoder_with_past_model.onnx', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/generation_config.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/merges.txt', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/special_tokens_map.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/tokenizer.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/tokenizer_config.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='onnx/vocab.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='pytorch_model.bin', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='rust_model.ot', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='tf_model.h5', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='tokenizer.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='tokenizer_config.json', size=None, blob_id=None, lfs=None), RepoSibling(rfilename='vocab.json', size=None, blob_id=None, lfs=None)], spaces=['open-llm-leaderboard/open_llm_leaderboard', 'microsoft/HuggingGPT', 'Gustavosta/MagicPrompt-Stable-Diffusion', 'shi-labs/Versatile-Diffusion', 'optimum/llm-perf-leaderboard', 'h2oai/h2ogpt-chatbot', 'microsoft/Promptist', 'yizhangliu/Grounded-Segment-Anything', 'aliabid94/AutoGPT', 'wangrongsheng/ChatPaper', 'h2oai/h2ogpt-chatbot2', 'OFA-Sys/OFA-Image_Caption', 'Manmay/tortoise-tts', 'ShiwenNi/ChatReviewer', 'OpenMotionLab/MotionGPT', 'Intel/low_bit_open_llm_leaderboard', 'm-ric/beam_search_visualizer', 'exbert-project/exbert', 'flax-community/image-captioning', 'doevent/Stable-Diffusion-prompt-generator', 'eduagarcia/open_pt_llm_leaderboard', 'treadon/prompt-fungineer-355M', 'SeaLLMs/SeaLLM-7B', 'nateraw/lavila', 'yizhangliu/Text-to-Image', 'BAAI/open_cn_llm_leaderboard', 'gsaivinay/open_llm_leaderboard', 'deepklarity/poster2plot', 'EleutherAI/magma', 'akhaliq/CLIP_prefix_captioning', 'FrankZxShen/so-vits-svc-models-ba', 'OFA-Sys/OFA-Visual_Grounding', 'maxmax20160403/sovits5.0', 'OFA-Sys/OFA-vqa', 'Gustavosta/MagicPrompt-Dalle', 'phenomenon1981/MagicPrompt-Stable-Diffusion', 'OFA-Sys/OFA-Generic_Interface', 'johko/capdec-image-captioning', 'hkunlp/Binder', 'aubmindlab/Arabic-NLP', 'bipin/image2story', 'ShiwenNi/ChatResponse', 'LilyF/Generate_Text_and_Audio', 'Omnibus/Chatbot-Compare', 'TMElyralab/MuseTalk', 'society-ethics/model-card-regulatory-check', 'Catmeow/AI_story_writing', 'hahahafofo/image2text_prompt_generator', 'ICML2022/OFA', 'thirdai/BOLT2.5B', 'mshukor/UnIVAL', 'sohaibcs1/Image-to-Text-Summary', 'aliabid94/GPT-Golf', 'Hello-SimpleAI/chatgpt-detector-ling', 'llizhx/TinyGPT-V', 'SeaLLMs/SeaLLM-7B-v2.5', 'FrankZxShen/so-vits-svc-models-pcr', 'architext/Architext_deployed', 'kmacdermid/RpgRoomGenerator', 'SeViLA/SeViLA', 'Dagfinn1962/stablediffusion-models', 'RitaParadaRamos/SmallCapDemo', 'AnimaLab/bias-test-gpt-pairs', 'stanfordnlp/Backpack-Demo', 'sasha/BiasDetection', 'gsarti/pecore', 'sasha/WinoBiasCheck', 'GTBench/GTBench', 'ccolas/TastyPiano', 'BoomerangGirl/MagicPrompt-Stable-Diffusion', 'dromerosm/gpt-info-extraction', 'hahahafofo/prompt_generator', 'liyucheng/selective_context', 'zeno-ml/chatbot-report', 'lfoppiano/document-qa', 'Hellisotherpeople/HF-SHAP', 'ethzanalytics/gpt2-xl-conversational', 'taesiri/HuggingGPT-Lite', 'taka-yamakoshi/tokenizer-demo', 'autonomous019/image_story_generator', 'shangdatalab-ucsd/LDB', 'Guinnessgshep/AI_story_writing', 'SeaLLMs/SeaLLM-7B-v2', 'gagan3012/ViTGPT2', 'luis112/text-generation-webui', 'abdullahmeda/detect-ai-text', 'yhavinga/dutch-tokenizer-arena', 'BigSalmon/InformalToFormal', 'xzuyn/Token-Count-Comparison', 'alistairmcleay/cambridge-masters-project', 'codeparrot/apps_metric', 'Catmeow/Text_Generation_Fine_Tune', 'j43fer/MagicPrompt-Stable-Diffusion', 'Daniton/MagicPrompt-Stable-Diffusion', 'ashhadahsan/ai-book-generator', 'Chakshu123/sketch-colorization-with-hint', 'koajoel/PolyFormer', 'felixz/open_llm_leaderboard', 'ehristoforu/Rensor', 'Kaludi/Stable-Diffusion-Prompt-Generator_App'], safetensors=SafeTensorsInfo(parameters={'F32': 137022720}, total=137022720))
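Since the python -c check above succeeds in cmd while the notebook import fails, the two are almost certainly resolving modules through different interpreters. A small sketch to confirm where (or whether) the package resolves in a given interpreter:

```python
import importlib.util

# Ask the current interpreter's import machinery where it would load
# huggingface_hub from, without actually importing it.
spec = importlib.util.find_spec("huggingface_hub")
if spec is None:
    # This interpreter cannot see the package at all.
    print("huggingface_hub is not importable from this interpreter")
else:
    # Shows which site-packages directory the import would come from.
    print("huggingface_hub found at:", spec.origin)
```

Running this in both the cmd Python and a notebook cell, and comparing the results, confirms the mismatch.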

Reproduction

No response

Logs

No response

System info

- huggingface_hub version: 0.23.2
- Platform: Windows-10-10.0.22631-SP0
- Python version: 3.11.7
- Running in iPython ?: No
- Running in notebook ?: No
- Running in Google Colab ?: No
- Token path ?: C:\Users\hanna\.cache\huggingface\token
- Has saved token ?: True
- Who am I ?: HannaDataScientist
- Configured git credential helpers:
- FastAI: N/A
- Tensorflow: N/A
- Torch: 2.3.0
- Jinja2: 3.1.4
- Graphviz: N/A
- keras: N/A
- Pydot: N/A
- Pillow: 10.3.0
- hf_transfer: N/A
- gradio: N/A
- tensorboard: N/A
- numpy: 1.26.4
- pydantic: N/A
- aiohttp: N/A
- ENDPOINT: https://huggingface.co
- HF_HUB_CACHE: C:\Users\hanna\.cache\huggingface\hub
- HF_ASSETS_CACHE: C:\Users\hanna\.cache\huggingface\assets
- HF_TOKEN_PATH: C:\Users\hanna\.cache\huggingface\token
- HF_HUB_OFFLINE: False
- HF_HUB_DISABLE_TELEMETRY: False
- HF_HUB_DISABLE_PROGRESS_BARS: None
- HF_HUB_DISABLE_SYMLINKS_WARNING: False
- HF_HUB_DISABLE_EXPERIMENTAL_WARNING: False
- HF_HUB_DISABLE_IMPLICIT_TOKEN: False
- HF_HUB_ENABLE_HF_TRANSFER: False
- HF_HUB_ETAG_TIMEOUT: 10
- HF_HUB_DOWNLOAD_TIMEOUT: 10
hannaDataScientist commented 1 month ago


I finally managed to resolve this by running the following directly in a Jupyter notebook cell:

import sys
!{sys.executable} -m pip install huggingface_hub

This installs the package into the same interpreter the notebook kernel is running, rather than whichever pip happens to be first on the PATH.
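An alternative, longer-term fix is to register the conda environment as a named Jupyter kernel, so notebooks always use that environment's interpreter. A sketch, assuming the environment is called myenv (as in the conda list output above) and ipykernel is not yet installed:

```shell
# Run inside the activated conda environment (myenv).
pip install ipykernel
python -m ipykernel install --user --name myenv --display-name "Python (myenv)"
```

After restarting Jupyter, selecting the "Python (myenv)" kernel makes imports resolve against the environment where huggingface_hub is installed.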