qnguyen3 / chat-with-mlx

An all-in-one LLMs Chat UI for Apple Silicon Mac using MLX Framework.

`ModuleNotFoundError` running chat-with-mlx, `huggingface_hub.utils._errors` not found #247

Open · bdruth opened this issue 2 weeks ago

bdruth commented 2 weeks ago

Whether I install with pip or with Conda, I get the same error:

❯ chat-with-mlx
Traceback (most recent call last):
  File "/Users/bdruth/radioconda/envs/mlx-chat/bin/chat-with-mlx", line 5, in <module>
    from chat_with_mlx.app import main
  File "/Users/bdruth/Projects/chat-with-mlx/chat_with_mlx/app.py", line 10, in <module>
    from mlx_lm import load, stream_generate, generate
  File "/Users/bdruth/radioconda/envs/mlx-chat/lib/python3.11/site-packages/mlx_lm/__init__.py", line 3, in <module>
    from .utils import convert, generate, load, stream_generate
  File "/Users/bdruth/radioconda/envs/mlx-chat/lib/python3.11/site-packages/mlx_lm/utils.py", line 17, in <module>
    from huggingface_hub.utils._errors import RepositoryNotFoundError
ModuleNotFoundError: No module named 'huggingface_hub.utils._errors'

I'm not well-versed in Conda, but manually running `conda install -c conda-forge huggingface_hub` completes successfully and doesn't help, either. Same error.
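
Digging a bit: the failing import is a private path. If I'm reading things right, huggingface_hub 0.25 removed the `huggingface_hub.utils._errors` module and moved those exceptions to the public `huggingface_hub.errors` module, so any mlx-lm release that still imports the private path breaks against 0.25.x. A quick check from inside the affected environment (the last import is my assumption about the new public location; the version shown is from my setup):

```
❯ python -c "import huggingface_hub; print(huggingface_hub.__version__)"
0.25.2
❯ python -c "from huggingface_hub.utils._errors import RepositoryNotFoundError"
Traceback (most recent call last):
  ...
ModuleNotFoundError: No module named 'huggingface_hub.utils._errors'
❯ python -c "from huggingface_hub.errors import RepositoryNotFoundError; print('ok')"
ok
```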

❯ python --version
Python 3.11.10
❯ python -c "import sys; print(list(filter(lambda s: s.find('packages') > -1, sys.path)))"
['/Users/bruth/radioconda/envs/mlx-chat/lib/python3.11/site-packages']
❯ conda list | grep hugging
huggingface_hub           0.25.2             pyh0610db2_0    conda-forge
❯ pip list | grep hugging
huggingface_hub                          0.25.2

FWIW, on macOS Sequoia (15.0.1) with the default Python 3.10.13, a plain `pip install chat-with-mlx` (easy mode) installs successfully but fails at runtime in the same way as above.
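
In case it helps anyone who can't (or doesn't want to) touch mlx-lm: pinning huggingface_hub back below 0.25 in the same environment should also restore the old private module. This is a sketch of a workaround, not something I've tested end to end:

```
# downgrade huggingface_hub so the old private module is back (workaround, not a real fix)
❯ pip install "huggingface_hub<0.25"
# re-run the failing import; it should succeed on 0.24.x
❯ python -c "from huggingface_hub.utils._errors import RepositoryNotFoundError; print('ok')"
```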

Cheers! Happy to help with any further debugging.

bdruth commented 2 weeks ago

https://github.com/ml-explore/mlx-examples/issues/994 suggests running `pip install -U mlx-lm`. It finishes with an ominous-looking warning/error at the end, but it does push mlx-lm to version 0.19.0, and after that `chat-with-mlx` starts successfully.
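
For the record, this is roughly the sequence that got it launching for me (nothing here is specific to my machine beyond the version the upgrade happened to land on):

```
❯ pip install -U mlx-lm
❯ pip show mlx-lm | grep -i '^version'
Version: 0.19.0
❯ chat-with-mlx
```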

bdruth commented 2 weeks ago

Even with this, I wasn't able to get anything working: I hit a connection error when trying to download any model, and the console errors pointed at something with pydantic. I saw this even after downgrading to `huggingface-hub==0.24.7` alongside mlx-lm.
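
If anyone else hits the same wall, these are the generic checks I'd try next. Nothing here is chat-with-mlx-specific, and the gradio mention is just my guess at where the pydantic errors come from (I believe the UI pulls it in):

```
# is the Hub reachable at all from this machine / network?
❯ curl -I https://huggingface.co
# which pydantic (and gradio) versions ended up installed alongside chat-with-mlx?
❯ pip list | grep -iE 'pydantic|gradio'
```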

kriebe commented 2 weeks ago

Had the same issue. I'm on an M1 Mac with Sonoma 14.5. I originally tried it in Pinokio and got this error, so I figured I'd try a manual install and got the same result.