ml-explore / mlx-examples

Examples in the MLX framework

`mlx_lm.chat` command is not available after installing `mlx-lm` #1018

Closed · gh640 closed this 2 hours ago

gh640 commented 2 hours ago

The README in `llms/` says the following:

[Screenshot of the `llms/` README showing the `mlx_lm.chat` command]

but the `mlx_lm.chat` command is not available after installing `mlx-lm` with uv.

Steps to reproduce:

```sh
uv add mlx-lm
uv run mlx_lm.chat
```

I confirmed that other commands, such as `mlx_lm.generate`, work fine:

```sh
uv run mlx_lm.generate --model 'mlx-community/Llama-3.2-1B-Instruct-8bit' --prompt 'Hello'
```

It looks like `mlx_lm.chat` is included in the `entry_points` in `setup.py`:

https://github.com/ml-explore/mlx-examples/blob/fca087be4906332ccc086013e96da93981e57037/llms/setup.py#L32-L44
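
For reference, the console scripts appear to be registered roughly like this (a paraphrased sketch of the linked lines, not a verbatim copy; the exact, complete list is in the file itself):

```python
# Paraphrased sketch of the console-script registration in llms/setup.py;
# see the linked lines for the exact, complete list of commands.
from setuptools import setup

setup(
    # ...
    entry_points={
        "console_scripts": [
            "mlx_lm.chat = mlx_lm.chat:main",
            "mlx_lm.generate = mlx_lm.generate:main",
            # ... other mlx_lm.* commands
        ]
    },
)
```

So installing the package should create the `mlx_lm.chat` console script.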

Am I missing something? Or has it not been included in the latest release yet?

I'm using the following versions:

```
mlx                0.18.0
mlx-lm             0.19.0
```

My environment:

```sh
$ sw_vers
ProductName:        macOS
ProductVersion:     15.0.1
BuildVersion:       24A348
```

If there's any other information I can add here, please let me know. Thanks in advance.

awni commented 2 hours ago

You're not missing anything. It hasn't been released to PyPI yet. If you want to use it ASAP, you can build from source. I plan to get a release out shortly. FYI, it should be version 0.19.1 that includes the `mlx_lm.chat` command.
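
Something along these lines should work (a sketch assuming the package lives under `llms/` in this repo, as in the `setup.py` linked above):

```sh
# Build and install mlx-lm from source; the llms/ directory contains
# the package's setup.py in the mlx-examples repo.
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms
pip install -e .

# The console script should then be on your PATH:
mlx_lm.chat --help
```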

gh640 commented 2 hours ago

@awni

Thank you for your kind comment.

I'll build from source and play around with it for now, and I'll upgrade once version 0.19.1 is released. Thanks!

gh640 commented 1 hour ago

I was able to use `mlx_lm.chat` after building the package from source on the main branch. Thanks!