Closed · gh640 closed this 2 hours ago
You're not missing anything. It hasn't been released to PyPI yet. If you want to use it ASAP, you can build from source. I'll plan to get a release out shortly. FYI, you'll need version 0.19.1 to have the `mlx_lm.chat` command.
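If it helps anyone checking their install, here is a minimal sketch (standard library only; the helper name and the naive version parsing are mine, and the 0.19.1 threshold is the one cited above) for verifying whether the installed `mlx-lm` is new enough:

```python
from importlib.metadata import version, PackageNotFoundError

def at_least_0_19_1(installed: str) -> bool:
    """Naive dotted-version check against the 0.19.1 threshold.
    Illustrative only: it does not handle pre-release suffixes like '0.19.1rc1'."""
    parts = tuple(int(p) for p in installed.split(".")[:3])
    return parts >= (0, 19, 1)

try:
    v = version("mlx-lm")
    print(f"mlx-lm {v} installed; should have mlx_lm.chat: {at_least_0_19_1(v)}")
except PackageNotFoundError:
    print("mlx-lm is not installed in this environment")
```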
@awni
Thank you for your kind comment.
I'll try building from source and playing around with it for now, and I'll switch over once version 0.19.1 is released! Thanks!
I was able to use `mlx_lm.chat` after building the package from source from the main branch. Thanks!
The README in llms/ says as follows:

but the command `mlx_lm.chat` is not available after I installed `mlx-lm` with uv.

Steps to reproduce:
I confirmed that the other commands such as `mlx_lm.generate` work well. It looks like `mlx_lm.chat` is in the `entry_points` in `setup.py`:

https://github.com/ml-explore/mlx-examples/blob/fca087be4906332ccc086013e96da93981e57037/llms/setup.py#L32-L44

Am I missing something? Or has it not yet been included in the latest version?
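One way to check locally which console scripts the installed release actually registers (a sketch using only the standard library; the helper name is mine):

```python
from importlib.metadata import distribution, PackageNotFoundError

def console_scripts(package: str):
    """List the console_scripts entry point names registered for an installed
    package; returns an empty list if the package is not installed."""
    try:
        dist = distribution(package)
    except PackageNotFoundError:
        return []
    return sorted(ep.name for ep in dist.entry_points
                  if ep.group == "console_scripts")

# If "mlx_lm.chat" is missing here while "mlx_lm.generate" appears, the
# installed release simply predates that entry point in setup.py.
print(console_scripts("mlx-lm"))
```

This inspects the installed distribution's metadata, so it reflects what was actually shipped in the release you installed, not what is in the repo's `setup.py`.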
I use the following versions.
My environment:
If there's any information to add here, please let me know. Thanks in advance.