AtomGradient / phi2-mps

phi2 slm command line tools with apple mps support
MIT License

How to fix this error #1

Open SHIMURA0 opened 4 months ago

SHIMURA0 commented 4 months ago

```
╰─(phi2-mps) ○ phi2-mps --model "/Users/shimura/Documents/weights.npz" --prompt "Hi how are you" --max_tokens 256
Traceback (most recent call last):
  File "/Users/shimura/miniforge3/envs/phi2-mps/bin/phi2-mps", line 8, in <module>
    sys.exit(main())
             ^^^^^^
  File "/Users/shimura/miniforge3/envs/phi2-mps/lib/python3.11/site-packages/phi2_mps/cli.py", line 38, in main
    model, tokenizer = phi2.load_model(args.model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/shimura/miniforge3/envs/phi2-mps/lib/python3.11/site-packages/phi2_mps/phi2.py", line 157, in load_model
    weights = mx.load(modelPath)
              ^^^^^^^^^^^^^^^^^^
ValueError: [load] Input must be a file-like object opened in binary mode, or string
```
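The `ValueError` comes from `mx.load`, which (per its own message) accepts only a string path or a file-like object opened in binary mode. A minimal, hypothetical reproduction of that type check is sketched below — `mx_load_check` is an illustrative stand-in, not MLX's actual implementation — showing why passing, say, a `pathlib.Path` instead of a plain string would trip the same error:

```python
from pathlib import Path


def mx_load_check(path):
    """Hypothetical sketch of the type check implied by the error message:
    accept a string path or a binary-mode file-like object, reject the rest.
    (Not MLX's real code; for illustration only.)"""
    if isinstance(path, str):
        return f"loading {path}"
    if hasattr(path, "read") and "b" in getattr(path, "mode", ""):
        return "loading from binary file object"
    raise ValueError(
        "[load] Input must be a file-like object opened in binary mode, or string"
    )


model_path = Path("/Users/shimura/Documents/weights.npz")
try:
    mx_load_check(model_path)  # a Path object trips the check
except ValueError as e:
    print(e)
print(mx_load_check(str(model_path)))  # a plain string passes it
```

If the real cause matches this sketch, casting the model path to `str` before calling `mx.load` would be one possible workaround, though the maintainer's suggestion below is the recommended route.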

shuhongwu commented 4 months ago

Apple's MLX team has changed how LLM models are loaded. I strongly recommend running it with the method described here instead: https://github.com/ml-explore/mlx-examples/blob/main/llms/README.md

Here is the sample code:

```python
from mlx_lm import load, generate

model, tokenizer = load("microsoft/phi-2")

response = generate(model, tokenizer, prompt="hello", verbose=True)

print(response)
```

phi2-mps will support it later~