jankais3r / LLaMA_MPS

Run LLaMA (and Stanford-Alpaca) inference on Apple Silicon GPUs.
GNU General Public License v3.0

AssertionError: ./models/tokenizer.model #3

Closed MZeydabadi closed 1 year ago

MZeydabadi commented 1 year ago

Running the command

```
python3 chat.py --ckpt_dir ./models/7B --tokenizer_path ./models/tokenizer.model --max_batch_size=8 --max_seq_len=512
```

I get this error:

```
  File "/home/LLaMA_MPS/chat.py", line 106, in main
    generator = load(ckpt_dir, tokenizer_path, max_seq_len, max_batch_size)
  File "/home/LLaMA_MPS/chat.py", line 80, in load
    tokenizer = Tokenizer(model_path=tokenizer_path)
  File "/home/LLaMA_MPS/llama/tokenizer.py", line 16, in __init__
    assert os.path.isfile(model_path), model_path
AssertionError: ./models/tokenizer.model
```

I checked, and there was no tokenizer.model under /models.

jankais3r commented 1 year ago

Hi, that's a file that comes from the torrent/Facebook download package together with the model weights.
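For anyone hitting the same assertion: a minimal pre-flight sketch like the one below (not part of this repo) can confirm the expected files are in place before launching `chat.py`. The weight filenames `consolidated.00.pth` and `params.json` are assumed here based on the standard LLaMA 7B download layout; adjust them if your package differs.

```python
import os

# Hypothetical list of files chat.py needs, mirroring the paths passed
# on the command line above. Weight filenames are assumptions based on
# the standard LLaMA download layout.
REQUIRED = [
    "./models/tokenizer.model",
    "./models/7B/consolidated.00.pth",
    "./models/7B/params.json",
]

def missing_files(paths):
    """Return the subset of paths that do not exist as regular files."""
    return [p for p in paths if not os.path.isfile(p)]

if __name__ == "__main__":
    missing = missing_files(REQUIRED)
    if missing:
        print("Missing files:", ", ".join(missing))
    else:
        print("All model files present.")
```

If `tokenizer.model` shows up as missing, copy it from the download package into `./models/` next to the `7B` directory and re-run the command.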