turboderp / exllamav2

A fast inference library for running LLMs locally on modern consumer-class GPUs
MIT License

KeyError: 'measurement' #389

Closed Katehuuh closed 6 months ago

Katehuuh commented 6 months ago

When reusing an existing measurement.json, running python convert.py -i C:\Model-7B -o C:\temp-8.0bpw-h8-exl2 -cf C:\Model-7B-8.0bpw-h8-exl2 --bits 8.0 -hb 8 --length 4096 --measurement_length 4096 --cal_dataset C:\pippa.parquet -m C:\measurement.json fails with:

 !! Warning: calibration rows > 2048 tokens may result in excessive VRAM use
 -- Beginning new job
Traceback (most recent call last):
  File "C:\exllamav2\convert.py", line 119, in <module>
    job["measurement"] = imp_measurement["measurement"]
KeyError: 'measurement'
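
For context, the failing line indexes the loaded JSON directly, so a measurement file that parses but has no top-level "measurement" key (for example one containing just {}) reproduces this error; a zero-byte file would instead fail earlier inside json.load. A minimal sketch of that lookup, reusing the path from the command above:

```python
import json

# Simplified reproduction of the statement at convert.py line 119: the loaded
# JSON is indexed directly, so a file containing only "{}" raises
# KeyError: 'measurement'.
with open(r"C:\measurement.json", "r", encoding="utf-8") as f:
    imp_measurement = json.load(f)

job = {"measurement": imp_measurement["measurement"]}  # KeyError if the key is absent
```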
turboderp commented 6 months ago

Can you share that measurement.json file? It seems to be corrupt somehow.

Katehuuh commented 6 months ago

> Can you share that measurement.json file? It seems to be corrupt somehow.

It's empty. Sorry, I missed the documented -om argument.
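
A simple guard avoids this kind of round trip: verify the file before passing it to -m. The sketch below is not part of the repository; it assumes only what the traceback shows (convert.py expects a top-level "measurement" key) and that a usable measurement file comes from a prior run saved with -om, as noted above.

```python
import json
import sys

def check_measurement(path: str) -> None:
    """Sanity-check a measurement file before reusing it with convert.py -m."""
    with open(path, "r", encoding="utf-8") as f:
        data = json.load(f)  # raises JSONDecodeError if the file is empty or not JSON
    if "measurement" not in data:
        sys.exit(f"{path} has no 'measurement' key; regenerate it with -om first")
    print(f"{path} looks usable")

if __name__ == "__main__":
    check_measurement(sys.argv[1])
```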