Open baiSongL opened 9 months ago
I changed the code in `smoothquant/calibration.py` `get_act_scales()` to:

```python
dataset = load_dataset(
    'allenai/c4',
    'allenai--c4',
    data_files={'validation': 'en/c4-validation.00000-of-00008.json.gz'},
    split='validation',
)
```

and with that I was able to generate the scales.
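For a fully offline server, a stdlib-only sketch of reading one gzipped JSON-lines shard (the format the C4 validation file above uses) may also help; the tiny sample file written here is a stand-in for the real shard, not actual C4 data:

```python
import gzip
import json
import os
import tempfile

# Stand-in for a local copy of one C4 validation shard (made-up rows).
samples = [{"text": "first calibration sentence"},
           {"text": "second calibration sentence"}]

path = os.path.join(tempfile.mkdtemp(), "c4-validation.00000-of-00008.json.gz")
with gzip.open(path, "wt", encoding="utf-8") as f:
    for row in samples:
        f.write(json.dumps(row) + "\n")

def load_local_jsonl_gz(path):
    """Read a gzipped JSON-lines file into a list of dicts, roughly what
    load_dataset('json', data_files=...) yields per row."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return [json.loads(line) for line in f]

dataset = load_local_jsonl_gz(path)
texts = [row["text"] for row in dataset]
print(texts[0])  # -> first calibration sentence
```

This avoids any Hub access entirely; the calibration loop only needs the list of text samples.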
Inspired by @SSshuishui, I simply ran generate_act_scales.py and generated the scales myself:
```shell
python examples/generate_act_scales.py \
    --model-name ../models/opt/opt-13b \
    --output-path ./opt-13b-scales.pt \
    --num-samples 200 \
    --seq-len 2048 \
    --dataset-path ../datasets/lambada/lambada.py
```

Here I used the LAMBADA dataset.
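For context, here is a minimal plain-Python sketch of the statistic that `get_act_scales()` accumulates over the calibration samples: a per-channel running maximum of absolute activation values. The real implementation uses PyTorch forward hooks on the model's linear layers; the channel count and numbers below are made up for illustration:

```python
# Sketch of the per-channel max-abs-activation statistic that SmoothQuant's
# calibration collects (plain Python; the real code hooks torch modules).

def update_act_scales(scales, activations):
    """Update the running per-channel max of |activation|.

    scales:      list of per-channel running maxima (mutated in place)
    activations: batch of activation rows, one value per channel
    """
    for row in activations:
        for c, value in enumerate(row):
            scales[c] = max(scales[c], abs(value))
    return scales

num_channels = 3
scales = [0.0] * num_channels
batch1 = [[0.5, -2.0, 1.0], [1.5, 0.1, -0.3]]
batch2 = [[-3.0, 0.2, 0.9]]
for batch in (batch1, batch2):
    update_act_scales(scales, batch)
print(scales)  # -> [3.0, 2.0, 1.0]
```

The saved `.pt` file is essentially a dict of these per-channel maxima, one entry per hooked layer.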
But I run SmoothQuant on a server that has no internet access to the Hugging Face Hub, so I needed to modify smoothquant/calibration.py to load the dataset from a local directory; otherwise it would fail. I changed the code in the `get_act_scales()` function of smoothquant/calibration.py to:

```python
dataset = load_dataset(dataset_path, split="train")
```

Then I can load the dataset locally rather than downloading it from Hugging Face.
Note that after modifying the code, you need to run `python setup.py install` again to reinstall the package.
How do I solve this error: `pickle.UnpicklingError: invalid load key, 'v'.`?
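I can't be sure this is your situation, but one common cause of `invalid load key, 'v'` when loading a `.pt` file is that the file is actually a Git LFS pointer, a small text file starting with `version https://git-lfs...`, rather than the real checkpoint (the unpickler chokes on the leading `v`). A stdlib-only check, with a fake pointer file written here for demonstration:

```python
import os
import tempfile

LFS_PREFIX = b"version https://git-lfs"

def looks_like_lfs_pointer(path):
    """Return True if the file starts with the Git LFS pointer header."""
    with open(path, "rb") as f:
        head = f.read(len(LFS_PREFIX))
    return head.startswith(LFS_PREFIX)

# Demo: write a fake LFS pointer where a checkpoint was expected.
path = os.path.join(tempfile.mkdtemp(), "opt-13b-scales.pt")
with open(path, "w") as f:
    f.write("version https://git-lfs.github.com/spec/v1\n")

print(looks_like_lfs_pointer(path))  # -> True
```

If the check returns True, re-fetching the file with `git lfs pull` (or downloading the real binary) should fix the load.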