OpenGVLab / OmniQuant

[ICLR2024 spotlight] OmniQuant is a simple and powerful quantization technique for LLMs.
MIT License

Checksums didn't match for dataset source files #65

Closed hsb1995 closed 2 months ago

hsb1995 commented 4 months ago

Can the .zst file be provided?

The dataset source files fail the checksum match right at the start of the download (screenshot attached).

hsb1995 commented 4 months ago

Your work is impressive and I would like to reproduce it, but the following issues have come up. Could you please help?

hsb1995 commented 4 months ago

I searched for this issue on Google; the answer there was that the version of the `datasets` package is too high (2.18.0). I tried many approaches with the version you specify in the installation instructions (`datasets==2.0.0`), but was unsuccessful. Could you please take a look? (Screenshot attached.)
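
For anyone hitting the same `NonMatchingChecksumError` on a newer `datasets` release, a commonly suggested workaround is to skip the stale checksum verification instead of downgrading. The 2.9 cutoff and the keyword names below are my assumptions based on the `datasets` changelog (where `verification_mode` replaced `ignore_verifications`), not something taken from the OmniQuant repo; check them against your installed release.

```python
# Sketch of a workaround, assuming the checksum error comes from stale
# checksum metadata: choose the "skip verification" keyword that matches
# the installed `datasets` release.
def skip_checksum_kwargs(datasets_version: str) -> dict:
    """Return load_dataset kwargs that disable checksum verification."""
    major, minor = (int(part) for part in datasets_version.split(".")[:2])
    if (major, minor) >= (2, 9):
        # Newer API: verification_mode="no_checks"
        return {"verification_mode": "no_checks"}
    # Older API: ignore_verifications=True
    return {"ignore_verifications": True}
```

With that helper, the calibration data in `datautils` could be loaded as `load_dataset('wikitext', 'wikitext-2-raw-v1', split='train', **skip_checksum_kwargs(datasets.__version__))`.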

hsb1995 commented 4 months ago

Another suggestion I found on Google: run `pip install git+https://github.com/huggingface/datasets.git`, then set `dataset = load_dataset("multi_news", download_mode="force-redownload")`. I changed the `datautils` file accordingly, using `traindata = load_dataset(path='wikitext', name='wikitext-2-raw-v1', split='train', download_mode="force-redownload")` and `testdata = load_dataset(path='wikitext', name='wikitext-2-raw-v1', split='test', download_mode="force-redownload")`, but the error persists (screenshots attached).

I have tried many approaches but still haven't solved it. Could you please take some time to look into it?

hsb1995 commented 4 months ago

Oh my goodness, I finally solved that bug. Thank you to the author. I downloaded the dataset and loaded it locally. However, a new error appears (screenshots attached).
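
Loading the locally downloaded copy can also be done without going through the hub at all, since `datautils` only needs the raw text before tokenization. A minimal sketch, assuming the usual wikitext-2 raw archive file names (`wiki.train.raw`, `wiki.test.raw`) under a local directory; adjust the names to whatever your download actually contains.

```python
from pathlib import Path

def load_wikitext2_raw(data_dir: str):
    """Read a local wikitext-2 raw download and return (train, test) text.

    The file names are an assumption based on the typical
    wikitext-2-raw archive layout; the returned strings are what you
    would then hand to the tokenizer in datautils.
    """
    root = Path(data_dir)
    train_text = (root / "wiki.train.raw").read_text(encoding="utf-8")
    test_text = (root / "wiki.test.raw").read_text(encoding="utf-8")
    return train_text, test_text
```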

hsb1995 commented 4 months ago

Dear author, I have recently been researching quantization of large-scale pre-trained models and have seen the excellent work you have done. I would like to reproduce your code, but this error occurred along the way; please help me resolve it.

    [2024-03-13 21:49:20 root](omniquant.py 50): INFO Starting ...
    [2024-03-13 21:49:22 root](omniquant.py 193): INFO === Start quantize layer 0 ===
    Traceback (most recent call last):
      File "/home/sam/Doctorproject/OmniQuant-main/main.py", line 382, in <module>
        main()
      File "/home/sam/Doctorproject/OmniQuant-main/main.py", line 352, in main
        omniquant(
      File "/home/sam/Doctorproject/OmniQuant-main/quantize/omniquant.py", line 213, in omniquant
        fp_inps[j] = qlayer(fp_inps[j].unsqueeze(0), attention_mask=attention_mask, position_ids=position_ids)[0]
      File "/home/sam/anaconda3/envs/omniquant/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
        return forward_call(*args, **kwargs)
      File "/home/sam/Doctorproject/OmniQuant-main/models/int_llama_layer.py", line 241, in forward
        hidden_states, self_attn_weights, present_key_value = self.self_attn(
      File "/home/sam/anaconda3/envs/omniquant/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
        return forward_call(*args, **kwargs)
      File "/home/sam/Doctorproject/OmniQuant-main/models/int_llama_layer.py", line 124, in forward
        cos, sin = self.rotary_emb(value_states, seq_len=kv_seq_len)
      File "/home/sam/anaconda3/envs/omniquant/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
        return forward_call(*args, **kwargs)
      File "/home/sam/anaconda3/envs/omniquant/lib/python3.10/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
        return func(*args, **kwargs)
    TypeError: LlamaRotaryEmbedding.forward() missing 1 required positional argument: 'position_ids'

Process finished with exit code 1

hsb1995 commented 4 months ago

It is done. The `transformers` version was too high; please install the version pinned in the author's installation instructions (screenshot attached).
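
For future runs, this class of failure (an installed package newer than what the code was written against) can be caught before quantization starts. A minimal pre-flight sketch; the pinned versions in `PINNED` are placeholders of my own, so copy the real values from the repository's installation instructions or requirements file.

```python
from importlib.metadata import version as installed_version

# Hypothetical pins -- replace with the exact versions from
# OmniQuant's installation instructions / requirements file.
PINNED = {"transformers": "4.36.0", "datasets": "2.0.0"}

def find_version_mismatches(pins, get_version=installed_version):
    """Return (package, pinned, installed) triples that do not match."""
    mismatches = []
    for package, pinned in pins.items():
        found = get_version(package)
        if found != pinned:
            mismatches.append((package, pinned, found))
    return mismatches
```

Calling `find_version_mismatches(PINNED)` before `omniquant(...)` would have flagged the overly new `transformers` immediately instead of failing inside `LlamaRotaryEmbedding.forward()`.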