FlagOpen / FlagEmbedding

Retrieval and Retrieval-augmented LLMs

SafetensorError: Error while deserializing header: HeaderTooSmall #520

Open IcyFeather233 opened 8 months ago

IcyFeather233 commented 8 months ago

Environment

Python: 3.9
OS: Ubuntu 20.04
FlagEmbedding: 1.2.5
transformers: 4.33.1

Details

My test Python file bge-test.py:

from FlagEmbedding import BGEM3FlagModel

model = BGEM3FlagModel('models/bge-m3',  
                       use_fp16=True) # Setting use_fp16 to True speeds up computation with a slight performance degradation

sentences_1 = ["What is BGE M3?", "Defination of BM25"]
sentences_2 = ["BGE M3 is an embedding model supporting dense retrieval, lexical matching and multi-vector interaction.", 
               "BM25 is a bag-of-words retrieval function that ranks a set of documents based on the query terms appearing in each document"]

embeddings_1 = model.encode(sentences_1, 
                            batch_size=12, 
                            max_length=8192, # If you don't need such a long length, you can set a smaller value to speed up the encoding process.
                            )['dense_vecs']
embeddings_2 = model.encode(sentences_2)['dense_vecs']
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
# [[0.6265, 0.3477], [0.3499, 0.678 ]]

Running python bge-test.py produces the following output:

/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/transformers/utils/generic.py:311: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/transformers/utils/generic.py:311: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
Traceback (most recent call last):
  File "/xxx/workspace/OPO/bge-test.py", line 3, in <module>
    model = BGEM3FlagModel('models/bge-m3',  
  File "/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/FlagEmbedding/bge_m3.py", line 36, in __init__
    self.model = BGEM3ForInference(
  File "/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/FlagEmbedding/BGE_M3/modeling.py", line 40, in __init__
    self.load_model(model_name, colbert_dim=colbert_dim)
  File "/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/FlagEmbedding/BGE_M3/modeling.py", line 76, in load_model
    self.model = AutoModel.from_pretrained(model_name)
  File "/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 563, in from_pretrained
    return model_class.from_pretrained(
  File "/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/transformers/modeling_utils.py", line 2883, in from_pretrained
    state_dict = load_state_dict(resolved_archive_file)
  File "/xxx/env_root/anaconda3/envs/OPO_env/lib/python3.9/site-packages/transformers/modeling_utils.py", line 467, in load_state_dict
    with safe_open(checkpoint_file, framework="pt") as f:
safetensors_rust.SafetensorError: Error while deserializing header: HeaderTooSmall
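
For reference, the failing safe_open call can be reproduced directly on the local checkpoint to confirm that the header itself is unreadable. This is only a diagnostic sketch; the file name model.safetensors inside models/bge-m3 is an assumption about the local layout.

from safetensors import safe_open

# Try to parse the safetensors header on its own. If this also raises
# HeaderTooSmall, the checkpoint file itself is truncated or corrupted
# (for example an incomplete download or a git-lfs pointer file),
# rather than a problem in FlagEmbedding.
with safe_open("models/bge-m3/model.safetensors", framework="pt") as f:  # path is an assumption
    print(list(f.keys())[:5])  # a few tensor names, if the header parses
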
staoxiao commented 8 months ago

A possible cause is an outdated version of transformers. You can try upgrading transformers and running the script again.
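
For example, a minimal follow-up check, assuming the same conda environment as above (the pip command in the comment is the generic upgrade path; no specific minimum version is implied by this thread):

# In the shell first:  pip install -U transformers
import transformers

# The report above used transformers 4.33.1; confirm the environment
# actually picked up the newer install before re-running bge-test.py.
print(transformers.__version__)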