OFA-Sys / OFA

Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
Apache License 2.0

UnpicklingError when loading OFA-base pretrained model in transformers #379

Open Clement25 opened 1 year ago

Clement25 commented 1 year ago

I ran into a problem when loading the OFA-base model through Hugging Face; the code snippet is below.

My torch version is 1.13.1.

from PIL import Image
from torchvision import transforms
from transformers import OFATokenizer, OFAModel
from transformers.models.ofa.generate import sequence_generator

mean, std = [0.5, 0.5, 0.5], [0.5, 0.5, 0.5]
resolution = 384
patch_resize_transform = transforms.Compose([
        lambda image: image.convert("RGB"),
        transforms.Resize((resolution, resolution), interpolation=Image.BICUBIC),
        transforms.ToTensor(), 
        transforms.Normalize(mean=mean, std=std)
    ])

ckpt_dir = './OFA-base/'
path_to_image = './assets/icon_gardener.jpeg'

tokenizer = OFATokenizer.from_pretrained(ckpt_dir)

txt = " what does the image describe?"
inputs = tokenizer([txt], return_tensors="pt").input_ids
img = Image.open(path_to_image)
patch_img = patch_resize_transform(img).unsqueeze(0)

# using the generator of fairseq version
model = OFAModel.from_pretrained(ckpt_dir, use_cache=True)
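
For completeness, the generation step that would normally follow (once loading succeeds) is sketched below, adapted from the repo's transformers usage example; the patch_images keyword and the use_cache=False setting for this path are assumptions based on that example, not something verified here.

# hedged sketch of the transformers-style generation path
# model = OFAModel.from_pretrained(ckpt_dir, use_cache=False)  # assumed setting for this path
gen = model.generate(inputs, patch_images=patch_img, num_beams=5, no_repeat_ngram_size=3)
print(tokenizer.batch_decode(gen, skip_special_tokens=True))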

The error is:

./OFA-base/
<super: <class 'OFATokenizer'>, <OFATokenizer object>>
---------------------------------------------------------------------------
UnpicklingError                           Traceback (most recent call last)
~/miniconda3/envs/emnlp/lib/python3.7/site-packages/transformers/modeling_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   1439                 try:
-> 1440                     state_dict = torch.load(resolved_archive_file, map_location="cpu")
   1441                 except Exception as e:

~/miniconda3/envs/emnlp/lib/python3.7/site-packages/torch/serialization.py in load(f, map_location, pickle_module, **pickle_load_args)
    594                 return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
--> 595         return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
    596 

~/miniconda3/envs/emnlp/lib/python3.7/site-packages/torch/serialization.py in _legacy_load(f, map_location, pickle_module, **pickle_load_args)
    763 
--> 764     magic_number = pickle_module.load(f, **pickle_load_args)
    765     if magic_number != MAGIC_NUMBER:

UnpicklingError: invalid load key, 'v'.

I tried to re-download the model files with

git clone https://huggingface.co/OFA-Sys/OFA-base

but the issue remains
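
For reference, a minimal check of what torch.load is actually seeing (the path assumes the ckpt_dir used above):

# print the first bytes of the checkpoint file; a valid torch checkpoint
# starts with a zip/pickle header, not readable ASCII text
with open('./OFA-base/pytorch_model.bin', 'rb') as f:
    print(f.read(64))

If this prints readable text beginning with "version https://git-lfs.github.com/spec/v1", the file is only a Git LFS pointer and the actual weights were never downloaded.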

SilyRab commented 1 year ago

I have the same problem

SilyRab commented 1 year ago

I have solved this problem by re-cloning with Git LFS:

git lfs clone https://huggingface.co/OFA-Sys/OFA-base
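
A possible alternative, if the repository is already cloned but only contains the LFS pointer files (assumes git-lfs is installed on the machine):

cd OFA-base
git lfs install
git lfs pull

Note that git lfs clone is deprecated in recent Git LFS versions; once git lfs install has been run, a plain git clone also fetches the large files.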