huggingface / transformers

🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
https://huggingface.co/transformers
Apache License 2.0

OSError: Unable to open file (file signature not found) #12078

Closed Holy-Shine closed 3 years ago

Holy-Shine commented 3 years ago

Python version: 3.7.6
transformers: 4.6.1
tensorflow-cpu: 2.3.1

my code:

from transformers import TFAutoModel
model = TFAutoModel.from_pretrained("./chinese-bert-wwm-ext")

and chinese-bert-wwm-ext is a local model directory downloaded from https://huggingface.co/models. After I run this code in my Jupyter notebook, I get an OSError:

---------------------------------------------------------------------------
OSError                                   Traceback (most recent call last)
~\Anaconda3\lib\site-packages\transformers\modeling_tf_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   1291         try:
-> 1292             missing_keys, unexpected_keys = load_tf_weights(model, resolved_archive_file, load_weight_prefix)
   1293         except OSError:

~\Anaconda3\lib\site-packages\transformers\modeling_tf_utils.py in load_tf_weights(model, resolved_archive_file, _prefix)
    470     # Read the H5 file
--> 471     with h5py.File(resolved_archive_file, "r") as f:
    472         # Retrieve the name of each layer from the H5 file

~\Anaconda3\lib\site-packages\h5py\_hl\files.py in __init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, **kwds)
    407                                fapl, fcpl=make_fcpl(track_order=track_order),
--> 408                                swmr=swmr)
    409 

~\Anaconda3\lib\site-packages\h5py\_hl\files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
    172             flags |= h5f.ACC_SWMR_READ
--> 173         fid = h5f.open(name, flags, fapl=fapl)
    174     elif mode == 'r+':

h5py\_objects.pyx in h5py._objects.with_phil.wrapper()

h5py\_objects.pyx in h5py._objects.with_phil.wrapper()

h5py\h5f.pyx in h5py.h5f.open()

OSError: Unable to open file (file signature not found)

During handling of the above exception, another exception occurred:

OSError                                   Traceback (most recent call last)
<ipython-input-8-724814da42c1> in <module>
----> 1 model = TFAutoModel.from_pretrained('./chinese-bert-wwm-ext/')

~\Anaconda3\lib\site-packages\transformers\models\auto\auto_factory.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
    379         if type(config) in cls._model_mapping.keys():
    380             model_class = _get_model_class(config, cls._model_mapping)
--> 381             return model_class.from_pretrained(pretrained_model_name_or_path, *model_args, config=config, **kwargs)
    382         raise ValueError(
    383             f"Unrecognized configuration class {config.__class__} for this kind of AutoModel: {cls.__name__}.\n"

~\Anaconda3\lib\site-packages\transformers\modeling_tf_utils.py in from_pretrained(cls, pretrained_model_name_or_path, *model_args, **kwargs)
   1293         except OSError:
   1294             raise OSError(
-> 1295                 "Unable to load weights from h5 file. "
   1296                 "If you tried to load a TF 2.0 model from a PyTorch checkpoint, please set from_pt=True. "
   1297             )

OSError: Unable to load weights from h5 file. If you tried to load a TF 2.0 model from a PyTorch checkpoint, please set from_pt=True. 
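
The error message itself points at one possible cause: the local directory contains only a PyTorch checkpoint, not a valid tf_model.h5. In that case the same call can convert the weights on the fly by passing from_pt=True. A minimal sketch, assuming ./chinese-bert-wwm-ext holds a pytorch_model.bin plus config.json (not confirmed in this issue):

from transformers import TFAutoModel
# from_pt=True converts PyTorch weights to TF 2.0 weights while loading
model = TFAutoModel.from_pretrained("./chinese-bert-wwm-ext", from_pt=True)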
vishal-burman commented 3 years ago

Hi @Holy-Shine, try:

from transformers import TFAutoModel
model = TFAutoModel.from_pretrained("hfl/chinese-bert-wwm-ext")
Holy-Shine commented 3 years ago

@vishal-burman thanks! It works for me. I also found that the tf_model.h5 file in my local directory was far too small (truncated), which is why the loader could not read it.
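
"file signature not found" typically means the local tf_model.h5 is truncated or is not an HDF5 file at all (for example a Git LFS pointer or an HTML error page saved under that name). A quick check, assuming only the standard 8-byte HDF5 magic signature:

import os
path = "./chinese-bert-wwm-ext/tf_model.h5"
# A valid HDF5 file starts with the signature \x89HDF\r\n\x1a\n;
# a truncated download or an LFS pointer file will fail this check.
print("size (bytes):", os.path.getsize(path))
with open(path, "rb") as f:
    print("valid HDF5 signature:", f.read(8) == b"\x89HDF\r\n\x1a\n")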