TsinghuaAI / CPM-2-Finetune

Finetune CPM-2

How to use BMInf to run inference with the 100000.tar 11B model? #25

Closed · linjianz closed this 2 years ago

linjianz commented 2 years ago

I installed bminf via Docker, like this:

docker run -it --gpus 1 -v ${100000_MODEL_FILE_PATH}:/root/.cache/bigmodels --rm openbmb/bminf python3 examples/fill_blank.py

Here 100000_MODEL_FILE_PATH contains the 4 .pt files on my local machine. However, I got the following error:

Loading model
Failed to connect to the source server
Traceback (most recent call last):
  File "examples/fill_blank.py", line 29, in <module>
    main()
  File "examples/fill_blank.py", line 24, in main
    cpm2 = bminf.models.CPM2()
  File "/usr/local/lib/python3.6/dist-packages/bminf/models/cpm2.py", line 60, in __init__
    super().__init__(config)
  File "/usr/local/lib/python3.6/dist-packages/bminf/arch/t5/model.py", line 72, in __init__
    model_path = data.ensure_file(config.MODEL_NAME, "checkpoint.pt")
  File "/usr/local/lib/python3.6/dist-packages/bminf/data/__init__.py", line 49, in ensure_file
    raise ConnectionError("Failed to connect to the source server")
ConnectionError: Failed to connect to the source server

I found that bminf/arch/t5/model.py requires only a single checkpoint.pt file (plus a vocab.txt file). So how can I load the 4 split model files?
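For illustration, here is a minimal sketch of merging the 4 shards into a single checkpoint.pt, assuming each .pt file is a plain PyTorch state dict produced by 4-way model parallelism. The shard file names and the split_dims entries below are hypothetical placeholders, not CPM-2's real layout: the actual mapping of which parameters are replicated versus partitioned (and along which axis) has to come from CPM-2's code, and BMInf's checkpoint.pt may additionally use its own packed format, so this is not a guaranteed converter.

import torch

# Hypothetical shard names; substitute the actual 4 .pt files.
shard_paths = [f"model-{i}.pt" for i in range(4)]
shards = [torch.load(p, map_location="cpu") for p in shard_paths]

# Per-parameter split axis: None means the tensor is replicated on every
# rank; 0 or 1 means it was partitioned along that dimension. These
# entries are illustrative only; parameters not listed are treated as
# replicated.
split_dims = {
    "encoder.blocks.0.ff.wi.weight": 0,   # hypothetical name
    "encoder.blocks.0.ln.weight": None,   # hypothetical name
}

merged = {}
for name, tensor in shards[0].items():
    dim = split_dims.get(name)
    if dim is None:
        merged[name] = tensor  # replicated: keep a single copy
    else:
        merged[name] = torch.cat([s[name] for s in shards], dim=dim)

# Write the single file the traceback shows BMInf looking for.
torch.save(merged, "checkpoint.pt")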

linjianz commented 2 years ago

This issue has been moved to CPM-2-Pretrain.