Hi, I installed BioGPT in a Docker container (repbioinfo/biogpt), downloaded the pre-trained BioGPT-Large model checkpoint, and here is the script (/BioGPT/script.py):
import torch
from src.transformer_lm_prompt import TransformerLanguageModelPrompt

m = TransformerLanguageModelPrompt.from_pretrained(
    "/scratch/QA-PubMedQA-BioGPT-Large/",
    "checkpoint_avg.pt",
    "/BioGPT/data/BioGPT-Large",
    tokenizer="moses",
    bpe="fastbpe",
    bpe_codes="/BioGPT/data/bpecodes",
    max_len_b=1024,
    beam=1,
)
m.cuda()

src_text = "what DNA is today"
src_tokens = m.encode(src_text)
# note: `args` is not defined in this script, so beam is given explicitly
generate = m.generate([src_tokens], beam=1)[0]
output = m.decode(generate[0]["tokens"])
but unfortunately I get this error.
Thanks!
2023-03-01 20:50:43 | INFO | fairseq.file_utils | loading archive file /scratch/QA-PubMedQA-BioGPT-Large/
2023-03-01 20:50:43 | INFO | fairseq.file_utils | loading archive file /BioGPT/data/BioGPT-Large
Traceback (most recent call last):
File "/BioGPT/script.py", line 3, in <module>
m = TransformerLanguageModelPrompt.from_pretrained(
File "/usr/local/lib/python3.10/dist-packages/fairseq/models/fairseq_model.py", line 267, in from_pretrained
x = hub_utils.from_pretrained(
File "/usr/local/lib/python3.10/dist-packages/fairseq/hub_utils.py", line 73, in from_pretrained
models, args, task = checkpoint_utils.load_model_ensemble_and_task(
File "/usr/local/lib/python3.10/dist-packages/fairseq/checkpoint_utils.py", line 432, in load_model_ensemble_and_task
task = tasks.setup_task(cfg.task)
File "/usr/local/lib/python3.10/dist-packages/fairseq/tasks/__init__.py", line 46, in setup_task
return task.setup_task(cfg, **kwargs)
File "/BioGPT/src/language_modeling_prompt.py", line 133, in setup_task
raise Exception(
Exception: Could not infer language pair, please provide it explicitly
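For anyone hitting the same exception: BioGPT's language_modeling_prompt task raises it when source_lang/target_lang cannot be inferred from the data directory. In fairseq, extra keyword arguments passed to from_pretrained are forwarded to checkpoint loading as arg_overrides, which replace fields in the checkpoint's stored config, so passing the language pair explicitly as keyword arguments may resolve it. A minimal runnable sketch of that override merge, with plain dicts standing in for the fairseq config and hypothetical language names:

```python
def apply_overrides(task_cfg: dict, **overrides) -> dict:
    """Mimic fairseq's arg_overrides: extra kwargs given to
    from_pretrained replace fields in the checkpoint's task config."""
    merged = dict(task_cfg)
    merged.update(overrides)
    return merged

# Config as stored in the checkpoint: the language pair is unset,
# which is what triggers "Could not infer language pair".
cfg = {"task": "language_modeling_prompt",
       "source_lang": None,
       "target_lang": None}

# Supplying the pair explicitly (hypothetical names "x"/"y") fills the gap.
cfg = apply_overrides(cfg, source_lang="x", target_lang="y")
assert cfg["source_lang"] == "x" and cfg["target_lang"] == "y"
```

The actual language names have to match the suffixes of the binarized files in the data directory, so check what the preprocessing step produced before hard-coding them.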
I do have a question: how did you download BioGPT-Large? Using the URL gives me an error that it is unable to load the parameters from the checkpoint. Did you use something else to download it?