thunlp / KnowledgeablePromptTuning

kpt code

Error when running zero-shot on the dbpedia dataset #14

Open · XuandongZhao opened this issue 2 years ago

I have installed the latest OpenPrompt package, and the script I use for KPT is:

PYTHONPATH=python3
BASEPATH="./"
DATASET=dbpedia #agnews dbpedia imdb amazon yahoo
TEMPLATEID=0 # 1 2 3
SEED=144 # 145 146 147 148
SHOT=0 # 0 1 10 20
VERBALIZER=kpt #
CALIBRATION="--calibration" # ""
FILTER=tfidf_filter # none
MODEL_NAME_OR_PATH="roberta-large"
RESULTPATH="results_zeroshot"
OPENPROMPTPATH="../OpenPrompt"

cd $BASEPATH

CUDA_VISIBLE_DEVICES=0 $PYTHONPATH zeroshot.py \
--model_name_or_path $MODEL_NAME_OR_PATH \
--result_file $RESULTPATH \
--openprompt_path $OPENPROMPTPATH \
--dataset $DATASET \
--template_id $TEMPLATEID \
--seed $SEED \
--verbalizer $VERBALIZER $CALIBRATION \
--filter $FILTER

And I got the error:

##Num of label words for each label: [217, 256, 214, 211, 324, 294, 215, 766, 345, 408, 880, 181, 246, 666]
Traceback (most recent call last):
  File "/mnt/data2/xuandong/prompt/KPT/zeroshot.py", line 113, in <module>
    support_dataloader = PromptDataLoader(dataset=dataset["support"], template=mytemplate, tokenizer=tokenizer,
  File "/mnt/data2/xuandong/prompt/KPT/openprompt/pipeline_base.py", line 100, in __init__
    self.wrap()
  File "/mnt/data2/xuandong/prompt/KPT/openprompt/pipeline_base.py", line 126, in wrap
    wrapped_example = self.template.wrap_one_example(example)
  File "/mnt/data2/xuandong/prompt/KPT/openprompt/prompt_base.py", line 207, in wrap_one_example
    text = self.incorporate_text_example(example)
  File "/mnt/data2/xuandong/prompt/KPT/openprompt/prompt_base.py", line 107, in incorporate_text_example
    text[i] = d["add_prefix_space"] + d.get("post_processing", lambda x:x)(getattr(example, d['placeholder']))
TypeError: 'str' object is not callable

Could you please look into it?
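
For reference, the failing line can be reproduced outside OpenPrompt (a minimal sketch; the literal dict below is only a hypothetical stand-in for one parsed template piece, not the actual template file):

# Hypothetical stand-in for one parsed template piece: post_processing has
# been loaded as a source string instead of a callable.
d = {"add_prefix_space": " ",
     "placeholder": "text_a",
     "post_processing": "lambda x:x.strip('.')"}
fn = d.get("post_processing", lambda x: x)  # returns the string, not a lambda
fn("Some text.")                            # TypeError: 'str' object is not callable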

XuandongZhao commented 2 years ago

I solved it by modifying incorporate_text_example in openprompt/prompt_base.py:

    def incorporate_text_example(self,
                                 example: InputExample,
                                 text = None,
                                ):
        if text is None:
            text = self.text.copy()
        else:
            text = text.copy()

        for i, d in enumerate(text):
            if 'placeholder' in d:
                ##################
                # Workaround: the template file may store post_processing as a
                # source string (e.g. "lambda x:x.strip('.')") rather than a
                # callable; map that known string back to the callable before
                # applying it, so the call no longer raises TypeError.
                tempa = d.get("post_processing", lambda x: x)
                if tempa == "lambda x:x.strip('.')":
                    tempa = lambda x: x.strip('.')
                tempb = tempa(getattr(example, d['placeholder']))
                text[i] = d["add_prefix_space"] + tempb
                ##################
            elif 'meta' in d:
                text[i] = d["add_prefix_space"] + d.get("post_processing", lambda x:x)(example.meta[d['meta']])
            elif 'soft' in d:
                text[i] = ''  # unused
            elif 'mask' in d:
                text[i] = '<mask>'
            elif 'special' in d:
                text[i] = d['special']
            elif 'text' in d:
                text[i] = d["add_prefix_space"] + d['text']
            else:
                raise ValueError(f'can not parse {d}')
        return text
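
For reference, a slightly more general form of the same workaround (just a sketch, not the upstream fix: the helper name below is mine, and it assumes template files are trusted since it evals the stored string) would be:

def resolve_post_processing(value):
    # Hypothetical helper: the template file may store post_processing as a
    # source string such as "lambda x:x.strip('.')". Eval it back into a
    # callable (only safe for trusted template files); pass real callables
    # through unchanged and fall back to the identity function otherwise.
    if callable(value):
        return value
    if isinstance(value, str):
        return eval(value)
    return lambda x: x

Inside the loop you would then write text[i] = d["add_prefix_space"] + resolve_post_processing(d.get("post_processing", lambda x: x))(getattr(example, d['placeholder'])) instead of special-casing the one known string.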
znsoftm commented 2 years ago

Can you open a PR?

XuandongZhao commented 2 years ago

What do you mean by "PR"?

jackking2333 commented 2 years ago

How did you set this repo up together with OpenPrompt? My run fails with "OSError: ../plm_cache/roberta-large is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models' If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token or log in with huggingface-cli login and pass use_auth_token=True." I guess it might be a path problem; I put this repo and OpenPrompt in the same folder.

zhouyukun0328 commented 1 year ago

> How did you set this repo up together with OpenPrompt? My run fails with "OSError: ../plm_cache/roberta-large is not a local folder and is not a valid model identifier ..." I guess it might be a path problem; I put this repo and OpenPrompt in the same folder.

Maybe you need to change MODEL_NAME_OR_PATH: point it at a local folder that actually contains the roberta-large checkpoint, or set it to "roberta-large" (as in the script above) so it is resolved on the Hugging Face Hub.
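
If you do want the checkpoint under ../plm_cache/roberta-large (the path from your error message), one option is to cache it there once with transformers (a sketch; adjust the path to your own layout):

from transformers import AutoModelForMaskedLM, AutoTokenizer

# Download roberta-large once and save it to the local folder the script expects.
local_dir = "../plm_cache/roberta-large"
AutoTokenizer.from_pretrained("roberta-large").save_pretrained(local_dir)
AutoModelForMaskedLM.from_pretrained("roberta-large").save_pretrained(local_dir)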