HPC-FAIR / docs

Docs for HPCFAIR APIs

KeyError: "Unknown task code-similarity.. #2

Open · chunhualiao opened 1 year ago

chunhualiao commented 1 year ago

My MacBook Pro had the original Transformers package installed first.

I then followed the instructions at https://hpcfair.readthedocs.io/en/latest/pipelines/similarity_checking.html

I got the following error. It may be caused by a conflict, or by the import precedence, between the original and the modified Transformers packages.

```
(pt) [liao@Transformer:pytorch] vi code_similarity.py
(pt) [liao@Transformer:pytorch] python code_similarity.py
Downloading (…)lve/main/config.json: 100%|██████████████████████████| 570/570 [00:00<00:00, 297kB/s]
Traceback (most recent call last):
  File "/Users/liao/workspace/transformers/examples/pytorch/code_similarity.py", line 4, in <module>
    pipe = pipeline(model="bert-base-uncased", task="code-similarity")
  File "/Users/liao/workspace/transformers/src/transformers/pipelines/__init__.py", line 717, in pipeline
    normalized_task, targeted_task, task_options = check_task(task)
  File "/Users/liao/workspace/transformers/src/transformers/pipelines/__init__.py", line 461, in check_task
    return PIPELINE_REGISTRY.check_task(task)
  File "/Users/liao/workspace/transformers/src/transformers/pipelines/base.py", line 1180, in check_task
    raise KeyError(
KeyError: "Unknown task code-similarity, available tasks are ['audio-classification', 'automatic-speech-recognition', 'conversational', 'depth-estimation', 'document-question-answering', 'feature-extraction', 'fill-mask', 'image-classification', 'image-segmentation', 'image-to-text', 'ner', 'object-detection', 'question-answering', 'sentiment-analysis', 'summarization', 'table-question-answering', 'text-classification', 'text-generation', 'text2text-generation', 'token-classification', 'translation', 'video-classification', 'visual-question-answering', 'vqa', 'zero-shot-classification', 'zero-shot-image-classification', 'zero-shot-object-detection', 'translation_XX_to_YY']"
```
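A quick, generic way to check which Transformers installation Python actually imports (this is just a standard diagnostic, nothing HPCFAIR-specific):

```python
import transformers

# Print where the imported package lives and its version; if this does not
# point at the HPCFAIR-modified source tree, its custom tasks (such as
# code-similarity) will never have been registered in this interpreter.
print(transformers.__file__)
print(transformers.__version__)
```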

LChenGit commented 1 year ago

Some of the pipelines are in the working branch. I will fix it on Monday.
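For context: a task name only becomes available to `pipeline()` after it is registered with `PIPELINE_REGISTRY`, which is why `check_task` raises the `KeyError` above when the modified package is not the one imported. Below is a minimal sketch of how such a registration could look, using the public Transformers custom-pipeline API; the pipeline class and its scoring logic are hypothetical placeholders, not the actual HPCFAIR implementation:

```python
from transformers import AutoModel, Pipeline
from transformers.pipelines import PIPELINE_REGISTRY


class CodeSimilarityPipeline(Pipeline):
    """Hypothetical pipeline class; the real HPCFAIR code may differ."""

    def _sanitize_parameters(self, **kwargs):
        # Split caller kwargs among preprocess / forward / postprocess.
        return {}, {}, {}

    def preprocess(self, inputs):
        # Tokenize a pair of code snippets as a single sequence pair.
        return self.tokenizer(inputs[0], inputs[1], return_tensors=self.framework)

    def _forward(self, model_inputs):
        return self.model(**model_inputs)

    def postprocess(self, model_outputs):
        # Placeholder reduction of model outputs to a similarity score.
        return {"score": model_outputs.last_hidden_state.mean().item()}


# Registration is what makes the task name visible to check_task();
# without this step, pipeline(task="code-similarity") raises KeyError.
PIPELINE_REGISTRY.register_pipeline(
    "code-similarity",
    pipeline_class=CodeSimilarityPipeline,
    pt_model=AutoModel,
)
```

Once a registration like this runs at import time in the installed package, `pipeline(model="bert-base-uncased", task="code-similarity")` should resolve instead of raising the `KeyError`.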