thunlp / OpenPrompt

An Open-Source Framework for Prompt-Learning.
https://thunlp.github.io/OpenPrompt/
Apache License 2.0

Is there a way to get the actual probability of each class from the model prediction? #256

Open brunoedcf opened 1 year ago

brunoedcf commented 1 year ago
from openprompt import PromptForClassification
promptModel = PromptForClassification(
    template = promptTemplate,
    plm = plm,
    verbalizer = promptVerbalizer,
)

import torch

promptModel.eval()
with torch.no_grad():
    for batch in data_loader:
        logits = promptModel(batch)
        preds = torch.argmax(logits, dim=-1)
        print(tokenizer.decode(batch['input_ids'][0], skip_special_tokens=True), classes[preds])

I have 3 classes and I want the probability of each one instead of only the highest.

Protiva commented 1 year ago

The logits variable gives you the model's score for each class.

preds = torch.argmax(logits, dim=-1) selects the class with the highest score.
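To illustrate, here is a minimal sketch with made-up logits (the values are hypothetical, not from the model above) showing that argmax only returns the index of the top-scoring class:

```python
import torch

# Hypothetical logits for one example over 3 classes (A, B, C).
logits = torch.tensor([[2.0, 1.0, -1.5]])

# argmax returns the index of the largest score, i.e. the predicted class,
# discarding the scores of the other classes.
pred = torch.argmax(logits, dim=-1)
print(pred)  # tensor([0])
```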

brunoedcf commented 1 year ago

Yes, but what if I want the "raw" value of each probability instead of only the highest one?

For example:

12% for class A
7% for class B
0.5% for class C

xiyang-aads-lilly commented 7 months ago

@brunoedcf did you figure out how to do this? I think we can apply softmax to the logits to get a probability distribution. Any thoughts?
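A minimal sketch of that idea, assuming `logits` is a `[batch_size, num_classes]` tensor like the one returned by the model above (the values here are made up for illustration):

```python
import torch

# Hypothetical logits for a batch of 2 examples over 3 classes.
logits = torch.tensor([[2.0, 1.0, -1.5],
                       [0.3, 0.2, 0.1]])

# Softmax over the class dimension turns raw scores into a
# probability distribution: non-negative values summing to 1 per row.
probs = torch.softmax(logits, dim=-1)

print(probs)               # per-class probabilities for each example
print(probs.sum(dim=-1))   # each row sums to 1
```

In the original loop, this would mean computing `probs = torch.softmax(logits, dim=-1)` alongside `preds`, so every class's probability is available rather than only the argmax.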