HamedBabaei / LLMs4OL

LLMs4OL: Large Language Models for Ontology Learning
MIT License

Create templates for FSL on datasets #47

Closed HamedBabaei closed 1 year ago

HamedBabaei commented 1 year ago

@jd-coderepos

Unfortunately, the templates I have created are not performing well; the models treated this kind of yes/no prompting as a binary classification task.

However, I have also found the following kinds of templates:

        ("What happens next in this paragraph?\n\n{context}\n{options_}", "{answer}"),
        ("Continue writing the next sentence in this paragraph:\n\n{context}\n\n{options_}", "{answer}"),
        ("Continue writing the next sentence.\n\n{context}\n\n{options_}", "{answer}"),
        ("This is a test of commonsense. Complete the next sentence:\n\n{context}\n\n{options_}", "{answer}"),
        ("Write the next sentence in this paragraph:\n\n{context}\n\n{options_}", "{answer}"),
        ("How does the next paragraph end?\n\n{context}\n\n{options_}", "{answer}"),
        ("What most naturally follows?\n\n{context}\n\n{options_}", "{answer}"),
        ("What happens next?\n\n{context}\n\n{options_}", "{answer}"),
        ("What is the most logical next event?\n\n{context}\n\n{options_}", "{answer}"),
        ("Write the next sentence in the following story.\n\n{context}\n\n{options_}", "{answer}"),

These templates frame the task as text generation, so I am thinking of adding this kind of sample during the training process as well!
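For illustration, here is a minimal sketch of how one of these generation-style templates could be filled to produce a (prompt, target) training pair. The `build_sample` helper and the term-typing context/options below are hypothetical examples, not code from this repo:

```python
# Minimal sketch: turn a (prompt, target) template pair into a training sample
# by filling the {context}, {options_}, and {answer} placeholders.

template = ("What most naturally follows?\n\n{context}\n\n{options_}", "{answer}")

def build_sample(template, context, options, answer):
    """Fill the placeholders of a (prompt_template, target_template) pair."""
    prompt_tpl, target_tpl = template
    options_str = "\n".join(f"- {o}" for o in options)
    prompt = prompt_tpl.format(context=context, options_=options_str)
    target = target_tpl.format(answer=answer)
    return prompt, target

# Hypothetical term-typing example phrased as "what follows?" over candidate statements.
context = "aspirin is a term."
options = ["aspirin is a drug.", "aspirin is a disease.", "aspirin is a symptom."]
answer = "aspirin is a drug."

prompt, target = build_sample(template, context, options, answer)
print(prompt)
print(target)
```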

Any ideas about this approach?