
TacoLM

TemporAl COmmon Sense Language Model

A variant of BERT that is aware of temporal common sense.

Introduction

This is the code repository for our ACL 2020 paper Temporal Common Sense Acquisition with Minimal Supervision. This package is built on the April 2019 version of huggingface/transformers.

Installation

Out of the box

Here are some things you can do with this package out of the box.

Train the main model

The training script is set to default parameters and exports the model to models/. You can change the configuration by editing the script.

Training generates one directory that stores the loss logs, plus NUM_EPOCH directories, one for each epoch's model. You will need to add BERT's vocab.txt to each epoch directory before evaluation; see the next section on pre-trained models for more detail.
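For convenience, here is a minimal Python sketch of the vocab.txt copy step. The directory layout (models/epoch_*) and the local vocab file name are assumptions for illustration; adjust them to match your actual training output.

```python
# Copy BERT's vocab.txt into every epoch checkpoint directory so the
# checkpoints can be loaded for evaluation. Paths below are assumptions.
import glob
import shutil

VOCAB_FILE = "vocab.txt"  # hypothetical local copy of BERT-base's vocab

for epoch_dir in sorted(glob.glob("models/epoch_*")):
    shutil.copy(VOCAB_FILE, epoch_dir)  # places vocab.txt inside the directory
```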

The training data is pre-generated and formatted. More details here.

Experiments

You can download pre-trained models (about 0.4 GB each) from Google Drive and place them in models/, or follow the training procedure in the previous section.

General Usage

You can do many things with the model by simply treating it as a set of transformer weights that fit exactly into a BERT-base model. Have an ongoing project with BERT? Give it a try!
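As an example, below is a minimal sketch of loading a TacoLM checkpoint with the modern transformers API. The checkpoint path is hypothetical, and the directory is assumed to contain the standard pytorch_model.bin, config.json, and vocab.txt files.

```python
# Load TacoLM weights as a drop-in replacement for BERT-base.
from transformers import BertModel, BertTokenizer

MODEL_DIR = "models/tacolm_epoch_2"  # hypothetical checkpoint directory

tokenizer = BertTokenizer.from_pretrained(MODEL_DIR)
model = BertModel.from_pretrained(MODEL_DIR)

# Use it exactly like any other BERT-base model.
inputs = tokenizer("Writing a novel takes years.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```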

Intrinsic Experiments

The intrinsic evaluation relies on pre-formatted data.

TimeBank Experiment

HiEVE Experiment

MC-TACO Experiment

See MC-TACO.

Citation

If you use this work, please cite the following paper:

@inproceedings{ZNKR20,
    author = {Ben Zhou and Qiang Ning and Daniel Khashabi and Dan Roth},
    title = {Temporal Common Sense Acquisition with Minimal Supervision},
    booktitle = {ACL},
    year = {2020},
}