GCYZSL / MoLA


Where can I get the "Training of Quick Start" #9

Closed Car-pe closed 3 months ago

Car-pe commented 3 months ago

Hello authors!

I'm just wondering where I can find the "Training" part of the Quick Start, so that I can use the detailed hyperparameters to reproduce your results.

GCYZSL commented 3 months ago

Hello, to reproduce our direct downstream instruction fine-tuning results, you can follow the instructions in the README: 1) run the data preparation scripts, 2) train on the ScienceQA data, and 3) evaluate on ScienceQA. To reproduce the pretraining plus fine-tuning results, you can also follow the README: 1) do the data preparation, 2) train on the instruction data, 3) train on ScienceQA, and 4) evaluate on ScienceQA. Grid search is used to find the best-performing models. Thanks!

GCYZSL commented 3 months ago

The details of what each hyperparameter means are shown in the top part of the README. The current hyperparameters in the README reproduce our results, tested on 8×A100 and 3×A6000 settings.

Car-pe commented 3 months ago

Thank you! Then what about the other tasks, such as CoLA?

GCYZSL commented 3 months ago

You can process the samples in the CoLA dataset following the instructions in the README. We process the data as follows:

# Build an instruction-tuning sample from a raw CoLA example.
hypothesis = data_sample["sentence"]
# Label 0 -> "unacceptable", label 1 -> "acceptable".
answer = ["unacceptable", "acceptable"][data_sample["label"]]
data_sample = {
    "input": "",
    "instruction": f"Tell me if the statement unacceptable, acceptable.\nSentence: {hypothesis}\n",
    "output": f"Answer: {answer}.",
    "answer": answer,
}
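Wrapping the per-sample mapping above in a function makes it easy to convert a whole split. This is a minimal sketch of that idea, not the repo's own script: the function name, the toy examples, and the output file name are my own.

```python
import json

def convert_cola_example(raw):
    # Map a raw CoLA example ({"sentence": ..., "label": 0 or 1})
    # to the instruction-tuning format shown above.
    answer = ["unacceptable", "acceptable"][raw["label"]]
    return {
        "input": "",
        "instruction": (
            "Tell me if the statement unacceptable, acceptable."
            f"\nSentence: {raw['sentence']}\n"
        ),
        "output": f"Answer: {answer}.",
        "answer": answer,
    }

# Tiny in-memory stand-in for a CoLA split; in practice this would be
# the actual CoLA train/dev data.
split = [
    {"sentence": "The book was written by John.", "label": 1},
    {"sentence": "The was book written John.", "label": 0},
]
converted = [convert_cola_example(ex) for ex in split]
with open("cola_converted.json", "w") as f:
    json.dump(converted, f, indent=2)
```

The resulting JSON has the same four fields (`input`, `instruction`, `output`, `answer`) as the ScienceQA data, so the existing training pipeline can consume it unchanged.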

Please note that the corresponding evaluation code should be modified as well.
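As a sketch of the kind of evaluation change this implies (hypothetical helpers, not from the MoLA repo): the generated text has to be parsed back into a CoLA label before comparing against the gold answer.

```python
import re

def extract_cola_answer(generated: str) -> str:
    # Pull "unacceptable" / "acceptable" out of text like "Answer: acceptable."
    m = re.search(r"Answer:\s*(unacceptable|acceptable)", generated)
    return m.group(1) if m else ""

def cola_accuracy(generations, gold_answers):
    # Fraction of generations whose parsed label matches the gold label.
    correct = sum(extract_cola_answer(g) == a
                  for g, a in zip(generations, gold_answers))
    return correct / len(gold_answers)
```

Note that CoLA is conventionally reported with Matthews correlation rather than accuracy, so you may want to feed the parsed labels into `sklearn.metrics.matthews_corrcoef` instead.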

Car-pe commented 3 months ago

Thank you! But my question is about the hyperparameters used for CoLA (or other GLUE tasks), such as learning rate and batch size. Are these kept the same as for the ScienceQA task?

GCYZSL commented 3 months ago

The hyperparameters are the same. Thank you!

Car-pe commented 3 months ago

OK, thank you!