AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

Upload math10k dataset and fix a few bugs #18

Closed: HZQ950419 closed this 1 year ago

HZQ950419 commented 1 year ago

This time, we collected a 10K math word problem dataset using the ChatGPT API. The test sets now match those used in previous CoT papers. We updated the performance table and uploaded the dataset and model checkpoints.
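For readers inspecting the uploaded dataset, a minimal sketch of loading and validating entries might look like the following. The field names (`instruction`, `input`, `output`, `answer`) are an assumption based on the instruction-tuning JSON format used elsewhere in this repo, and the sample record is illustrative, not taken from the actual math10k file:

```python
import json

# Hypothetical record in the assumed math10k JSON format: each entry holds
# the word problem, an optional input, a chain-of-thought solution, and the
# final answer string used for evaluation.
sample_json = json.dumps([
    {
        "instruction": "A farmer has 48 apples and gives away half. "
                       "How many apples remain?",
        "input": "",
        "output": "Half of 48 is 48 / 2 = 24, so 24 apples remain.",
        "answer": "24",
    }
])

def validate_records(raw: str) -> int:
    """Parse a JSON string and check every record has the expected fields."""
    required = {"instruction", "input", "output", "answer"}
    records = json.loads(raw)
    for i, rec in enumerate(records):
        missing = required - rec.keys()
        if missing:
            raise ValueError(f"record {i} is missing fields: {missing}")
    return len(records)

print(validate_records(sample_json))  # prints the number of valid records: 1
```

Running a check like this before fine-tuning catches malformed entries early, instead of failing partway through training.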

If you have any questions, please create a new issue. Looking forward to your feedback.