AGI-Edgerunners / LLM-Adapters

Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient Fine-Tuning of Large Language Models"
https://arxiv.org/abs/2304.01933
Apache License 2.0

Why did you use Bloom instead of Bloomz #1

Closed: linhduongtuan closed this issue 1 year ago

linhduongtuan commented 1 year ago

Hello everyone, I would like to ask why you chose Bloom-7b1 as the base model described in your paper. As far as I know, BigScience recommends using the Bloomz variants instead: https://huggingface.co/bigscience/bloomz.

I am concerned that the evaluation results in Table 1 may therefore be biased. Would it be possible for you to re-evaluate with Bloomz-7b1 instead? Thank you in advance. Best regards, Linh

demoleiwang commented 1 year ago

@linhduongtuan Thank you for your suggestion! We will explore Bloomz and update the arXiv paper promptly.
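
In the meantime, swapping the base checkpoint should only require changing the model name passed to `from_pretrained`. Below is a minimal sketch, not the repo's actual training script, using the standard `transformers` and `peft` APIs; the LoRA hyperparameters are illustrative assumptions, not the paper's exact settings (`query_key_value` is the name of BLOOM's fused attention projection):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

# Swap in the instruction-tuned variant; was "bigscience/bloom-7b1".
base_model = "bigscience/bloomz-7b1"

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(
    base_model,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Illustrative LoRA config; tune r/alpha/dropout to match the paper's setup.
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    target_modules=["query_key_value"],  # BLOOM's fused Q/K/V projection
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

From here, the same training and evaluation pipeline should run unchanged, since Bloom and Bloomz share the same architecture and tokenizer.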