RUCAIBox / LLMSurvey

The official GitHub page for the survey paper "A Survey of Large Language Models".
https://arxiv.org/abs/2303.18223

Comments on paper #31

Closed · ToddMorrill closed this issue 1 year ago

ToddMorrill commented 1 year ago

First - nicely done. This must have been a herculean effort to review all of these papers. Here are some ideas:

  1. It would be nice to include more information about Falcon once its paper is released (still "coming soon" per HF). In particular, it seems that the creators of Falcon chose multi-query attention with an eye toward inference speed. It might be nice to provide a little more detail about how different architecture choices (e.g., attention mechanisms) affect tokens generated per second, since that is what engineers and the open-source community are very focused on (along with generation quality, of course); see the sketch after this list for the kind of analysis I mean. Tokens per second really shapes the user experience, and I would also love to see how people are thinking about truly enormous context sizes.
  2. This is a small point, and feel free to disregard it, but the word "besides" has a specific usage pattern among native English speakers. It typically works like this: you make a claim in one sentence, then say "besides," and then make an even stronger claim that essentially says the first claim can be set aside because this one settles the matter. For example: Tom would never survive life in the army; he's not tough enough. Besides, he's too old to be accepted. The upshot is that every time you use "besides" in the paper, you undermine the sentence before it, which is not what you're trying to do. One final note: "besides" is fairly colloquial and seldom appears in professional writing. What you're really looking for are linking phrases like "also", "in addition", and "furthermore".
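
To make the inference-speed point concrete, here is a rough back-of-the-envelope sketch. It is my own illustration, not anything from the survey or the Falcon paper, and all the model sizes in it are made up for the example. It shows why sharing a single K/V head across all query heads (multi-query attention) shrinks the KV cache that must be read on every decoding step, which is a large part of what limits tokens per second:

```python
# Hypothetical sketch: KV-cache size under multi-head vs. multi-query attention.
# During autoregressive decoding, the model caches K and V tensors for every
# layer and every past token, and re-reads them at each step. Multi-query
# attention (MQA) keeps a single shared K/V head instead of one per query
# head, shrinking the cache (and its memory traffic) by roughly n_heads.
# All sizes below are illustrative, not Falcon's actual configuration.

N_LAYERS, N_HEADS, HEAD_DIM = 32, 32, 128
SEQ_LEN, BATCH, BYTES_FP16 = 2048, 1, 2

def kv_cache_bytes(n_kv_heads: int) -> int:
    """Bytes held by the K and V caches for one full-length sequence."""
    per_tensor = N_LAYERS * BATCH * SEQ_LEN * n_kv_heads * HEAD_DIM * BYTES_FP16
    return 2 * per_tensor  # one K cache + one V cache

mha = kv_cache_bytes(N_HEADS)  # multi-head attention: one K/V head per query head
mqa = kv_cache_bytes(1)        # multi-query attention: a single shared K/V head

print(f"MHA KV cache: {mha / 2**30:.2f} GiB")   # 1.00 GiB
print(f"MQA KV cache: {mqa / 2**30:.5f} GiB")   # 0.03125 GiB, 32x smaller
```

Since decoding is usually memory-bandwidth-bound, reading a 32x smaller cache per generated token translates fairly directly into higher tokens per second, and it also makes very long contexts far cheaper to serve.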

Again, thank you for your work here. There's so much happening in the LLM space that up-to-date reviews like this are really helpful.

EliverQ commented 1 year ago

Thanks for your suggestions! We'll add Falcon in the upcoming update, and we will also gradually address the issue of "besides". We would like to include you in the acknowledgments. Could you please provide your name?

ToddMorrill commented 1 year ago

Sure, it's Todd Morrill.
