Avoid claims that depend on rapidly changing conditions, such as, “With a context length of over 8,000 tokens, the StarCoder models can process more input than any other open LLM,” unless you fetch the figure programmatically from a benchmark that runs daily, or add a disclaimer footnote on that same line giving the date of the claim. OpenAI models alone have surpassed this figure three times since it was written.