Well, let me tell a bedtime story:
In 1981, at a computer trade show, Bill Gates supposedly uttered this statement in defense of the just-introduced IBM PC's 640KB usable RAM limit: "640K ought to be enough for anybody."[^1]
The original PC was based on the Intel 8088, a hybrid 8/16-bit processor with a 20-bit address bus, giving it a 1MB address space; IBM reserved the top 384KB for system use, leaving 640KB for programs. Though tiny by today's standards (64-bit systems can address terabytes of memory), 640KB at the time was an order of magnitude larger than the 64KB limit facing users of 8-bit computers such as the Apple II+ and the Commodore 64.
The context here is the rapid pace of technological advancement and the assumptions made at the time. Just as computer memory expanded far beyond what early developers imagined, the demand for more tokens in AI models could grow as applications become more sophisticated. Setting a token limit might seem prudent now to ensure stability, but in a demo it's not critical. Balancing current needs against future potential matters, but so does staying flexible and open to scaling as the technology evolves. Have a nice dev journey 🍻 [^2]
[^2]: This paragraph was generated by ChatGPT.
Lovely tale 😄😄😄
Hello, this isn't about the code but the demo: I was able to set 5,000,000 tokens to generate. Maybe it could be limited to a lower cap :) Have a nice dev journey 🍻
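
A minimal sketch of what such a cap could look like, assuming a Python demo where the user supplies the token count. The names `clamp_max_tokens`, `MIN_TOKENS_CAP`, and `MAX_TOKENS_CAP` are hypothetical, not taken from this repo:

```python
# Hypothetical sketch: clamp a user-supplied "tokens to generate" value
# before passing it to the model. Names and limits are assumptions,
# not from this repository.

MIN_TOKENS_CAP = 1
MAX_TOKENS_CAP = 4096  # assumed upper bound; pick what the demo hardware can handle

def clamp_max_tokens(requested: int) -> int:
    """Return a token count clamped to [MIN_TOKENS_CAP, MAX_TOKENS_CAP]."""
    return max(MIN_TOKENS_CAP, min(int(requested), MAX_TOKENS_CAP))

# A request for 5,000,000 tokens is silently reduced to the cap.
assert clamp_max_tokens(5_000_000) == MAX_TOKENS_CAP
assert clamp_max_tokens(0) == MIN_TOKENS_CAP
assert clamp_max_tokens(256) == 256
```

If the demo UI exposes the value through an input widget, setting that widget's own maximum as well would stop oversized requests before they ever reach the backend.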