sagemathinc / cocalc

CoCalc: Collaborative Calculation in the Cloud
https://CoCalc.com

llm: ridiculously high spend limit #7505

Closed by haraldschilly 3 months ago

haraldschilly commented 3 months ago

Today I noticed that the possible spend limits for LLMs are very high. This only creates the impression that they are very expensive, which they are not.

[Screenshot from 2024-04-30 09:43:46]

novoselt commented 3 months ago

That was part of the solution to limits that were too low ($5, for example, is not quite enough for large-context models). I think the right solution is to get rid of these buttons completely AND have a sane default soft limit, like $20 for LLMs and $100 for compute servers.

williamstein commented 3 months ago

I think the right solution is to get rid of the limits entirely and instead have an optional feature that notifies the user in one or more configurable ways when they hit various overall spend thresholds. That is much simpler to use and understand, and is exactly what some clouds do. What we have now causes confusion and data loss.
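The notify-on-thresholds idea could be sketched roughly as follows. This is a hypothetical illustration, not CoCalc's actual code; the function name, parameters, and notification shape are all assumptions:

```typescript
// Hypothetical sketch: given a user's spend before and after a charge,
// return the configured thresholds that were just crossed, so that a
// notification fires exactly once per threshold instead of hard-blocking
// the user (avoiding the confusion and data loss of a hard limit).
function crossedThresholds(
  prevSpend: number,
  currentSpend: number,
  thresholds: number[],
): number[] {
  return thresholds.filter((t) => prevSpend < t && currentSpend >= t);
}

// Example: with thresholds at $10, $25, and $50, a charge that moves
// total spend from $15 to $45 should trigger only the $25 notification.
const toNotify = crossedThresholds(15, 45, [10, 25, 50]);
console.log(toNotify); // [25]
```

A caller would record the new spend total after invoking this, so the same threshold is never reported twice.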

novoselt commented 3 months ago

Related #7506

williamstein commented 3 months ago

Instead do #7506