raidendotai / cofounder

ai-generated apps, full stack + generative UI
https://cofounder.openinterface.ai/
MIT License

Hitting rate limits pretty quickly #13

Open dctmfoo opened 4 days ago

dctmfoo commented 4 days ago

Need a way to build a prototype first, because the PRD looks extensive and most of the features are probably not required during the first run.

text: >-
  Project 'quizfare':
  An app where users can upload or paste a document, generating a set of 15 to 30
  questions with keys, exportable to Google Forms.
attachments: []
design:
  aesthetics:
    text: >-
      It has a two-panel layout with a content area on the left for adding quiz
      material and a preview area on the right where generated questions appear.
      Users can type text, upload a file, or enter a URL to create a quiz. There are
      options to select question types (like multiple-choice), difficulty level, and
      the number of questions. A Create Quiz button is present but disabled until
      content is added. The interface has a clean, purple-themed design.
    timestamp: 1730413415757


rickylenon commented 4 days ago

Same here, can't even finish a simple project.

serkandyck commented 4 days ago

same

bibop commented 3 days ago

Same. Is there a way to run COFOUNDER using a local LLM like Ollama or LM Studio?

PierrunoYT commented 3 days ago

OpenRouter pls

seoguypt commented 2 days ago

Same here, the rate limit for Anthropic is too low.

Sinopsys commented 2 days ago

same.

QRMarketing commented 1 day ago

Same. I sent a message to sales to see if the limits can be increased. Perhaps wait a day and resume when the daily limits reset?

czd commented 5 hours ago

It seems like the rate limits are per model. Is it possible to switch which model is being used to keep on going?

czd commented 4 hours ago

Can confirm: I updated the model in api/utils/anthropic.js to use a newer version of the Claude 3.5 Sonnet model and it kept working. If I use up my tokens again, I'll switch to yet another model. Looks like the limits are per model.
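
For anyone else hitting this, here is a rough sketch of the kind of change I mean, assuming api/utils/anthropic.js wraps the official @anthropic-ai/sdk client (the actual file in this repo may be structured differently, and the model IDs below are Anthropic's public snapshot names, not anything cofounder-specific):

```js
// Sketch of an api/utils/anthropic.js-style wrapper, not the repo's actual file.
const Anthropic = require("@anthropic-ai/sdk");

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

// Rate limits appear to be tracked per model, so switching the model ID
// effectively gives you a fresh budget. For example, swap the older
// Claude 3.5 Sonnet snapshot for the newer one:
// const MODEL = "claude-3-5-sonnet-20240620";
const MODEL = "claude-3-5-sonnet-20241022";

async function complete(messages) {
  const response = await anthropic.messages.create({
    model: MODEL,
    max_tokens: 4096,
    messages, // e.g. [{ role: "user", content: "..." }]
  });
  // First content block is the text reply for plain-text responses.
  return response.content[0].text;
}

module.exports = { complete };
```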

QRMarketing commented 4 hours ago

> Can confirm: I updated the model in api/utils/anthropic.js to use a newer version of the Claude 3.5 Sonnet model and it kept working. If I use up my tokens again, I'll switch to yet another model. Looks like the limits are per model.

I just got my limits raised to tier 4. Contact sales; they seem quite happy to do this, for obvious reasons.

UmutKeremOzen commented 2 hours ago

Thanks for the update on the solution.

UmutKeremOzen commented 2 hours ago

> Same. Is there a way to run COFOUNDER using a local LLM like Ollama or LM Studio?

It is not possible with the repo's current methodology. However, there are people willing to migrate this project to Python, which would be a great stepping stone toward using local models, since input and output token prices would become a headache in the long run.
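
For anyone who wants to experiment outside of cofounder in the meantime: Ollama exposes an OpenAI-compatible endpoint, so the general approach in Node would look something like the sketch below. This is not wired into this repo, and the model name is just whatever you have pulled locally.

```js
// Generic sketch: talk to a local Ollama server via its OpenAI-compatible API.
// Not supported by cofounder today; shown only to illustrate the approach.
const OpenAI = require("openai");

const client = new OpenAI({
  baseURL: "http://localhost:11434/v1", // Ollama's OpenAI-compatible endpoint
  apiKey: "ollama", // any non-empty string; Ollama ignores it
});

async function localComplete(prompt) {
  const response = await client.chat.completions.create({
    model: "llama3.1", // replace with a model you have pulled locally
    messages: [{ role: "user", content: prompt }],
  });
  return response.choices[0].message.content;
}
```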