dctmfoo opened this issue 4 days ago
Same here, can't even finish a simple project.
same
Same. Is there a way to run COFOUNDER using a local LLM like Ollama or LM Studio?
OpenRouter pls
Same here, the Anthropic rate limit is too low.
same.
Same, I sent a message to sales to see if the limits can be increased. Perhaps wait a day and resume when the daily limits reset?
It seems like the rate limits are per model. Is it possible to switch which model is being used to keep going?
Can confirm, I updated the model in `api/utils/anthropic.js` to use a newer version of the Claude 3.5 Sonnet model and it kept working. If I use up my tokens again I'll switch to yet another model. Looks like the limits are per model.
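For anyone trying the same workaround, here is a minimal sketch of the kind of change involved. This is not the repo's actual file contents; it just assumes the utility wraps the official Anthropic Node SDK, and the model ids are examples (check Anthropic's docs for currently available ones):

```js
// Minimal sketch only, assuming the utility wraps the official Anthropic Node SDK.
import Anthropic from "@anthropic-ai/sdk";

const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

// Swap the model id when one model's daily token limit is exhausted,
// e.g. from "claude-3-5-sonnet-20240620" to "claude-3-5-sonnet-20241022".
const MODEL = process.env.ANTHROPIC_MODEL || "claude-3-5-sonnet-20241022";

export async function complete(messages) {
  const response = await anthropic.messages.create({
    model: MODEL,
    max_tokens: 4096,
    messages, // [{ role: "user", content: "..." }, ...]
  });
  return response.content[0].text;
}
```

Reading the model id from an env var keeps the switch to a one-line change the next time a limit is hit.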
I just got my limits raised to Tier 4. Contact sales; they seem quite happy to do this, for obvious reasons.
Thanks for the update on that solution.
> Same. Is there a way to run COFOUNDER using a local LLM like Ollama or LM Studio?
It is not possible with the repo's current methodology. However, there are people willing to migrate this project to Python, which would be a great stepping stone to using local models as the LLMs, since the input and output token prices would be a headache in the long run.
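Just to illustrate the kind of stepping stone being described: Ollama already exposes an OpenAI-compatible endpoint locally, so a migrated codebase could in principle point an OpenAI-style client at it. This is a hypothetical sketch, not something the current repo supports, and the model name assumes you've already pulled it with Ollama:

```js
// Hypothetical sketch only: not supported by the current repo.
// Ollama serves an OpenAI-compatible API at http://localhost:11434/v1.
import OpenAI from "openai";

const local = new OpenAI({
  baseURL: "http://localhost:11434/v1",
  apiKey: "ollama", // Ollama ignores the key, but the SDK requires a value
});

const response = await local.chat.completions.create({
  model: "llama3.1", // assumes `ollama pull llama3.1` was run beforehand
  messages: [{ role: "user", content: "Say hello" }],
});
console.log(response.choices[0].message.content);
```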
Need a way to build just a prototype, because the PRD looks extensive and many of the features probably aren't required during the first run.