BerriAI / litellm

Python SDK, Proxy Server (LLM Gateway) to call 100+ LLM APIs in OpenAI format - [Bedrock, Azure, OpenAI, VertexAI, Cohere, Anthropic, Sagemaker, HuggingFace, Replicate, Groq]
https://docs.litellm.ai/docs/

Having this error while using crawl4ai #6560

Open Trail87 opened 3 weeks ago

Trail87 commented 3 weeks ago

```
Give Feedback / Get Help: https://github.com/BerriAI/litellm/issues/new
LiteLLM.Info: If you need to debug this error, use `litellm.set_verbose=True`.

Rate limit error: litellm.RateLimitError: RateLimitError: OpenAIException - Error code: 429 - {'error': {'message': 'You exceeded your current quota, please check your plan and billing details. For more information on this error, read the docs: https://platform.openai.com/docs/guides/error-codes/api-errors.', 'type': 'insufficient_quota', 'param': None, 'code': 'insufficient_quota'}} Waiting for 4 seconds before retrying...
```
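For context, the log line above suggests `litellm.set_verbose=True` for debugging. A minimal sketch of what that looks like around a single completion call in Colab; the model name, prompt, and the placeholder API key are assumptions, not taken from the tutorial:

```python
import os
import litellm

# Assumption: your OpenAI key goes here (or is already set in the environment).
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder

litellm.set_verbose = True  # print raw request/response details for debugging

try:
    response = litellm.completion(
        model="gpt-3.5-turbo",  # placeholder model; use whatever the tutorial uses
        messages=[{"role": "user", "content": "Hello"}],
    )
    print(response.choices[0].message.content)
except litellm.RateLimitError as e:
    # A 429 with code 'insufficient_quota' means the account has no remaining
    # credit, not that you are sending requests too fast.
    print("Quota/rate limit error:", e)
```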

I'm trying to build a web scraper in Google Colab by copying the 1Littlecoder YouTube tutorial (https://www.youtube.com/watch?v=wTy0cDqRxeQ&t=616s). Can you guide me on how to solve this? I've never done any coding and am new to Python and AI.

superpoussin22 commented 3 weeks ago

It means you're trying to access a model for which you don't have quota. Can you check your quotas, i.e. how many requests per minute you can make and how many tokens per minute you can consume?
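To add to that: a 429 can mean two different things and they need different fixes. If it's a genuine per-minute rate limit, retrying with backoff helps (litellm's `completion` accepts a `num_retries` parameter for this). If the error body says `insufficient_quota`, as in the traceback above, retrying won't help; the account needs credit or a billing fix on platform.openai.com. A rough sketch of telling the two apart, with placeholder model and prompt:

```python
import litellm

try:
    response = litellm.completion(
        model="gpt-3.5-turbo",  # placeholder
        messages=[{"role": "user", "content": "Hello"}],
        num_retries=3,  # retry transient 429s automatically
    )
except litellm.RateLimitError as e:
    if "insufficient_quota" in str(e):
        # Not a per-minute limit: the account is out of credit.
        print("Out of quota - check your plan and billing details")
    else:
        # A real RPM/TPM limit: slow down or request higher limits.
        print("Rate limited - reduce request rate or raise your limits")
```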