openai / openai-cookbook

Examples and guides for using the OpenAI API
https://cookbook.openai.com

Website Q&A bot crawler: Rate limit and adjust crawl speed #134

Closed meatinggroup closed 1 year ago

meatinggroup commented 1 year ago

Hi Team,

I have been trying to build a chatbot for my website.

It seems that I have hit a rate limit on the OpenAI API. How do I solve this issue?

web-qa.py:167: FutureWarning: The default value of regex will change from True to False in a future version.
  serie = serie.str.replace('\n', ' ')
Traceback (most recent call last):
  File "web-qa.py", line 285, in <module>
    df['embeddings'] = df.text.apply(lambda x: openai.Embedding.create(input=x, engine='text-embedding-ada-002')['data'][0]['embedding'])
  File "/root/env/lib/python3.8/site-packages/pandas/core/series.py", line 4771, in apply
    return SeriesApply(self, func, convert_dtype, args, kwargs).apply()
  File "/root/env/lib/python3.8/site-packages/pandas/core/apply.py", line 1105, in apply
    return self.apply_standard()
  File "/root/env/lib/python3.8/site-packages/pandas/core/apply.py", line 1156, in apply_standard
    mapped = lib.map_infer(
  File "pandas/_libs/lib.pyx", line 2918, in pandas._libs.lib.map_infer
  File "web-qa.py", line 285, in <lambda>
    df['embeddings'] = df.text.apply(lambda x: openai.Embedding.create(input=x, engine='text-embedding-ada-002')['data'][0]['embedding'])
  File "/root/env/lib/python3.8/site-packages/openai/api_resources/embedding.py", line 33, in create
    response = super().create(*args, **kwargs)
  File "/root/env/lib/python3.8/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/root/env/lib/python3.8/site-packages/openai/api_requestor.py", line 227, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/root/env/lib/python3.8/site-packages/openai/api_requestor.py", line 620, in _interpret_response
    self._interpret_response_line(
  File "/root/env/lib/python3.8/site-packages/openai/api_requestor.py", line 680, in _interpret_response_line
    raise self.handle_error_response(
openai.error.RateLimitError: Rate limit reached for default-global-with-image-limits in organization org-Z1lI2wI8wQYzDBcaQnfYIAaY on requests per min. Limit: 60.000000 / min. Current: 70.000000 / min. Contact support@openai.com if you continue to have issues. Please add a payment method to your account to increase your rate limit. Visit https://platform.openai.com/account/billing to add a payment method.
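The error shows 70 requests/min against a 60 requests/min limit, so one straightforward workaround is to throttle the embedding calls. A minimal sketch, assuming the legacy pre-1.0 `openai` Python SDK used in web-qa.py; the `embed_with_delay` helper name and the 1.1-second delay are my own assumptions, not part of the tutorial:

```python
import time

import openai

def embed_with_delay(text, delay_seconds=1.1):
    # Call the embeddings endpoint, then pause so the overall request rate
    # stays around ~55 requests/min, below the 60/min cap in the error above.
    result = openai.Embedding.create(input=text, engine='text-embedding-ada-002')
    time.sleep(delay_seconds)
    return result['data'][0]['embedding']

# Replaces the lambda on line 285 of web-qa.py with the throttled helper.
df['embeddings'] = df.text.apply(embed_with_delay)
```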

Also, is there a way to slow down the crawler? The website's WAF might block this activity if it detects too many requests.
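For the crawler side, a fixed pause between page fetches is usually enough to stay under a WAF's radar. A minimal sketch of that idea, assuming a simple breadth-first crawl loop like the one in the tutorial; the seed URL, the 2-second delay, and the variable names here are hypothetical:

```python
import time
from collections import deque

import requests

CRAWL_DELAY_SECONDS = 2  # hypothetical value: tune to what the site's WAF tolerates

queue = deque(["https://www.example.com/"])  # hypothetical seed URL
while queue:
    url = queue.popleft()
    response = requests.get(url)  # fetch the page (link extraction omitted for brevity)
    # ... save response.text and enqueue newly discovered links here ...
    time.sleep(CRAWL_DELAY_SECONDS)  # pause so requests arrive at a steady, polite rate
```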

logankilpatrick commented 1 year ago

Hey! The crawler is not what is causing the rate limit; it is the subsequent requests to our API that are rate limited. I suggest reading the rate limit guide we have to get a better sense of how to resolve this issue: https://platform.openai.com/docs/guides/rate-limits
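The approach that guide recommends is retrying with exponential backoff rather than failing on the first RateLimitError. A minimal sketch using the `tenacity` library with the legacy pre-1.0 `openai` SDK from the traceback above; the `get_embedding` wrapper name is hypothetical, and `df` is the DataFrame from web-qa.py:

```python
import openai
from tenacity import retry, stop_after_attempt, wait_random_exponential

# Retry the embedding call with randomized exponential backoff (1s up to 60s),
# giving up after 6 attempts if the rate limit error keeps recurring.
@retry(wait=wait_random_exponential(min=1, max=60), stop=stop_after_attempt(6))
def get_embedding(text):
    response = openai.Embedding.create(input=text, engine='text-embedding-ada-002')
    return response['data'][0]['embedding']

df['embeddings'] = df.text.apply(get_embedding)
```

Backoff handles transient bursts, but if the script consistently needs more than 60 requests/min, adding a payment method (as the error message says) or batching inputs per request is the longer-term fix.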