deedy5 / duckduckgo_search

Search for words, documents, images, videos, news, maps, and text translation using the DuckDuckGo.com search engine. Download files and images to a local hard drive.

Ratelimit exception when searching for more than 10 keywords #195

Closed: maxlorax closed this issue 3 months ago

maxlorax commented 3 months ago

Hello, I am still encountering Ratelimit exceptions when trying to search for more than 10 queries at a time (in a for loop). I have found a workaround, calling time.sleep(0.4) on every iteration, but there has to be a more elegant way, right?

import time

from duckduckgo_search import DDGS
from tqdm import tqdm

# queries, max_num_of_results, and results are defined earlier in my code.
for query in tqdm(queries, desc="Searching for queries"):
    # Materialize the hits for this query (wrapping in list() also covers
    # versions where text() returns a generator).
    hits = list(DDGS().text(keywords=query, max_results=max_num_of_results, safesearch="moderate"))
    for result in hits:
        print(result["href"])

    results += [entry["href"] for entry in hits]

    # Workaround: throttle between queries to avoid the Ratelimit exception.
    time.sleep(0.4)
File "/venv/lib/python3.11/site-packages/duckduckgo_search/duckduckgo_search_async.py", line 94, in _aget_url
    raise DuckDuckGoSearchException("Ratelimit")
duckduckgo_search.exceptions.DuckDuckGoSearchException: Ratelimit
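
For reference, a minimal sketch of a more systematic version of the sleep workaround: catch the DuckDuckGoSearchException shown in the traceback above and retry with exponential backoff. The retry count and base delay are illustrative assumptions, not values from this thread.

import time

from duckduckgo_search import DDGS
from duckduckgo_search.exceptions import DuckDuckGoSearchException

def search_with_backoff(query, max_results, retries=5, base_delay=0.5):
    # Retry the search, doubling the pause after each Ratelimit error.
    for attempt in range(retries):
        try:
            return DDGS().text(keywords=query, max_results=max_results, safesearch="moderate")
        except DuckDuckGoSearchException:
            time.sleep(base_delay * 2 ** attempt)
    raise DuckDuckGoSearchException("Ratelimit: retries exhausted")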
deedy5 commented 3 months ago

1) Just use a proxy. The API response is very small, about 15 KB, so 1 GB of traffic covers more than 50,000 responses. (A minimal sketch of this option follows after item 2.)

2) You can also try another backend, especially if you need no more than 23 results:

temp = DDGS().text(keywords=query, max_results=max_num_of_results, safesearch="moderate", backend="html")
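
A minimal sketch of option 1, assuming a duckduckgo_search version whose DDGS constructor accepts a proxies argument; the proxy URL is a placeholder, not a real endpoint.

from duckduckgo_search import DDGS

# Route requests through a proxy so the rate limit applies to the proxy's IP.
# The proxy URL is a placeholder; the proxies parameter name may differ
# between library versions.
ddgs = DDGS(proxies="socks5://user:pass@proxy.example.com:1080")
rows = ddgs.text(keywords="python", max_results=20, safesearch="moderate")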

maxlorax commented 3 months ago

Thank you very much my guy! I'll be looking into the first option you suggested because I need to perform bursts of about 50-100 searches.

iamaziz commented 3 months ago

Thank you. I came here because I have the same RateLimit issue. Using backend='html' solved the problem.