lushan88a / google_trans_new

A free and unlimited python API for google translate.
MIT License

429 (Too Many Requests) from TTS API. Probable cause: Unknown #28

Open · uplinux opened this issue 3 years ago

uplinux commented 3 years ago

When I use the API, I get this error: ... google_trans_new.google_trans_new.google_new_transError: 429 (Too Many Requests) from TTS API. Probable cause: Unknown

jmagdeska commented 3 years ago

I get the same error. Is there any solution?

WDZEthan commented 3 years ago

Same error. I am using it for batch (size=50) translation from English to Chinese.

Mat37 commented 3 years ago

Same here; I have to change my IP using a VPN quite often.

TKM6403 commented 3 years ago

Is there any word on the actual limit of translation? Is it in terms of character count, entity count, ...? This would help greatly in writing retry-after loops.
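
Since nobody in this thread has pinned the limit down, the practical answer so far is a retry loop with backoff. A minimal sketch, assuming the exception class path shown in the traceback at the top of this thread (the starting delay and retry count are arbitrary):

```python
import time

from google_trans_new import google_translator
# exception class path taken from the traceback at the top of this thread
from google_trans_new.google_trans_new import google_new_transError

translator = google_translator()

def translate_with_backoff(text, lang_tgt='en', max_tries=5):
    delay = 2  # arbitrary starting wait; doubled after every failure
    for attempt in range(max_tries):
        try:
            return translator.translate(text, lang_tgt=lang_tgt)
        except google_new_transError:
            if attempt == max_tries - 1:
                raise  # give up after max_tries failures
            time.sleep(delay)
            delay *= 2
```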

noman00910 commented 3 years ago

I found a solution, though I don't know why it works. But it is working for me: add a random valid proxy from the internet to the google_translator function, as the library allows. After executing the script once (if it works), remove that proxy and the library will work as it did before.

I don't know why, but this method works for me every time I get blocked by the TTS API.

TKM6403 commented 3 years ago

Can you give us a short description of what you mean, using some code? I am a bit confused about how to add the random proxy.

noman00910 commented 3 years ago

translator = google_translator(url_suffix="hk", timeout=5,
                               proxies={'http': 'xxx.xxx.xxx.xxx:xxxx',
                                        'https': 'xxx.xxx.xxx.xxx:xxxx'})

When you call the google_translator function, add some random proxy that is valid and working; you can find one on the internet. If the proxy is valid, the library will work. After that, remove it from the function and it will work fine.

I tried my dedicated proxies (paid proxies) and those are working in this function.
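
For anyone who wants the two-step trick spelled out, here is a minimal sketch of it; the proxy address is a placeholder you would replace with a working one, and the sample text and target language are just examples:

```python
from google_trans_new import google_translator

# step 1: run once through a valid proxy (replace the placeholder address)
translator = google_translator(url_suffix="hk", timeout=5,
                               proxies={'http': 'xxx.xxx.xxx.xxx:xxxx',
                                        'https': 'xxx.xxx.xxx.xxx:xxxx'})
print(translator.translate('hello', lang_tgt='fr'))

# step 2: if that worked, drop the proxies argument and continue as before
translator = google_translator(url_suffix="hk", timeout=5)
print(translator.translate('hello', lang_tgt='fr'))
```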

TKM6403 commented 3 years ago

Where can I find a random proxy? I'm not familiar with this. Any help would be appreciated.

TKM6403 commented 3 years ago

ConnectTimeout: HTTPSConnectionPool(host='translate.google.cn', port=443): Max retries exceeded with url: /_/TranslateWebserverUi/data/batchexecute (Caused by ConnectTimeoutError(<urllib3.connection.VerifiedHTTPSConnection object at 0x000001C44C912730>, 'Connection to (proxy) timed out. (connect timeout=5)')).

How do I avoid this?

Wikidepia commented 3 years ago

ConnectTimeout: HTTPSConnectionPool(host='translate.google.cn', port=443): Max retries exceeded with url: /_/TranslateWebserverUi/data/batchexecute (Caused by ConnectTimeoutError(<urllib3.connection.VerifiedHTTPSConnection object at 0x000001C44C912730>, 'Connection to (proxy) timed out. (connect timeout=5)')).

How do I avoid this?

Try increasing the timeout.
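
In code, that just means raising the timeout argument when constructing the translator; 30 seconds below is an arbitrary choice, and the proxy address is the same kind of placeholder as above:

```python
from google_trans_new import google_translator

# give a slow proxy more time to respond than the timeout=5 used earlier
translator = google_translator(timeout=30,
                               proxies={'http': 'xxx.xxx.xxx.xxx:xxxx',
                                        'https': 'xxx.xxx.xxx.xxx:xxxx'})
```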

Mat37 commented 3 years ago

Where can I find a random proxy? Not familiar with this. Any help would be appreciated.

Have you gotten a proxy? I'm searching for one too.

noman00910 commented 3 years ago

Search the internet for "free http proxy" and you will find a number of websites that provide lists of proxies (these are public proxies with no security).

noman00910 commented 3 years ago

You can also get 10 free proxies after signing up on this website: https://www.webshare.io/
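
If you end up with a list of such proxies, one way to use them is to pick one at random per call. A sketch with made-up (TEST-NET) addresses; free proxies fail often, so expect timeouts:

```python
import random

from google_trans_new import google_translator

# hypothetical addresses copied from one of the free-proxy lists mentioned above
PROXY_POOL = ['203.0.113.5:8080', '198.51.100.7:3128']

def translate_via_random_proxy(text, lang_tgt='en'):
    proxy = random.choice(PROXY_POOL)
    translator = google_translator(timeout=10,
                                   proxies={'http': proxy, 'https': proxy})
    return translator.translate(text, lang_tgt=lang_tgt)
```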

jmagdeska commented 3 years ago

Search the internet for "free http proxy" and you will find a number of websites that provide lists of proxies (these are public proxies with no security).

I tried this approach, but now I'm getting another error: Failed to connect. Probable cause: timeout

Although I set quite a long timeout. Are you facing the same problem?

noman00910 commented 3 years ago

Search the internet for "free http proxy" and you will find a number of websites that provide lists of proxies (these are public proxies with no security).

I tried this approach, but now I'm getting another error: Failed to connect. Probable cause: timeout

Although I set quite a long timeout. Are you facing the same problem?

That is because the proxy you are using may not have a good reputation. Try using your own dedicated proxy; you can get 10 free dedicated proxies on signing up at https://www.webshare.io/.

bjquinniii commented 3 years ago

Slightly different approach... I was getting locked out for "too many requests" (it does eventually reset; 8 hours or so, if I remember correctly). I just slowed down my requests by throwing a 2-second time delay into the loop. It is taking a bit longer, but I am no longer getting the errors, and I'm currently doing 10,000 requests per batch.

TKM6403 commented 3 years ago

@bjquinniii Can you leave a sample of the code that incorporates the delay? That would be extremely helpful.

bjquinniii commented 3 years ago

It's just a matter of sticking a time.sleep(2) statement in...

So for mine it looks something like this:

from bs4 import BeautifulSoup as bs
from google_trans_new import google_translator
import mysql.connector as dbconnector
import time

I have a db table with HTML code I've scraped from a site; the relevant text is in a big div block within another div.

translator = google_translator()  # translator instance used in the loop below

dbConn = dbconnector.connect(host='localhost', port=3306, user='XXX',
                             password='XXX', database='text', charset='utf8')
dbCur1 = dbConn.cursor()
dbCur2 = dbConn.cursor()

sql = 'select postID, divBlock from diaryPosts limit 10000;'
dbCur1.execute(sql)
singles = dbCur1.fetchall()
dbCur1.close()
for s in singles:
    postID = s[0]
    divBlock = s[1]
    soup = bs(divBlock, 'html.parser')
    postTextObj = soup.find('div', {'class': 'texto-dou'})
    postText = postTextObj.text.strip()
    postTextTrans = translator.translate(postText, lang_tgt='en')
    sql = 'update diaryPosts set postText = %s, postTextTrans = %s where postID = %s;'
    dbCur2.execute(sql, (postText, postTextTrans, postID))
    dbConn.commit()
    time.sleep(2)  # the delay that keeps the 429s away

Anyway, that's probably more code than you were asking for, but I figure it will stave off other potential questions. The part you need is the time.sleep(2); I'm using it to slow down the loop that is making the calls.

bjquinniii commented 3 years ago

OK, lesson learned: if you put a hash symbol in to denote a comment, it shows up as large bold text. Sorry about that.

TKM6403 commented 3 years ago

@bjquinniii Haha no problem. Thank you so much for your code response. I think this will help a lot of us.

bjquinniii commented 3 years ago

I have noticed something odd about using this library/service: it is a little flaky sometimes. It can run flawlessly for hours (the current script just rolled past 6900 of 10,000 without any errors), but sometimes I will come back and there will be some sort of odd random error. When that happens, I'll just restart it, and even though the way the query is written means it restarts at exactly the entry that errored out, it will pick up, succeed, and roll on fine. I guess eventually I'll add a try block around the translate call, but so far it's happening infrequently enough that I haven't gotten around to it. ... just rolled past 7000. So far I have translated a bit over 43,000 blocks of text on this particular project; I used the older version last fall to do about 180,000 on another project.
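
For reference, the try block mentioned above might look something like this; the exception class path comes from the traceback at the top of the thread, and the retry count and wait time are guesses:

```python
import time

from google_trans_new.google_trans_new import google_new_transError

def safe_translate(translator, text, lang_tgt='en', retries=3, wait=10):
    # retry the occasional random failure instead of killing the whole batch
    for attempt in range(retries):
        try:
            return translator.translate(text, lang_tgt=lang_tgt)
        except google_new_transError as err:
            print(f'attempt {attempt + 1} failed: {err}')
            time.sleep(wait)
    raise RuntimeError(f'translation failed after {retries} attempts')
```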

uplinux commented 3 years ago

Thanks, but when I use a VPN, I get the same error. So I tried using my phone to share its Wi-Fi with the computer, and it runs...

B-Yassine commented 3 years ago

Same problem here! Did you guys find a solution?

cj2001 commented 3 years ago

Same problem for me (429 (Too Many Requests) from TTS API. Probable cause: Unknown).

LostInDarkMath commented 3 years ago

Same problem here.

ghost commented 3 years ago

Getting the same error.

emrecan-balgun commented 3 years ago

Getting the same error.

YasineNifa commented 3 years ago

Hello guys, did you find a solution to this?

SquareKudu commented 2 years ago

When I use the API, I get this error: ... google_trans_new.google_trans_new.google_new_transError: 429 (Too Many Requests) from TTS API. Probable cause: Unknown

I would just get a VPN; it will let you change your IP address when you have made too many requests.