argasi / google-bigquery

Automatically exported from code.google.com/p/google-bigquery

Bunch of 503 errors while streaming using the Python API #193

Closed by GoogleCodeExporter 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Stream rows using the Python API's insertAll() method (a minimal sketch of the call follows below). 
2. It works fine for a few minutes, and then we get a bunch of 503s: connection error exceptions. 
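For reference, here is roughly what the streaming call looks like on our side. The row contents and the application-default credential setup are illustrative placeholders (the original setup is not shown in this report); the project, dataset, and table IDs are the ones mentioned below:

    # Rough sketch of the streaming setup (row contents are illustrative).
    import httplib2
    from googleapiclient.discovery import build  # 'apiclient.discovery' in older client releases
    from oauth2client.client import GoogleCredentials

    # Placeholder credential setup using application-default credentials.
    credentials = GoogleCredentials.get_application_default()
    http = credentials.authorize(httplib2.Http())
    bigquery_service = build('bigquery', 'v2', http=http)

    rows = [
        {'insertId': 'row-1', 'json': {'event': 'app_open', 'ts': '2014-11-20T19:41:21Z'}},
    ]

    # Works for a few minutes, then starts raising HttpError 503 (connection errors).
    response = bigquery_service.tabledata().insertAll(
        projectId='513990544911',
        datasetId='amp_126513',
        tableId='126513_feed',
        body={'rows': rows},
    ).execute()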

What is the expected output? What do you see instead?
A success response; instead we see an exception.

What version of the product are you using? On what operating system?
BigQuery Python API on an Ubuntu box.

Please provide any additional information below.

Project_ID : 513990544911
We are writing into a number of datasets and tables. One random 503 was for the dataset 
'amp_126513', tableId '126513_feed', around 2014-11-20 19:41:21 UTC.

We were also seeing a bunch of 500s and 502s yesterday; retrying worked most of the time, 
but the error rate was higher than normal. 

Additional question: can the bigquery_service object (build('bigquery', 'v2', 
http=http)) be reused? Does it need to be refreshed often?

Thanks. 

Original issue reported on code.google.com by nir...@amplitude.com on 20 Nov 2014 at 8:15

GoogleCodeExporter commented 9 years ago
I continue to see a bunch of 500s this morning. 
One example: <HttpError 500 when requesting https://www.googleapis.com/bigquery/v2/projects/513990544911/datasets/amp_342/tables/342_application___hlapplicationlifecycleobserver_onappwillresignactive_observer_withpriority__/insertAll?alt=json returned "Backend Error">

Can someone help me diagnose whether something is incorrect on my side or on 
BigQuery's side? If it is not on my side, can I avoid this exception by making some 
changes?

Thanks, 
Nirmal

Original comment by nir...@amplitude.com on 24 Nov 2014 at 7:59

GoogleCodeExporter commented 9 years ago
Sorry for the late response.
500 errors are mostly transient and caused by internal service hiccups. 
Catching the errors on your side and retrying is usually sufficient to mitigate the 
problem (see the sketch below). 
More information:
https://cloud.google.com/bigquery/streaming-data-into-bigquery#troubleshooting
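For example, a minimal retry sketch along those lines (the helper name, backoff parameters, and error handling are illustrative, not an official recommendation):

    import random
    import time

    from googleapiclient.errors import HttpError

    RETRYABLE_STATUSES = {500, 502, 503}

    def insert_all_with_retry(bigquery_service, project_id, dataset_id, table_id,
                              rows, max_attempts=5):
        """Call tabledata().insertAll(), retrying transient 5xx errors with backoff."""
        for attempt in range(max_attempts):
            try:
                response = bigquery_service.tabledata().insertAll(
                    projectId=project_id,
                    datasetId=dataset_id,
                    tableId=table_id,
                    body={'rows': rows},
                ).execute()
                # Per-row failures come back as 'insertErrors' on a 200 response,
                # so check for them separately (see the troubleshooting link above).
                if response.get('insertErrors'):
                    raise RuntimeError('insertAll row errors: %r' % response['insertErrors'])
                return response
            except HttpError as err:
                if err.resp.status not in RETRYABLE_STATUSES or attempt == max_attempts - 1:
                    raise
                # Exponential backoff with jitter before the next attempt.
                time.sleep((2 ** attempt) + random.random())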

I'll close this issue for now. Please feel free to reopen it if you have more 
questions.
Thanks!

Original comment by che...@google.com on 2 Jan 2015 at 11:51