GoogleCodeExporter opened this issue 8 years ago
Can you provide more detail about the failure? What are you trying to do, and
what errors are you seeing?
Original comment by jcon...@google.com
on 4 Mar 2016 at 4:43
We tried to insert 20k rows of data using the sample linked below:
https://github.com/GoogleCloudPlatform/csharp-docs-samples/blob/master/bigquery/api/GettingStarted/Program.cs
Now the program is throwing an out-of-memory exception. From SQL we are selecting the top 18,000 rows of data for each set of inserts. For 1k or 2k rows the test was successful and we can see the data in the console. For more than that, the data is not displayed in the console.
Code for validating and inserting data into BigQuery:
http://my.jetscreenshot.com/11825/20160306-g40j-128kb
JSON creation code:
http://my.jetscreenshot.com/11825/20160306-ji3l-106kb
Using the above code we have already inserted more than 10k rows, but now it is not working. We are not getting any errors while inserting the data; the job just completes. That is the issue we are experiencing now.
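One possible cause of the out-of-memory failure is building and sending all 18,000 rows in a single insertAll request. Splitting the result set into small batches keeps memory bounded and stays well within the API's per-request row limits. A minimal sketch of the batching idea (in Python purely for illustration, not the C# from the sample; the row contents here are invented):

```python
def chunked(rows, batch_size=500):
    """Yield successive batches of at most batch_size rows."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# 18,000 fake rows standing in for the SQL result set.
rows = [{"impression_id": i} for i in range(18000)]

# Each batch would be sent as one tabledata.insertAll request.
batches = list(chunked(rows))
print(len(batches))     # 36
print(len(batches[0]))  # 500
```

The batch size of 500 is an assumption for the sketch; the right value depends on row size, but the point is to send many small requests rather than one 18,000-row request.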
This is what we got when we created a trace file for the execution of the first 20 rows of data.
10 rows Information
**********************************************
Content-Google.Apis.Bigquery.v2.Data.TableDataInsertAllRequest
|Kind-bigquery#tableDataInsertAllRequest
|ProjectId-pragmatic-lead-112816
|DataSetId-ReportDB
|DestinationTable-Impression
Content.Rows-System.Collections.Generic.List`1[Google.Apis.Bigquery.v2.Data.TableDataInsertAllRequest+RowsData]
|Response-ETag"nwg3tKAm7RiC5vqWthFIuCNSGxs/lbjupGdW6_f2Sf9XhLr91b-fLf0"
|InsertErrors-ETagSystem.Collections.Generic.List`1[Google.Apis.Bigquery.v2.Data.TableDataInsertAllResponse+InsertErrorsData]
***************************************************
10 rows Information
**********************************************
Content-Google.Apis.Bigquery.v2.Data.TableDataInsertAllRequest
|Kind-bigquery#tableDataInsertAllRequest
|ProjectId-pragmatic-lead-112816
|DataSetId-ReportDB
|DestinationTable-Impression
Content.Rows-System.Collections.Generic.List`1[Google.Apis.Bigquery.v2.Data.TableDataInsertAllRequest+RowsData]
|Response-ETag"nwg3tKAm7RiC5vqWthFIuCNSGxs/lbjupGdW6_f2Sf9XhLr91b-fLf0"
|InsertErrors-ETagSystem.Collections.Generic.List`1[Google.Apis.Bigquery.v2.Data.TableDataInsertAllResponse+InsertErrorsData]
***************************************************
After executing this, we can see no data in the BigQuery console:
http://my.jetscreenshot.com/11825/20160306-cmqu-76kb
On the code side everything indicates that the data is being inserted into BigQuery perfectly, but we are losing the data somewhere in between.
Please help us to resolve this issue.
Original comment by nidhees...@mainadv.com
on 6 Mar 2016 at 9:04
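One detail worth checking in the trace above: tabledata.insertAll returns HTTP 200 even when individual rows are rejected, and per-row failures appear only in the insertErrors field of the response body, so a "successful" call can still silently drop rows. A hedged sketch of checking for them (Python for illustration; it operates on a dict shaped like the JSON insertAll response, and the sample failure data is invented):

```python
def failed_rows(response):
    """Collect (row_index, messages) for every row the API rejected.

    `response` is a dict shaped like a tabledata.insertAll response body:
    {"insertErrors": [{"index": N, "errors": [{"reason": ..., "message": ...}]}]}
    """
    failures = []
    for err in response.get("insertErrors", []):
        messages = [e.get("message", "") for e in err.get("errors", [])]
        failures.append((err["index"], messages))
    return failures

# Invented example response: row 2 failed schema validation.
resp = {
    "kind": "bigquery#tableDataInsertAllResponse",
    "insertErrors": [
        {"index": 2, "errors": [{"reason": "invalid",
                                 "message": "no such field: impresion_id"}]},
    ],
}
print(failed_rows(resp))  # [(2, ['no such field: impresion_id'])]
```

If this list is non-empty after an insert, the "missing" rows were rejected rather than lost.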
Can you please clarify:
* Is a BigQuery API command encountering an out-of-memory exception, or is this only in the client code?
* Are you encountering an issue with tabledata.insertAll, or with executing a query?
If it is an issue with inserting, can you please provide the names of the dataset and table? If it is an issue with a query, can you please provide the job ID of the query?
Original comment by bch...@google.com
on 7 Mar 2016 at 10:38
We are using BigQuery streaming data inserts, i.e. only in client code, in your words, I think.
Sample link used to test:
https://github.com/GoogleCloudPlatform/csharp-docs-samples/blob/master/bigquery/api/GettingStarted/Program.cs
Actually, when we insert 1 row or 1,000 rows of information, we don't get any errors on the code side, but the inserted rows are not showing in the project console (https://bigquery.cloud.google.com). That is, nothing is being inserted; somewhere we are losing the inserted data.
Project ID: pragmatic-lead-112816
Dataset name: ReportDB
Table name: Impression
Original comment by nidhees...@mainadv.com
on 8 Mar 2016 at 12:40
Content-Google.Apis.Bigquery.v2.Data.TableDataInsertAllRequest
|Kind-bigquery#tableDataInsertAllRequest
|ProjectId-pragmatic-lead-112816
|DataSetId-ReportDB
|DestinationTable-Impression
Content.Rows-System.Collections.Generic.List`1[Google.Apis.Bigquery.v2.Data.TableDataInsertAllRequest+RowsData]
|Response-ETag"nwg3tKAm7RiC5vqWthFIuCNSGxs/lbjupGdW6_f2Sf9XhLr91b-fLf0"
|InsertErrors-ETagSystem.Collections.Generic.List`1[Google.Apis.Bigquery.v2.Data.TableDataInsertAllResponse+InsertErrorsData]
Original comment by nidhees...@mainadv.com
on 8 Mar 2016 at 12:42
How are you checking for the data in the BigQuery UI
(https://bigquery.cloud.google.com)? For example, are you viewing the table
details? If so, tabledata.insertAll is a high-throughput API and the table's
details are not guaranteed to be updated immediately. However, data is expected
to be available and can be queried a few seconds after inserting. Can you try
executing a query, such as "SELECT COUNT(*) FROM [ReportDB.Impression];" to
verify that your data is available?
Additional details from the BigQuery documentation:
https://cloud.google.com/bigquery/streaming-data-into-bigquery#dataavailability
Original comment by bch...@google.com
on 8 Mar 2016 at 7:08
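The verification query suggested above can also be built programmatically. A small sketch (Python for illustration; the dataset and table names come from this thread, and the bracket syntax is legacy SQL as in the suggested query):

```python
def count_query(dataset, table):
    """Build the legacy-SQL row-count query suggested above."""
    return "SELECT COUNT(*) FROM [{}.{}];".format(dataset, table)

sql = count_query("ReportDB", "Impression")
print(sql)  # SELECT COUNT(*) FROM [ReportDB.Impression];

# Actually running it needs credentials. With the google-cloud-bigquery
# Python client (an assumption here -- the thread itself uses the C#
# Google.Apis client) it would look roughly like:
#   from google.cloud import bigquery
#   client = bigquery.Client(project="pragmatic-lead-112816")
#   cfg = bigquery.QueryJobConfig(use_legacy_sql=True)
#   print(list(client.query(sql, job_config=cfg).result()))
```

A non-zero count a few seconds after streaming confirms the rows arrived even if the table-details preview in the UI has not caught up yet.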
Original issue reported on code.google.com by
nidhees...@mainadv.com
on 4 Mar 2016 at 12:03