In the past month I have encountered a problem where "bq load" doesn't load all
the rows from large JSON files.
What steps will reproduce the problem?
--------------------------------------
1. Add another row at the end of the status log that reports the number of rows
loaded into the table.
This would let the user verify that all of the data was actually loaded (like a
checksum).
What should Google do?
----------------------
Could you please add, at the end of the load status output, a log line that
specifies the number of rows loaded?
For example, see below.
What is the expected output?
----------------------------
Waiting on bqjob_asdfghjjkkl_1 ... (49s) Current status: RUNNING
Waiting on bqjob_asdfghjjkkl_1 ... (50s) Current status: RUNNING
Waiting on bqjob_asdfghjjkkl_1 ... (50s) Current status: DONE
Loaded Rows: 145356
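In the meantime, a possible workaround is to read the row count from the completed load job's statistics: the BigQuery Jobs API exposes `statistics.load.outputRows`, which `bq show --format=prettyjson -j <job_id>` prints as JSON. A minimal sketch (the job ID and the sample statistics JSON below are illustrative, not real output):

```shell
# Sketch of the workaround. In practice the JSON would come from:
#   bq show --format=prettyjson -j bqjob_asdfghjjkkl_1 > job.json
# Here we use a hand-written sample with the same structure.
cat > job.json <<'EOF'
{"statistics": {"load": {"outputRows": "145356"}}}
EOF

# Extract statistics.load.outputRows and print it in the requested format.
python -c "import json; print('Loaded Rows:', json.load(open('job.json'))['statistics']['load']['outputRows'])"
# prints: Loaded Rows: 145356
```

This only helps after the job finishes, which is why having the count appended to the status log itself would be more convenient.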
What version of the product are you using? On what operating system?
--------------------------------------------------------------------
BigQuery CLI 2.0.22 on Windows 7 Professional.
Original issue reported on code.google.com by l...@jellybtn.com on 16 Mar 2016 at 11:52