Closed: vitaliel closed this issue 9 years ago.
Thanks again for opening the issue. We'll get right on it.
It's strange, I only succeeded with a CSV file smaller than 5_000_000 bytes, with 39250 rows.
Hi @vitaliel, and thank you for reporting this!
This `Broken pipe (Errno::EPIPE)` error that you appear to have encountered is a known issue, and it is the root cause of three of the 39 currently open issues in google-api-ruby-client (upon which Gcloud depends). They are:
A solution, documented in two of the issues above as well as in this Stack Overflow answer, is to add this line before your code (right after requiring `gcloud`). You will also need to add `httpclient` as a dependency in your project.

```ruby
Faraday.default_adapter = :httpclient
```
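To make the ordering concrete, here is a minimal sketch of how the workaround fits together; the Gemfile line and the require order are assumptions based on the advice above rather than code taken from this thread:

```ruby
# Gemfile (assumption): the httpclient gem must be available for Faraday to use it.
#   gem "httpclient"

require "gcloud"      # loads Faraday transitively via google-api-ruby-client
require "httpclient"

# Set the adapter before making any API calls, so uploads go through
# httpclient instead of Faraday's default adapter.
Faraday.default_adapter = :httpclient
```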
Can you give this a try and let us know if it solves the problem? If so, I will add the solution to the documentation for `Table#load`, and close this issue.
Thank you @blowmage for providing the background story on this.
@quartzmo Thanks, it worked.
@vitaliel Great. I will add documentation of this issue to the API doc for `Table#load`, and to a Cloud Storage method where the same error can occur. Then I will close this issue. Thanks again.
FYI, the updated docs will be included in the next point release (0.3.1), but the release after that (0.4.0) will most likely switch dependencies from Faraday to Hurley, meaning this guidance will change. Hopefully Hurley will be an improvement on Faraday and not have this issue in the default provider. :)
Hi,
I'm trying to upload a 100 MB CSV file to BigQuery, but I get `Errno::EPIPE` errors.
Snippet:
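(The original snippet is not preserved in this thread; the following is only an illustrative sketch of a chunked load with the gcloud gem of that era, with the project ID, keyfile path, dataset, table, and file names invented for the example.)

```ruby
require "gcloud"

gcloud   = Gcloud.new "my-project-id", "/path/to/keyfile.json"
bigquery = gcloud.bigquery
dataset  = bigquery.dataset "my_dataset"
table    = dataset.table "my_table"

# Upload a local CSV in 1 MB chunks (chunk_size is in bytes).
table.load File.open("big_file.csv"), chunk_size: 1024 * 1024
```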
I get the error after 10 seconds, but if I do not pass `chunk_size`, it fails after 50 seconds.
Exception: