labbots / google-drive-upload

Bash scripts to upload files to Google Drive
https://labbots.github.io/google-drive-upload/
MIT License

Curl Error on finalizing upload #67

Closed · bauti-defi closed this issue 4 years ago

bauti-defi commented 4 years ago

Running gupload -o -D myFile.txt gDriveFolder produces the following stack trace:

++ curl --compressed -X PUT -H 'Authorization: Bearer private token' -H 'Content-Type: ' -H 'Content-Length: 21843581' -H 'Slug: ArgBonds.txt' -T ArgBonds.txt -o- --url 'https://www.googleapis.com/upload/drive/v3/files/1949LFiM5G-v00nTssaPSr3t5VE4PSYFv?uploadType=resumable&supportsAllDrives=true&supportsTeamDrives=true&upload_id=AAANsUlf0Ykqj1rqmoY6BADMxyW8jLsY8ctWWrREOoJQlRgoadswBSGge4m6HJXezA9qTxHePN_KlnF4K6lujs5IMrTeONVtDA' --globoff -#
##################################################################################################################################################################################################### 100.0%
curl: (92) HTTP/2 stream 1 was not closed cleanly: PROTOCOL_ERROR (err 1)
+ upload_body=
+ rm -f /tmp/tmp.HCe05w5o9vSUCCESS
+ rm -f /tmp/tmp.HCe05w5o9vERROR

I get this error when uploading a file to my Google Drive. It is intermittent, occurring roughly 50% of the time. I suspect it is rooted in the Content-Length of the curl request: it seems to happen on larger files (10 MB+) but not on smaller ones (~3 MB).
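Curl exit code 92 points at the HTTP/2 framing layer rather than the payload itself. A quick way to test that theory (a diagnostic sketch only; <token>, <file-id> and <upload-id> stand in for the real values from the trace above) is to replay the failing PUT while forcing HTTP/1.1, which curl 7.33+ supports via --http1.1:

curl --http1.1 --compressed -X PUT \
  -H 'Authorization: Bearer <token>' \
  -H 'Content-Length: 21843581' \
  -H 'Slug: ArgBonds.txt' \
  -T ArgBonds.txt -o- --globoff \
  --url 'https://www.googleapis.com/upload/drive/v3/files/<file-id>?uploadType=resumable&supportsAllDrives=true&upload_id=<upload-id>'

If the transfer succeeds consistently over HTTP/1.1, the PROTOCOL_ERROR lives in the HTTP/2 path and not in Content-Length.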

Akianonymus commented 4 years ago

@Bautista-Baiocchi-lora I am not able to reproduce the issue. I tried uploading a large 4 GB file and some smaller files (30 MB+) multiple times (about 10 each), but never got the error.

Can you share the file and tell me your curl version (curl -V)?

bauti-defi commented 4 years ago

@Akianonymus Here is the exact file: https://drive.google.com/file/d/1949LFiM5G-v00nTssaPSr3t5VE4PSYFv/view?usp=sharing

curl -V

curl 7.58.0 (x86_64-pc-linux-gnu) libcurl/7.58.0 OpenSSL/1.1.1 zlib/1.2.11 libidn2/2.0.4 libpsl/0.19.1 (+libidn2/2.0.4) nghttp2/1.30.0 librtmp/2.3
Release-Date: 2018-01-24
Protocols: dict file ftp ftps gopher http https imap imaps ldap ldaps pop3 pop3s rtmp rtsp smb smbs smtp smtps telnet tftp 
Features: AsynchDNS IDN IPv6 Largefile GSS-API Kerberos SPNEGO NTLM NTLM_WB SSL libz TLS-SRP HTTP2 UnixSockets HTTPS-proxy PSL 

Another note: I have the file upload scheduled as a crontab task (twice a day). Could this be a factor?

Akianonymus commented 4 years ago

@Bautista-Baiocchi-lora OK, I tried the file and it uploads fine. I have the same curl version and even the same platform (x86).

A crontab job should be fine, though.

Are you on the latest version? 🤔 If not, please update. Also give the exact command you are using and your crontab details.

By the way, you can also use sync.sh to sync a folder at specified intervals, in the background.
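For example, a minimal invocation (assumed here to mirror gupload's local-path-then-Drive-folder argument order; check gsync --help for the actual syntax):

gsync /home/bautista/Finance-Scrapers/data financialData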

bauti-defi commented 4 years ago

Results from gupload --info:

REPO="labbots/google-drive-upload"
COMMAND_NAME="gupload"
SYNC_COMMAND_NAME="gsync"
INSTALL_PATH="/home/bautista/.google-drive-upload/bin"
CONFIG="/home/bautista/.googledrive.conf"
TYPE="release"
TYPE_VALUE="latest"
SHELL_RC="/home/bautista/.bashrc"
LATEST_INSTALLED_SHA="v2.6"

Crontab:

5 12,17 * * 1-5 /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgBonds.txt financialData >> /home/bautista/Finance-Scrapers/backup.log 2>&1
5 12,17 * * 1-5 /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgOptions.txt financialData >> /home/bautista/Finance-Scrapers/backup.log 2>&1
5 12,17 * * 1-5 /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgFutures.txt financialData >> /home/bautista/Finance-Scrapers/backup.log 2>&1
5 12,17 * * 1-5 /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgStocks.txt financialData >> /home/bautista/Finance-Scrapers/backup.log 2>&1
5 12,17 * * 1-5 /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgCEDEARS.txt financialData >> /home/bautista/Finance-Scrapers/backup.log 2>&1
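Side note: all five entries fire at the same minute, so the five uploads run concurrently. One way to rule out contention between them would be to serialize the jobs behind a shared lock (a sketch using util-linux flock; /tmp/gupload.lock is an arbitrary path), shown here for the first two entries:

5 12,17 * * 1-5 flock /tmp/gupload.lock /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgBonds.txt financialData >> /home/bautista/Finance-Scrapers/backup.log 2>&1
5 12,17 * * 1-5 flock /tmp/gupload.lock /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgOptions.txt financialData >> /home/bautista/Finance-Scrapers/backup.log 2>&1

flock blocks until the lock is free by default, so each upload waits for the previous one instead of competing with it; the remaining three entries would be wrapped the same way.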

I have tried gsync too, but it also fails on larger files (~10 MB+).

@Akianonymus

Akianonymus commented 4 years ago

@Bautista-Baiocchi-lora Seems fine.

I honestly don't have a clue what's going on; it all works fine here.

I have been running a sync job for two days and it uploads fine.

Does this occur only on sync jobs, or on normal uploads too?

bauti-defi commented 4 years ago

Both gupload and gsync fail on larger files. I will try to diagnose the problem further on my side.

@Akianonymus

Akianonymus commented 4 years ago

@Bautista-Baiocchi-lora I hope someone else can reproduce the issue so we can get more logs.

By the way, can you attach a full log of a big-file upload when it fails? Use the -D flag.

Since it seems to happen for you even on manual uploads, it shouldn't be hard to obtain one.

@labbots can you check if you have any issue like this?

bauti-defi commented 4 years ago

After further testing, I've concluded that the upload error is caused by my scrapers. I have multiple simple Python scrapers (an HTTP request plus a parse) running every minute on the same machine. Uploading a file of 10 MB+ fails unless the scrapers are turned off; uploading a file of ANY size works like a charm as long as NO scrapers are running.

How do you recommend running the scrapers and the uploader simultaneously? @Akianonymus
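One low-effort option, sketched here on the assumption that the userspace bandwidth shaper trickle is installed (it is not part of this project), is to cap gupload's upstream rate so the scrapers keep some headroom:

# Cap upstream bandwidth to 500 KB/s (an arbitrary starting point to tune).
# trickle works via LD_PRELOAD, which the curl child process inherits.
trickle -u 500 /home/bautista/.google-drive-upload/bin/gupload -q -o /home/bautista/Finance-Scrapers/data/ArgBonds.txt financialData

Alternatively, the same flock trick sketched above could make the scrapers and the upload jobs take turns on the link instead of competing for it.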

Akianonymus commented 4 years ago

@Bautista-Baiocchi-lora

Given the information you gave, it seems like a bandwidth issue; maybe my script isn't able to share the connection properly, which causes such problems.

Also, please send the log I asked for earlier; it would be helpful.

labbots commented 4 years ago

@Bautista-Baiocchi-lora do you still have this problem? If so, could you please provide the logs requested by @Akianonymus? They would aid further investigation.

labbots commented 4 years ago

Closed as there is no activity on the ticket.