Open delgadom opened 6 years ago
I assume this is on a machine running macOS? The only times I've seen this error are cases where the OS is trying to prevent users from messing with important system files via SIP (as mentioned in this StackOverflow thread).
Could you try throwing in a debugging statement to print out the name of the problematic file? E.g., at the top of the _UploadFileToObject method in "/Users/delgadom/google-cloud-sdk/platform/gsutil/gslib/copy_helper.py" (defined at line 1803, based on your stack trace), you could add the statement print(src_url.object_name). From there, you should be able to investigate the file and see if it's secured via special permissions.
Thanks for the response @houglum! I am indeed on macOS High Sierra, but the file shouldn't be a protected system file. After inserting that debug statement, the problematic file turned out to be:
/Volumes/Seagate Expansion Drive/Michael Delgado/Mike/Archive/Stanford/2009-2010/Earth Systems Research/Earth Systems Research/Icon_
Not sure what that file is, but it doesn't seem like it could be a core system file.
Anyway, I wrapped that block in a try: ... except IOError: return, and now it's humming along just fine. Probably not a great idea, but it's working!
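The workaround described above amounts to something like the following sketch (made-up names, not the actual copy_helper.py code). Note that swallowing IOError silently will hide real failures, so at minimum logging the skipped path would be safer than a bare return:

```python
def upload_one(path, upload_fn):
    """Upload a single file, skipping it (instead of aborting the whole
    rsync) if the OS refuses to let us read it."""
    try:
        with open(path, 'rb') as f:
            data = f.read()
    except IOError:  # e.g. macOS "Icon\r" custom-icon files
        print('skipping unreadable file: %r' % path)
        return None
    return upload_fn(data)
```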
System specs:
OSX High Sierra version 10.13.2 (17C88)
MacBook Pro (13-inch, 2017, Two Thunderbolt 3 ports)
Processor 2.3 GHz Intel Core i5
Memory 8 GB 2133 MHz LPDDR3
Python 2.7.14 |Anaconda, Inc.| (default, Oct 5 2017, 02:28:52)
[GCC 4.2.1 Compatible Clang 4.0.1 (tags/RELEASE_401/final)] on darwin
gsutil version: 4.28
Huh - the other folder was failing on a similar file:
/Volumes/Seagate Expansion Drive/Michael Delgado/Pictures/alaska/Mike's Phone/Icon_
Yes, we have to add things like "-x '(.gnupg|.cache|.*Icon\r$)'" to our gsutil rsync invocations...
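gsutil rsync's -x flag takes a Python regular expression matched against each file's path relative to the source directory (see gsutil help rsync for the exact anchoring semantics); the exclusion above behaves roughly like this sketch:

```python
import re

# The exclusion pattern from the rsync flag above, with the dots escaped
# so ".gnupg" doesn't accidentally match e.g. "xgnupg".
EXCLUDE = re.compile(r'\.gnupg|\.cache|.*Icon\r$')

paths = [
    "Pictures/alaska/Mike's Phone/Icon\r",  # macOS custom-icon file
    "Pictures/alaska/photo1.jpg",
    ".gnupg/pubring.kbx",
]
kept = [p for p in paths if not EXCLUDE.search(p)]
# kept == ["Pictures/alaska/photo1.jpg"]
```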
I'm trying to use rsync to upload from a read-only filesystem on an external disk to a Google Cloud bucket. The process worked for a handful of folders, but after successfully uploading thousands of files it has begun consistently failing for one of my directories (total contents: 40 GB). I tried switching to another similarly sized directory and it worked for a while but is now failing as well.
I've tried uninstalling the Google Cloud SDK, but I get the same error. I've tried poking around in the files mentioned by the stack trace but haven't been able to find anything. This has been going on for a couple of days, so it's not a rate limit issue.
I've upgraded my account to a full paid plan and have a card attached, so I don't think it's a payments issue. Is there some maximum bucket contents size setting somewhere that I've missed?
At the very least, a more descriptive error message would be really helpful. Thanks!
Now attempting the same operation on a different folder. It worked for thousands of files, but is now failing: