Closed: RobCruzo closed this issue 2 years ago
(Pfft, German is so anglicized. /s)
I'm using x86 Monterey (16 GiB RAM) for a 94 GiB FTP job (~100k directories, ~1m files).
It's currently at 49.58 GiB virt size, 1 GiB RSS, and 51 GiB of swap. There's minimal memory pressure otherwise.
The issue started after the initial directory scan, and memory usage climbs slightly as the job progresses.
Basic CS math says that even at 2 million files, with a whopping 256 bytes per file or directory structure on a 64-bit platform, it shouldn't use more than 0.5 GiB of RAM. Anything over 1 GiB for that workload is likely a bug. It looks like one or more memory leaks; it's difficult to say what is and isn't the same bug. It probably needs profiling.
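That back-of-the-envelope estimate can be sketched as follows (the 2 million entry count and the 256-byte per-entry overhead are the assumptions stated above, not measured values):

```python
# Rough upper bound on directory-scan bookkeeping memory.
ENTRIES = 2_000_000        # assumed number of files + directories
BYTES_PER_ENTRY = 256      # assumed (generous) per-entry overhead, 64-bit

total_bytes = ENTRIES * BYTES_PER_ENTRY
total_gib = total_bytes / (1024 ** 3)
print(f"{total_gib:.3f} GiB")  # -> 0.477 GiB
```

Even with this deliberately generous per-entry figure, the scan metadata stays under 0.5 GiB, so tens of GiB of virtual size plus 51 GiB of swap points at a leak rather than legitimate bookkeeping.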
I did a large transfer yesterday of an approx. 100 GB collection of files and folders and noticed this runaway memory usage as well. I would stop the transfer, allow the swap usage to clear up, and start again every 10-15 GB transferred.
This is Monterey 12.1 on an M1 MacBook Air.
I am seeing the same on my M1 MacBook Pro running macOS 12.1 with Cyberduck 8.2.3. I was also seeing it with 8.2.2. It happens when downloading 50-100 GB from S3. For example, here is a current transfer:
Unfortunately, I am using 8.3.2 and still encountering this issue. Monterey 12.3.1, MacBook Pro (M1 Pro), Cyberduck 8.3.2 (37449). I am downloading tens of GB from S3, typically larger files (either 250 kB or 20 MB).
I have a similar memory issue with version 8.4.2 on a MacBook Pro (16 inch 2019) with Catalina 10.15.7.
I need to download some backups and it just eats up all my memory until there is none left. I tried disabling Preferences > Transfers > General > Segmented downloads... with no luck.
What's the protocol in your case (also AWS S3)?
I am connecting to a cloud service called Hubic to download my data. It seems to use HTTPS through their API (similar to S3, I believe).
As a quick workaround for now, please let me know if there is a version I could test to check whether it works.
(Sorry, it is in German, but I guess you can figure it out anyway.)
I connected via OpenStack Swift (Keystone 3) and tried to download a 100 GB file.
Reading through other issues regarding memory hogging, I tried the transfer with the segmented transfer option deactivated, which did not bring any improvement.