Backblaze / B2_Command_Line_Tool

The command-line tool that gives easy access to all of the capabilities of B2 Cloud Storage

B2 CLI Finder not Responding on MBP M2 Max #860

Open buzaw0nk opened 1 year ago

buzaw0nk commented 1 year ago

I'm trying to download two large snapshot files and getting stuck. I just upgraded Homebrew to the latest release and reinstalled the B2 tools. I've used both sync and download-file-by-name with the same result.

Running right now:

b2 download-file-by-name b2-snapshots-xxxxxxxxxxxx bzsnapshot_2023-xx-xx-xx-xx-xx.zip "/volumes/xxxxxx/bzsnapshot_2023-xx-xx-xx-xx.zip"

It starts and shows 0%| | 1.05M/1.53T [00:01<4 but never proceeds, although I can hear the NAS working. I can't check via Finder, because Finder just throws a spinning beach ball and Activity Monitor shows Finder in red as "(Not Responding)".

Computer: MBP M2 Max 12/38, 32 GB RAM, 1 TB
Dock: CalDigit TS4, 2.5 GbE
Network: 10 GbE backbone with 1 Gb cable internet
NAS: QNAP, 10 GbE, ZFS
macOS: 13.2.1

HOMEBREW_VERSION: 4.0.3
ORIGIN: https://github.com/Homebrew/brew
HEAD: ba6d87eed9ce31f42fa2b33a4e6067d9cf17f13f
Last commit: 4 days ago
Core tap origin: https://github.com/Homebrew/homebrew-core
Core tap HEAD: 74b16ab3305a9adca88d2079d728c67320531740
Core tap last commit: 58 minutes ago
Core tap branch: master
Core tap JSON: 24 Feb 15:23 UTC
HOMEBREW_PREFIX: /opt/homebrew
HOMEBREW_CASK_OPTS: []
HOMEBREW_MAKE_JOBS: 12
Homebrew Ruby: 2.6.10 => /System/Library/Frameworks/Ruby.framework/Versions/2.6/usr/bin/ruby
CPU: dodeca-core 64-bit arm_blizzard_avalanche
Clang: 14.0.0 build 1400
Git: 2.37.1 => /Library/Developer/CommandLineTools/usr/bin/git
Curl: 7.86.0 => /usr/bin/curl
macOS: 13.2.1-arm64
CLT: 14.2.0.0.1.1668646533
Xcode: N/A
Rosetta 2: false

b2 command line tool, version 3.7.1

I've tried the default thread count as well as 25, 50, and 75, all with the same result, and I've tried the same thing via the b2 sync command. Ctrl-C shows that the process is terminating, but after several hours it was still running.
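For reference, the sync attempt looked roughly like this (same placeholder bucket and destination as above; the exact --threads spelling may differ by CLI version):

b2 sync --threads 25 b2://b2-snapshots-xxxxxxxxxxxx "/volumes/xxxxxx/"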

Attempting to restart Finder results in it never coming back until I reboot.

Any insight would be appreciated. Thank you.

ppolewicz commented 1 year ago

Try with a really small number of threads, like 2 or even 1, and see if that helps. The 1-thread path has a completely different implementation.
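Something along these lines, reusing the command from above (the --threads flag here assumes a 3.x CLI; adjust if your version spells it differently):

b2 download-file-by-name --threads 1 b2-snapshots-xxxxxxxxxxxx bzsnapshot_2023-xx-xx-xx-xx-xx.zip "/volumes/xxxxxx/bzsnapshot_2023-xx-xx-xx-xx.zip"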

buzaw0nk commented 1 year ago

I tried the pypy install with the default thread count; it locked up for several hours and is now running OK. It's slow, though: this snapshot is 1.5 TB, and after 24 hours I'm less than halfway done. Would this work better on a PC? As it sits, B2 is not a viable option if recovery is going to take days. I'm trialing the service for production use with client data, and I can only imagine a scenario where an office is down and I can't download the backups.

ppolewicz commented 1 year ago

Cheap NAS devices can have notoriously low performance. I've been doing really fast downloads from B2 using the CLI and it was more than fine. The speed depends on what you run it on. For best performance, use a fast CPU, a really fast network (DigitalOcean will cut you off at a mere 2 Gbps), and the newest version of Python, preferably on Linux.

EDIT: perhaps the issue is the NAS taking a literal approach to the TRIM command, preallocating space for the target file by zeroing it?

buzaw0nk commented 1 year ago

I think it may be the external drive connected to the NAS. Looking into the issue. The computer is an M2 MBP, the NAS is a QNAP TVS-472XT with an i5, and the network is 10 GbE, but the MBP has a 2.5 GbE connection. I will update if I discover something. Making room on a USB 3.2 Gen 2 external SSD now to test again. Thanks for your help.

ppolewicz commented 1 year ago

You might want to test your device with hdparm, and you might also want to test b2 itself on a RAM drive - just kill the transfer before you run out of memory.
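Roughly like this, assuming the QNAP exposes the array as /dev/sda (adjust the device name for your box) and sizing the macOS RAM disk at about 2 GB just as an example:

hdparm -tT /dev/sda
diskutil erasevolume HFS+ "RAMDisk" $(hdiutil attach -nomount ram://4194304)

The first runs over SSH on the NAS and times raw and cached reads; the second creates a ~2 GB RAM disk on the Mac so you can take the target disk out of the equation.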

buzaw0nk commented 1 year ago

I tested with Blackmagic Disk Speed Test on the original target (a USB 3.0 external drive connected to the NAS) and got all of 12 MB/s write speed, lol. I'll look into why later, but I was able to download a 1.5 TB file using the download-file-by-name command successfully, although it did seem to slow down after about 15%. It took 4h24m to complete with 75 threads. The NAS's internal drives write at around 250 MB/s. I will attempt the 6.5 TB file later; I need to rearrange some data first. Thanks for your help.

I will update if I am able to download the larger file.

ppolewicz commented 1 year ago

I'd say 75 is probably too many threads, though if the array is rebuilding or scanning, that might actually be the correct amount, I guess?

buzaw0nk commented 1 year ago

Pawel, I'd be lying if I said I knew enough about threads to make an informed decision. I tried to delve into the concept, but just ended up trying different settings to see what would provide the fastest connection. I have too much on my plate at the moment to give it the focus it deserves. I've cleared some space on the NAS and I'm ready to try the larger snapshot. How many threads do you recommend?

ppolewicz commented 1 year ago

What type of drives do you have, and in what RAID configuration?

buzaw0nk commented 1 year ago

WD Red Pro 10 TB drives in RAID 5 under ZFS, QNAP running QuTS hero, 1 Gb cable internet, 10 GbE network with a 2.5 GbE connection to the Mac.

I was toying with the idea of setting up a VM on the NAS and eliminating the MBP from the loop.

ppolewicz commented 1 year ago

I suggest that you use 24 threads, unless you are in Asia or Australia, in which case use 48.
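So for the big snapshot, something like this (file name and destination are placeholders, and the --threads spelling again assumes a 3.x CLI):

b2 download-file-by-name --threads 24 b2-snapshots-xxxxxxxxxxxx bzsnapshot_2023-xx-xx-xx-xx-xx.zip "/volumes/xxxxxx/bzsnapshot_2023-xx-xx-xx-xx.zip"

The higher count for far-away regions is mostly about hiding per-connection latency to the B2 region with more parallel streams.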