GenesysGo / shadow-drive-cli

Uploading a ton of files simultaneously kills the upload process #8

Open · robbestad opened this issue 2 years ago

robbestad commented 2 years ago

Example:

shdw-drive upload-multiple-files -kp walletfile -s 8fdFb93XdP62eMNTfCJk2qzn2sEdtMd4Gj9yor1wNaTY -d video/
This is beta software running on Solana's Mainnet. Use at your own discretion.
Writing upload logs to /home/svena/shdw-drive-upload-16639199087.json.
✔ Collecting all files
✔ Fetching all storage accounts
Upload Progress | ░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░░ | 0% || 0/10220 Files
Killed

About 8GB of data

tracy-codes commented 2 years ago

@robbestad What version of the CLI are you running here? Along with that, do you mind sharing your system specs?

robbestad commented 2 years ago

0.3.4. Running Windows 11, but in an Ubuntu WSL2 container. 32 GB RAM on an AMD Ryzen 7 2700X eight-core processor at 3.70 GHz.

Very pertinent information, because the same operation on a rather beefy native Ubuntu server has no problem running upload-multiple-files with even more files than above.

tracy-codes commented 2 years ago

> 0.3.4. Running Windows 11, but in an Ubuntu WSL2 container. 32 GB RAM on an AMD Ryzen 7 2700X eight-core processor at 3.70 GHz.
>
> Very pertinent information, because the same operation on a rather beefy native Ubuntu server has no problem running upload-multiple-files with even more files than above.

Is there any way you could monitor your memory usage while running this command again on that same dataset? I'd be interested to see what the memory usage looks like on your 32 GB RAM system. We attempted to optimize by only reading file buffers in small batches instead of all at once, but it might need further optimization. A sketch of that batching approach follows.
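For illustration, here is a minimal sketch of the batch-by-batch reading strategy described above, with the kind of per-batch memory log that would help diagnose this. The `uploadFile` helper and the batch size are assumptions for the example, not the CLI's actual internals.

```typescript
import { promises as fs } from "fs";

// Hypothetical stand-in for the CLI's real network upload call.
const uploadFile = async (_path: string, _data: Buffer): Promise<void> => {
  // Placeholder body for the sketch.
};

const BATCH_SIZE = 50; // assumed batch size, not the CLI's real value

async function uploadInBatches(filePaths: string[]): Promise<void> {
  for (let i = 0; i < filePaths.length; i += BATCH_SIZE) {
    const batch = filePaths.slice(i, i + BATCH_SIZE);

    // Read and upload only this batch's buffers. Earlier batches are no
    // longer referenced anywhere, so the GC is free to reclaim them.
    await Promise.all(
      batch.map(async (path) => {
        const data = await fs.readFile(path);
        await uploadFile(path, data);
      })
    );

    // Log resident set size and heap usage after each batch; a steady
    // climb here would point at retained buffers.
    const { rss, heapUsed } = process.memoryUsage();
    console.log(
      `batch ${i / BATCH_SIZE + 1}: rss=${(rss / 1048576).toFixed(0)} MB, ` +
        `heapUsed=${(heapUsed / 1048576).toFixed(0)} MB`
    );
  }
}
```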

robbestad commented 2 years ago

Sure, I'll try again and come back with some data on that.

robbestad commented 2 years ago

It exhausts my memory. It seems I've only allocated 8 GB or so (default settings).

[screenshot of WSL2 memory usage during the upload]

I updated it to 16 GB by adding a .wslconfig. The result is that an upload of 3.2 GB eventually ate up 3.4 GB of memory and then proceeded to upload. So the lesson is simply to have more RAM than the data you want to put up.
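For anyone hitting the same wall: a .wslconfig along these lines, placed in the Windows user profile directory and applied by restarting WSL with `wsl --shutdown`, raises the memory cap. The 16GB value matches what was used here; tune it to your machine.

```ini
# %UserProfile%\.wslconfig
[wsl2]
memory=16GB
```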

tracy-codes commented 2 years ago

> It exhausts my memory. It seems I've only allocated 8 GB or so (default settings).
>
> [screenshot of WSL2 memory usage during the upload]
>
> I updated it to 16 GB by adding a .wslconfig. The result is that an upload of 3.2 GB eventually ate up 3.4 GB of memory and then proceeded to upload. So the lesson is simply to have more RAM than the data you want to put up.

Thank you very much for reporting this. The program definitely should be letting go of that memory as uploads are completed, so we will dive into that.
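As a rough illustration of the kind of retention bug that produces this symptom (a guess at the general pattern, not a claim about the CLI's actual code): if completed buffers stay reachable from a long-lived collection, the process's footprint grows with the total upload size rather than the batch size.

```typescript
// Illustrative only; hypothetical names, not the CLI's source.
import { promises as fs } from "fs";

const uploadFile = async (_path: string, _data: Buffer): Promise<void> => {
  // Placeholder for the real network call.
};

async function leaky(filePaths: string[]): Promise<void> {
  const allBuffers: Buffer[] = []; // long-lived reference; nothing is ever freed
  for (const path of filePaths) {
    const data = await fs.readFile(path);
    allBuffers.push(data); // keeps every completed upload's bytes reachable
    await uploadFile(path, data);
  }
  // allBuffers still pins roughly the total upload size in memory here,
  // matching the "3.2 GB upload used 3.4 GB of RAM" observation above.
}

async function scoped(filePaths: string[]): Promise<void> {
  for (const path of filePaths) {
    const data = await fs.readFile(path); // scoped to this iteration
    await uploadFile(path, data);
    // `data` goes out of scope each loop, so the GC can reclaim it and
    // memory stays near one file's size instead of the whole dataset's.
  }
}
```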