microsoft / PowerToys

Windows system utilities to maximize productivity
MIT License

Fast Delete #1643

Open azchohfi opened 4 years ago

azchohfi commented 4 years ago

Summary of the new feature/enhancement

If you want to delete a lot of files (a large number of files, not large files) through File Explorer, it is currently a very slow process compared to doing the same through the command line.

Proposed technical implementation details (optional)

I would like a PowerToys option that lets me fast-delete a selection of files from Explorer. Pretty much what this article provides: https://www.ghacks.net/2017/07/18/how-to-delete-large-folders-in-windows-super-fast/ Which boils down to: `DEL /F/Q/S *.* > NUL` That would be much better than typing the command!
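For reference, the full idiom from the linked article combines `del` (which removes the files but leaves the directory tree in place) with `rd` to remove the folders afterwards. A minimal sketch, where `bigfolder` is a hypothetical example name:

```bat
rem Fast-delete idiom (sketch): run from the parent directory.
rem "bigfolder" is a hypothetical example folder name.
cd bigfolder
del /f /q /s *.* > nul
rem /f = force read-only files, /q = no confirmation,
rem /s = recurse subdirectories; > nul suppresses per-file output
cd ..
rd /q /s bigfolder
rem rd removes the now file-free directory tree
```

Redirecting to `NUL` matters for speed: printing each deleted filename to the console is a significant part of the cost when tens of thousands of files are involved.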

crutkas commented 4 years ago

I love this idea

wjk commented 4 years ago

I’d love to take a stab at this one.

crutkas commented 4 years ago

@WJK, do you want me to assign it to you?

wjk commented 4 years ago

Yes please.

crutkas commented 4 years ago

Done and done :) I'd imagine this would be part of the "File Explorer" section, FYI, for enabling/disabling.

kapsiR commented 4 years ago

@wjk Any progress here so far? I think this could be part of the Hacktoberfest 2020 😉

Just to have the PowerShell version documented, e.g.:
Remove-Item .\node_modules\ -Force -Recurse

wjk commented 4 years ago

@kapsiR I immediately ran into problems setting up the UI for this feature. Writing a shell extension according to the spec — that's easy. Getting it integrated into PowerToys' settings UI in the many disparate ways that are required — that's hard. The documentation and code here really need a rewrite. I haven't touched this in months. Sorry about that!

crutkas commented 4 years ago

@wjk, please reach out if you have any issues. Let's get the collaboration docs fixed as well as get this added in.

Either here or crutkas@microsoft.com

crutkas commented 4 years ago

Always remember, programming is a team sport; never go it alone.

Hurstwood commented 4 years ago

If you want to delete a lot of files (a large number of files, not large files) through File Explorer, it is currently a very slow process compared to doing the same through the command line.

The article you linked to states "It may take ten or twenty minutes, or even longer, to delete a large folder using Explorer on Windows devices.".

Recycling items (deleting items to the Recycle Bin) is super slow. Permanent delete is fast. To permanently delete, select the files and press Shift and Del at the same time.

I've just tested this now on my SSD to make sure I wasn't talking nonsense, as I don't think about it anymore. It took over 5 minutes to copy 6,200 files (10 GB) to a new directory. A Shift+Del permanent delete removed all of the files within 20 seconds, from when I pressed the keys to when the dialog box disappeared after it had finished.

Looking at the docs (https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/del), it appears that cmd's `del` permanently deletes files too.

Without doing a speed comparison, I imagine they're both similar in speed.

enricogior commented 4 years ago

I did the following test using the PowerToys repo after a full build as test folder to delete:

[screenshot: properties of the test folder, showing the file and folder counts and total size]

I used a mechanical SATA drive, after creating the test folders and after rebooting Windows to make sure nothing was in the Windows cache. `DEL /F/Q/S *.* > NUL` took about 24 seconds; note that it didn't delete the 4,000 folders, only the files. Shift + Delete from Explorer took about 29 seconds and deleted all the files and folders.

The difference doesn't seem to justify a new PowerToys module, with all the work it would require to maintain and support. We are a very small team; if something is not a game changer, we should take a conservative approach before adding new functionality.

crutkas commented 4 years ago

@wjk did you ever do some speed tests for this?

wjk commented 4 years ago

@crutkas I'm afraid I didn't. Things have gotten rather busy due to my university schedule. I do agree with @enricogior that this tool might well be unnecessary. If you want to close this issue and the associated PR for this reason, I'm perfectly OK with that. Thanks!

enricogior commented 4 years ago

@wjk OK, I'll close the PR and the issue. Sorry for not having done more research ahead of time. Next time we'll try to do a better job; you are always welcome to contribute to the project.

crutkas commented 3 years ago

@enricogior, I just did this on two sets of PT clones after a full compile.

12 gigs with about 75k files

21 seconds <- directly running `DEL /F/Q/S *.* > NUL`
51 seconds <- Shift+Delete

Speed-wise, this is pretty drastic.

enricogior commented 3 years ago

@crutkas you have to test it against the exact same data after a reboot, otherwise you can get very different results. When I did the test, I did it multiple times, on both SATA hard drives and PCIe SSDs.

crutkas commented 3 years ago

This was the exact same data. The base folder was copied to two new dirs on the same drive, an SSD, not my C: drive.

enricogior commented 3 years ago

@crutkas unless you reboot, part of the data is cached and the results will be affected by that. Also, you cannot do the test "in parallel", since different data locations can also affect the results. Different versions of Windows might affect the results too. It's tricky to get consistent data.

crutkas commented 3 years ago

It was not done in parallel. I did test 1, then test 2. Happy to do multiple tests and average, but I think 'real world' is critical.

enricogior commented 3 years ago

@crutkas what I mean is that the entire test must be serialized, including the data creation:

  1. generate the data in a consistent manner (for example, copy it from an external source)
  2. reboot the machine
  3. delete the data with method 1
  4. repeat the test for method 2 starting from step 1.

In other words, the data should be recreated after each test; it should not be created once for both tests with the tests then run back to back.
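The serialized procedure above could be sketched as a batch script, run once per deletion method with a reboot between the data creation and the timed delete (the paths and the robocopy source are hypothetical placeholders):

```bat
rem Benchmark sketch: one run = recreate data, reboot, delete, record time.
rem D:\source is a hypothetical pristine copy of the test data set.

rem step 1: recreate the data from an external source
robocopy D:\source D:\testdata /e > nul

rem step 2: reboot so no file data or metadata remains in the cache
shutdown /r /t 0

rem --- after reboot, time exactly one deletion method ---
rem method 1 (cmd):      cd /d D:\testdata  followed by  del /f /q /s *.* > nul
rem method 2 (Explorer): select D:\testdata and press Shift+Delete
rem then repeat from step 1 for the other method
```

Recreating the data from the same source each time keeps the on-disk layout comparable between runs, and rebooting ensures neither method benefits from a warm cache.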